Software update with new split EVs

k_c

Well-Known Member
icnt-

A composition of {0,0,0,0,0,0,0,0,0,16} (all tens) reveals a problem.

10-10 v 10
Obviously SPL1 = SPL2 = SPL3 = 0.

Your program computes SPL1 = 0 but computes SPL2 and SPL3 = -1.#IND, so something is wrong there.
 

iCountNTrack

Well-Known Member
k_c said:
icnt-

A composition of {0,0,0,0,0,0,0,0,0,16} (all tens) reveals a problem.

10-10 v 10
Obviously SPL1 = SPL2 = SPL3 = 0.

Your program computes SPL1 = 0 but computes SPL2 and SPL3 = -1.#IND, so something is wrong there.
Thanks for pointing that out. This is because I had

Code:
if(splitEV > hitEV && splitEV > standEV && splitEV > doubleEV)
but in this case splitEV > standEV is not satisfied, leading to unpredictable results... The same is true for compositions where hitting has the same EV as standing; the CA gets confused when there is no unique optimal decision.

I was hoping nobody would need a CA for a deck made only of tens :grin:

The fix is simple: change the conditional statement to
Code:
if(splitEV > hitEV && splitEV >= standEV && splitEV > doubleEV)
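To see the tie concretely, here is a minimal stand-alone sketch (hypothetical snippet, not the CA's actual decision code) for 10-10 v 10 in the all-tens shoe, where standing pushes 20 v 20, hitting or doubling always busts, and each split hand pushes 20 v 20:
Code:
// Minimal sketch of the tie (hypothetical values, not the CA's code).
double standEV = 0.0, hitEV = -1.0, doubleEV = -2.0, splitEV = 0.0;

// Old test: splitEV > standEV is 0 > 0 == false, so no decision is
// selected and the result is left undefined (hence the -1.#IND output).
bool oldTest = splitEV > hitEV && splitEV > standEV && splitEV > doubleEV;  // false

// Fixed test: the tie with standing is now accepted.
bool newTest = splitEV > hitEV && splitEV >= standEV && splitEV > doubleEV; // true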
 

k_c

Well-Known Member
iCountNTrack said:
I found a couple of small bugs (with big effects) that were tough to find. I guess that is why they say to stay away from deep nesting

http://code.google.com/p/blackjack-combinatorial-analyzer/downloads/list

If I broke something again please let me know.
A composition of {1,1,1,1,1,1,1,1,1,12} (10-10 v 10) seems to work for 1 split, but fails with resplits

It appears you are using ENHC. My program values:
SPL1: -.03212
SPL2: -.3285
SPL3: -.5867

Your program values:
SPL1: -0.032019634777
SPL2: -0.032019634777
SPL3: -0.032019634777

It looks like you are only computing 1 split, but the value seems to be reasonable since your EV is more optimal.

One of the problems with resplits is keeping track of when a pair can and cannot be split. If 3 splits are allowed, then after completing each split path of the first split hand there can be 0, 1, or 2 splits remaining for the second hand, so it's easy to compute the wrong value somewhere along the line.
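As a sketch of that bookkeeping (hypothetical code, not from either of our CAs: Shoe, playHand, and friends are made-up names, and a real CA recurses over every card outcome with its probability rather than making single calls), each hand has to report how many splits it consumed so the next hand knows how many are left:
Code:
struct Shoe;                        // shoe composition (details omitted)
struct HandResult { double ev; int splitsUsed; };

// Plays out one split hand, resplitting recursively while splitsLeft > 0.
// Declaration only -- the body is where each CA differs.
HandResult playHand(Shoe &shoe, int splitsLeft);

double splitEV(Shoe &shoe, int allowedSplits)
{
    int splitsLeft = allowedSplits - 1;          // the initial split used one
    HandResult h1 = playHand(shoe, splitsLeft);  // may use 0, 1, or 2 more
    HandResult h2 = playHand(shoe, splitsLeft - h1.splitsUsed);
    return h1.ev + h2.ev;
}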

One thing you might do to check whether the right number of hands is being computed is to check how many split hands are expected. To do that, all you need to do is sum the probabilities of each hand occurrence. Below is a quick and dirty program that computes expected split hands as a function of the number of pair cards, the number of non-pair cards, and the number of allowed splits. If your sum of probabilities of hand occurrences matches the output of the program, at least you'll know the number of calculations is probably right.

Expected hands for 1 split is always 2. For resplits, expected hands >= 2 depending upon the given parameters.
For the above shoe composition the pair card is 10 and the dealer up card is 10. After the hand is dealt there are 9 tens and 9 non-tens remaining, so the number of pair cards = 9, the number of non-pair cards = 9, and the number of allowed splits = 3. Inputting this data gives an expected number of hands = 3.40588. Most likely you will find that the sum of probabilities of hand occurrences for 2 or 3 splits in your program for this shoe will equal 2.000, since it appears you are just computing SPL1.

Code:
#include <iostream>
#include <iomanip>

using namespace std;

double getSpHands(long P, long NP, short pCards, short remSp)
{
	// P      = pair cards left in the shoe
	// NP     = non-pair cards left in the shoe
	// pCards = split hands still open (waiting to be resolved)
	// remSp  = resplits still allowed
	//
	// No pair cards left, or no splits left: every open hand simply
	// plays out, so each counts as one hand.
	if (P == 0 || remSp == 0)
		return double(pCards);

	long tot = P + NP;
	double pP = double(P) / tot;	// probability next card is a pair card
	double hands;

	if (pCards >= 2)
		// Pair card: one open hand splits into two (pCards + 1).
		// Non-pair card: one open hand completes (+1 finished hand).
		hands = pP * getSpHands(P - 1, NP, pCards + 1, remSp - 1)
		      + (1 - pP) * (getSpHands(P, NP - 1, pCards - 1, remSp) + 1);
	else	// pCards == 1: the last open hand either completes or resplits
		hands = (1 - pP) + pP * getSpHands(P - 1, NP, 2, remSp - 1);

	return hands;
}

int main()
{
	long P, NP;
	cout << "Pair cards in shoe after pair and dealer up card have been dealt:  ";
	cin >> P;

	cout << "Non-pair cards in shoe after pair and dealer up card have been dealt:  ";
	cin >> NP;

	short allowedSplits = 1;
	cout << "Input number of splits allowed:  ";
	cin >> allowedSplits;

	short remSp = allowedSplits - 1;
	double expectedHands = getSpHands(P, NP, 2, remSp);

	cout << "\nExpected hands:  " << expectedHands << "\n";

	return 0;
}
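For the composition above (9 pair cards, 9 non-pair cards, 3 splits allowed), a sample session looks like this:
Code:
Pair cards in shoe after pair and dealer up card have been dealt:  9
Non-pair cards in shoe after pair and dealer up card have been dealt:  9
Input number of splits allowed:  3

Expected hands:  3.40588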
 
ericfarmer

Well-Known Member
k_c said:
A composition of {1,1,1,1,1,1,1,1,1,12} (10-10 v 10) seems to work for 1 split, but fails with resplits

It appears you are using ENHC. My program values:
SPL1: -.03212
SPL2: -.3285
SPL3: -.5867

Your program values:
SPL1: -0.032019634777
SPL2: -0.032019634777
SPL3: -0.032019634777

It looks like you are only computing 1 split, but the value seems to be reasonable since your EV is more optimal.

One of the problems with resplits is keeping track of when a pair can and cannot be split. If 3 splits are allowed, then after completing each split path of the first split hand there can be 0, 1, or 2 splits remaining for the second hand, so it's easy to compute the wrong value somewhere along the line.

One thing you might do to check whether the right number of hands is being computed is to check how many split hands are expected. To do that, all you need to do is sum the probabilities of each hand occurrence. Below is a quick and dirty program that computes expected split hands as a function of the number of pair cards, the number of non-pair cards, and the number of allowed splits. If your sum of probabilities of hand occurrences matches the output of the program, at least you'll know the number of calculations is probably right.
Although I do not know for sure that iCountNTrack's values are correct, remember that he is computing optimal strategy (i.e., perfect for the round), which may entail making a decision not to re-split at some point. We, on the other hand, assume that once we initially split a pair, we also re-split at every subsequent opportunity.

The values above are consistent with these two approaches: continuing to resplit makes things worse, and ICNT recognizes this and does not continue to re-split pairs even when allowed.
 

k_c

Well-Known Member
ericfarmer said:
Although I do not know for sure that iCountNTrack's values are correct, remember that he is computing optimal strategy (i.e., perfect for the round), which may entail making a decision not to re-split at some point. We, on the other hand, assume that once we initially split a pair, we also re-split at every subsequent opportunity.

The values above are consistent with these two approaches: continuing to resplit makes things worse, and ICNT recognizes this and does not continue to re-split pairs even when allowed.
I didn't think of that.

I have an unfinished newer version of my program that only computes 1 split when splitting isn't the best option.

Maybe iCountNTrack can comment on whether this is how his program works.
 

iCountNTrack

Well-Known Member
ericfarmer said:
Although I do not know for sure that iCountNTrack's values are correct, remember that he is computing optimal strategy (i.e., perfect for the round), which may entail making a decision not to re-split at some point. We, on the other hand, assume that once we initially split a pair, we also re-split at every subsequent opportunity.

The values above are consistent with these two approaches: continuing to resplit makes things worse, and ICNT recognizes this and does not continue to re-split pairs even when allowed.
k_c said:
I didn't think of that.

I have an unfinished newer version of my program that only computes 1 split when splitting isn't the best option.

Maybe iCountNTrack can comment on whether this is how his program works.
Eric pretty much summed it up. When you are computing the optimal post-split strategy you look at the optimal decision; clearly resplitting lowers your EV here, and that is why you will only split once if playing optimally, even though you are given the option of resplitting.

If on the other hand we look at
{1,1,1,1,1,1,1,1,4,12}
and look at 9,9 vs 6,
resplitting would increase your optimal EV and would be the correct play.


1 split
1.21266155585 (ICNT)
1.21266159057 (K_C)
1.21266155585 (MGP)
1.21266155585 (Eric)

2 splits
1.34482460528 (ICNT)
1.34482467651 (K_C)
1.34482460528 (MGP)
N/A (Eric)

3 splits
1.36141078654 (ICNT)
1.36141082763 (K_C)
1.36141078654 (MGP)
1.36141078654 (Eric)


P.S.: K_C, I suspect that your double build is still outputting values using singles; I am not sure why.
 

k_c

Well-Known Member
iCountNTrack said:
Eric pretty much summed it up. When you are computing the optimal post-split strategy you look at the optimal decision; clearly resplitting lowers your EV here, and that is why you will only split once if playing optimally, even though you are given the option of resplitting.

If on the other hand we look at
{1,1,1,1,1,1,1,1,4,12}
and look at 9,9 vs 6,
resplitting would increase your optimal EV and would be the correct play.


1 split
1.21266155585 (ICNT)
1.21266159057 (K_C)
1.21266155585 (MGP)
1.21266155585 (Eric)

2 splits
1.34482460528 (ICNT)
1.34482467651 (K_C)
1.34482460528 (MGP)
N/A (Eric)

3 splits
1.36141078654 (ICNT)
1.36141082763 (K_C)
1.36141078654 (MGP)
1.36141078654 (Eric)


P.S.: K_C, I suspect that your double build is still outputting values using singles; I am not sure why.
There's either an anomaly in the .dll or in Igor. These are the values I get for this composition in my desktop version using doubles. They seem to be in agreement with everyone else.

SPL1 1.2126615558504099
SPL2 1.3448246052807087
SPL3 1.3614107865368370

I'll check what I get using the .dll from within C++ and get back to you.

Edit:
I just ran the .dll inside a C++ program and it output identically to my desktop program so I think the error is somewhere within Igor.

SPL1 1.2126615558504099
SPL2 1.3448246052807087
SPL3 1.3614107865368370
 

iCountNTrack

Well-Known Member
k_c said:
There's either an anomaly in the .dll or in Igor. These are the values I get for this composition in my desktop version using doubles. They seem to be in agreement with everyone else.

SPL1 1.2126615558504099
SPL2 1.3448246052807087
SPL3 1.3614107865368370

I'll check what I get using the .dll from within C++ and get back to you.

Edit:
I just ran the .dll inside a C++ program and it output identically to my desktop program so I think the error is somewhere within Igor.

SPL1 1.2126615558504099
SPL2 1.3448246052807087
SPL3 1.3614107865368370
It must be something with either the XOP or the procedure file. I will take a look.

I am trying to create an XOP for IGOR. I might need your help on this one :grin:
 

k_c

Well-Known Member
iCountNTrack said:
It must be something with either the XOP or the procedure file. I will take a look.

I am trying to create an XOP for IGOR. I might need your help on this one :grin:
I'm not very good with Igor, but I managed to input the shoe composition you listed along with the required rules and display EVs for 9-9 v 6 to 15 decimal places.

The .dll and .xop appear to be functioning as well as could be expected.

•getHandData()
Player example hand 9-9 (EV in percent)
SAMPLE DATA (versus 6 - could be versus any up card)
Stand versus 6, 48.801579111176594
Hit versus 6, -74.646419971497366
Double versus 6, -149.292839942994732
Split 1 versus 6, 121.266155585040991
Split 2 versus 6, 134.482460528070874
Split 3 versus 6, 136.141078653683707
Strategy versus 6: p
EV in percent versus all up cards, 10.733566348743985
 

k_c

Well-Known Member
MGP said:
It's not exactly fixed. To calculate CDP do the following:

1) Calculate the CD strategy for a full shoe.
2) Calculate the CD strategy with 1 P card removed
3) Repeat 2 for up to 3 P cards removed.
4) For each shoe state use the appropriate strategy with the correct number of pair cards removed. So for example the strategy for -PNNN is potentially different from that for -PPxxx but would be the same as for -NPNN.
If, instead of simply fixing the split strategy for all split hands as the optimal strategy of the first split, each individual split hand is computed optimally for the number of pair cards removed, then strategy is no longer fixed for all split hands. As far as I can see it would still be fixed for each number of pair cards removed, though. For example, if 2 pair cards are drawn there are no more splits remaining, so all 4 hands would be played with the same strategy. If one pair card and 2 non-pair cards are drawn, then EV is relative to one pair card removed. This is more optimal because more information is being considered.
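As a sketch of the distinction (hypothetical code, not my CA's actual structure; the Strategy type is made up here), the semi-fixed lookup is keyed only by the number of pair cards removed:
Code:
// Hypothetical sketch: one precomputed strategy table per number of
// pair cards removed from the starting composition (0..3 for SPL3),
// versus a single table computed for the first split hand.
struct Strategy { /* playing decision per hand state (details omitted) */ };

Strategy cdpStrategy[4];    // index = pair cards removed

const Strategy& strategyFor(int pairCardsRemoved)
{
    // Every split hand at the same removal depth plays the same way,
    // regardless of the order in which pair/non-pair cards appeared.
    return cdpStrategy[pairCardsRemoved];
}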

My first inclination was to compute splits this way. I didn't because it isn't consistent with using a fixed strategy for all split hands. It's actually easier than using a fixed strategy because split data from the first split hand can be discarded rather than being saved for reference. There isn't any difference in SPL1 values. The only differences occur for resplits.

Does this seem correct?

2-2 v 7, single deck, DAS

CDP1 (fixed strategy)
SPL1 0.0063296898639950416
SPL2 0.015445169178526293
SPL3 0.015986968729175405

CDPN? (semi-fixed strategy according to pair cards removed)
SPL1 0.0063296898639950416
SPL2 0.015528319306292926
SPL3 0.016088539987718260
 

k_c

Well-Known Member
k_c said:
If, instead of simply fixing the split strategy for all split hands as the optimal strategy of the first split, each individual split hand is computed optimally for the number of pair cards removed, then strategy is no longer fixed for all split hands. As far as I can see it would still be fixed for each number of pair cards removed, though. For example, if 2 pair cards are drawn there are no more splits remaining, so all 4 hands would be played with the same strategy. If one pair card and 2 non-pair cards are drawn, then EV is relative to one pair card removed. This is more optimal because more information is being considered.

My first inclination was to compute splits this way. I didn't because it isn't consistent with using a fixed strategy for all split hands. It's actually easier than using a fixed strategy because split data from the first split hand can be discarded rather than being saved for reference. There isn't any difference in SPL1 values. The only differences occur for resplits.

Does this seem correct?

2-2 v 7, single deck, DAS

CDP1 (fixed strategy)
SPL1 0.0063296898639950416
SPL2 0.015445169178526293
SPL3 0.015986968729175405

CDPN? (semi-fixed strategy according to pair cards removed)
SPL1 0.0063296898639950416
SPL2 0.015528319306292926
SPL3 0.016088539987718260
Checking MGP's CA shows that my second set of EVs matches what he defines as CDP. Evidently CDPN is a little more optimal still.
 

MGP

Well-Known Member
If you look at how I calculate splits, CDPN not only removes P cards but also N cards before calculating the optimal strategy. It requires the use of burn card calcs.
 

k_c

Well-Known Member
Comparison of CDP1 and CDP

I input a composition of {1,1,1,1,1,1,1,1,1,12} (1-10) and computed 3 splits, DAS, player hand of 10-10 for CDP1 and CDP. I was expecting to see either equal or more optimal EVs for CDP but it appears this isn't the case. I can't explain why unless maybe all splits need to be played the same for a reliable EV.

Up cards of 2-6 have the same CDP and CDP1 split EVs for SPL1, SPL2, SPL3.

Up card 7
CDP1 SPL1 = 88.18%, SPL2 = 89.00%, SPL3 = 88.12%
CDP SPL1 = 88.18%, SPL2 = 88.88%, SPL3 = 87.82%

Up card 8
CDP1 SPL1 = 71.87%, SPL2 = 58.21%, SPL3 = 44.40%
CDP SPL1 = 71.87%, SPL2 = 58.18%, SPL3 = 44.31%

Up card 9
CDP1 SPL1 = 57.39%, SPL2 = 33.45%, SPL3 = 10.38%
CDP SPL1 = 57.39%, SPL2 = 33.43%, SPL3 = 10.27%

Up card 10 (dealer has checked for BJ)
CDP1 SPL1 = 8.364%, SPL2 = -18.35%, SPL3 = -41.64%
CDP SPL1 = 8.364%, SPL2 = -18.40%, SPL3 = -41.78%

Up card Ace (dealer has checked for BJ)
CDP1 SPL1 = 70.53%, SPL2 = 64.72%, SPL3 = 57.18%
CDP SPL1 = 70.53%, SPL2 = 64.72%, SPL3 = 57.45%

For this composition the only case where CDP is more optimal than CDP1 is for up card of ace. Hopefully my algorithm's OK. It seems to perfectly match MGP's values for CDP, 2-2 v 7, single deck, DAS.

CDP1 is fixed for all split hands whereas CDP is fixed depending upon pair cards removed from the starting composition.
 

iCountNTrack

Well-Known Member
k_c said:
I input a composition of {1,1,1,1,1,1,1,1,1,12} (1-10) and computed 3 splits, DAS, player hand of 10-10 for CDP1 and CDP. I was expecting to see either equal or more optimal EVs for CDP but it appears this isn't the case. I can't explain why unless maybe all splits need to be played the same for a reliable EV.

Up cards of 2-6 have the same CDP and CDP1 split EVs for SPL1, SPL2, SPL3.

Up card 7
CDP1 SPL1 = 88.18%, SPL2 = 89.00%, SPL3 = 88.12%
CDP SPL1 = 88.18%, SPL2 = 88.88%, SPL3 = 87.82%


Up card 8
CDP1 SPL1 = 71.87%, SPL2 = 58.21%, SPL3 = 44.40%
CDP SPL1 = 71.87%, SPL2 = 58.18%, SPL3 = 44.31%

Up card 9
CDP1 SPL1 = 57.39%, SPL2 = 33.45%, SPL3 = 10.38%
CDP SPL1 = 57.39%, SPL2 = 33.43%, SPL3 = 10.27%

Up card 10 (dealer has checked for BJ)
CDP1 SPL1 = 8.364%, SPL2 = -18.35%, SPL3 = -41.64%
CDP SPL1 = 8.364%, SPL2 = -18.40%, SPL3 = -41.78%

Up card Ace (dealer has checked for BJ)
CDP1 SPL1 = 70.53%, SPL2 = 64.72%, SPL3 = 57.18%
CDP SPL1 = 70.53%, SPL2 = 64.72%, SPL3 = 57.45%

For this composition the only case where CDP is more optimal than CDP1 is for up card of ace. Hopefully my algorithm's OK. It seems to perfectly match MGP's values for CDP, 2-2 v 7, single deck, DAS.

CDP1 is fixed for all split hands whereas CDP is fixed depending upon pair cards removed from the starting composition.
The bolded split values for 10,10 vs 7 with both CDP and CDP1 are interesting:
split3_EV < split1_EV < split2_EV

Values from my CA show a steady increase going from split1 to split3

split1_EV (0.882351045831) < split2_EV (0.891701820669) < split3_EV (0.898974814447)
The only explanation I can come up with is that playing optimally is more important for split3 than for split2, since the deck is more depleted and the gain from strategy variations will be higher.
 

k_c

Well-Known Member
iCountNTrack said:
The bolded split values for 10,10 vs 7 with both CDP and CDP1 are interesting:
split3_EV < split1_EV < split2_EV

Values from my CA show a steady increase going from split1 to split3

split1_EV (0.882351045831) < split2_EV (0.891701820669) < split3_EV (0.898974814447)
The only explanation I can come up with is that playing optimally is more important for split3 than for split2, since the deck is more depleted and the gain from strategy variations will be higher.
Your program's EVs for the input composition/rules (10-10 v 7) are a little different in version 0.70 of your program as compared to version 0.80.

However, these EVs are greater than either CDP1 or CDP in every case. This is understandable since you are computing with a more optimal method, so your EVs should be >= the CDP1 and CDP values.

By the same token CDP EVs should always be >= CDP1 EVs, but surprisingly they are not.

I was getting ready to add CDP as a more optimal computing option to my CA but first I thought I'd try a few limited compositions to see how it compares to my present most optimal way of computing splits. The very first composition I tried was {1,1,1,1,1,1,1,1,1,12} with the results I posted.

I checked the latest version of Eric's CA, in which CDP is an option, and I get the same EVs with my CDP algorithm as he does, so that seems to confirm that the EVs I posted are right. I don't think I'm going to change my CA to include CDP after all.
 
ericfarmer

Well-Known Member
k_c said:
I input a composition of {1,1,1,1,1,1,1,1,1,12} (1-10) and computed 3 splits, DAS, player hand of 10-10 for CDP1 and CDP. I was expecting to see either equal or more optimal EVs for CDP but it appears this isn't the case. I can't explain why unless maybe all splits need to be played the same for a reliable EV.

Up cards of 2-6 have the same CDP and CDP1 split EVs for SPL1, SPL2, SPL3.

Up card 7
CDP1 SPL1 = 88.18%, SPL2 = 89.00%, SPL3 = 88.12%
CDP SPL1 = 88.18%, SPL2 = 88.88%, SPL3 = 87.82%

Up card 8
CDP1 SPL1 = 71.87%, SPL2 = 58.21%, SPL3 = 44.40%
CDP SPL1 = 71.87%, SPL2 = 58.18%, SPL3 = 44.31%

Up card 9
CDP1 SPL1 = 57.39%, SPL2 = 33.45%, SPL3 = 10.38%
CDP SPL1 = 57.39%, SPL2 = 33.43%, SPL3 = 10.27%

Up card 10 (dealer has checked for BJ)
CDP1 SPL1 = 8.364%, SPL2 = -18.35%, SPL3 = -41.64%
CDP SPL1 = 8.364%, SPL2 = -18.40%, SPL3 = -41.78%

Up card Ace (dealer has checked for BJ)
CDP1 SPL1 = 70.53%, SPL2 = 64.72%, SPL3 = 57.18%
CDP SPL1 = 70.53%, SPL2 = 64.72%, SPL3 = 57.45%

For this composition the only case where CDP is more optimal than CDP1 is for up card of ace. Hopefully my algorithm's OK. It seems to perfectly match MGP's values for CDP, 2-2 v 7, single deck, DAS.

CDP1 is fixed for all split hands whereas CDP is fixed depending upon pair cards removed from the starting composition.
Your algorithm seems to be working fine. As you mentioned later in this thread, my latest version will compute both CDP and CDP1 for comparison.

I think this highlights the challenge we encounter when trying to *optimize* a strategy, as opposed to the relative ease with which we can *evaluate* some particular strategies that are easy to specify.

Although it is often (almost always?) the case, it is *not* true in general that the split EV for CDP will *always* be >= CDP1. This may seem counter-intuitive, especially when you consider that the "component" expected values that make up the split EV *do* satisfy CDP >= CDP1. (By "component" EVs I mean, for example, EV[X;a,0] in my earlier notation, which is the expected value of playing out a single hand, without resplitting, given that a additional pair cards have been removed from the deck.)

The problem is that the overall split EV is just some linear combination of these component EVs... and the coefficients of those components are not necessarily positive. For example, in the 10-10 vs. 7 case above (actually, vs. any non-10 up card) for SPL3, the overall split EV may be expressed as:

Code:
- 10/9*EVp[0, 0] - 20/17*EVp[1, 0] + 15/17*EVp[2, 0] - 7/51*EVp[3, 0]
+ 2*EVx[0, 0] + 20/9*EVx[1, 0] + 40/17*EVx[2, 0] - 30/17*EVx[3, 0]
+ 14/51*EVx[4, 0]
In this case (indeed, in most cases, based on a few spot checks using the formula in the paper), EVx[3,0] is the culprit; although this component value is greater for CDP (0.36037129537129536) than CDP1 (0.35433011433011435), this actually hurts the overall split EV.
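To put a number on it (using only the values just quoted): the coefficient of EVx[3, 0] above is -30/17, so CDP's improvement in that single component shifts the overall split EV by (-30/17) × (0.36037129537 - 0.35433011433) ≈ -0.0107, which outweighs the gains in the positively weighted components; the net result is the roughly 0.30% drop from CDP1 (88.12%) to CDP (87.82%) in the SPL3 v 7 figures quoted above.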

This is not confined to these small, "pathological" shoes. The same phenomenon occurs for 10-10 vs. 7 in single deck, it's just not as pronounced.

As I said at the start, this may seem counter-intuitive, since we are "using more information" in CDP. But we are only using more information to optimize a "component" of expected value, a collection of which together contribute to the *overall* value that we unjustifiably expect to also be optimized.
 

k_c

Well-Known Member
ericfarmer said:
Your algorithm seems to be working fine. As you mentioned later in this thread, my latest version will compute both CDP and CDP1 for comparison.

I think this highlights the challenge we encounter when trying to *optimize* a strategy, as opposed to the relative ease with which we can *evaluate* some particular strategies that are easy to specify.

Although it is often (almost always?) the case, it is *not* true in general that the split EV for CDP will *always* be >= CDP1. This may seem counter-intuitive, especially when you consider that the "component" expected values that make up the split EV *do* satisfy CDP >= CDP1. (By "component" EVs I mean, for example, EV[X;a,0] in my earlier notation, which is the expected value of playing out a single hand, without resplitting, given that a additional pair cards have been removed from the deck.)

The problem is that the overall split EV is just some linear combination of these component EVs... and the coefficients of those components are not necessarily positive. For example, in the 10-10 vs. 7 case above (actually, vs. any non-10 up card) for SPL3, the overall split EV may be expressed as:

Code:
- 10/9*EVp[0, 0] - 20/17*EVp[1, 0] + 15/17*EVp[2, 0] - 7/51*EVp[3, 0]
+ 2*EVx[0, 0] + 20/9*EVx[1, 0] + 40/17*EVx[2, 0] - 30/17*EVx[3, 0]
+ 14/51*EVx[4, 0]
In this case (indeed, in most cases, based on a few spot checks using the formula in the paper), EVx[3,0] is the culprit; although this component value is greater for CDP (0.36037129537129536) than CDP1 (0.35433011433011435), this actually hurts the overall split EV.

This is not confined to these small, "pathological" shoes. The same phenomenon occurs for 10-10 vs. 7 in single deck, it's just not as pronounced.

As I said at the start, this may seem counter-intuitive, since we are "using more information" in CDP. But we are only using more information to optimize a "component" of expected value, a collection of which together contribute to the *overall* value that we unjustifiably expect to also be optimized.
I think we're on the same page. The attached image is of a program I wrote in Excel that demonstrates how I compute splits. You probably get the multipliers mathematically whereas I do it programmatically. I don't think I know enough math to do it that way. I think we agree on the end result though.

The thing is, the reason we can multiply in the first place is that a fixed strategy is used, so I've always wondered about the validity of using a "variable fixed strategy" that depends upon the number of pair cards removed. It seems that if this "variable fixed strategy" is valid, it should always be more optimal than a simple fixed strategy.
 

Attachments

iCountNTrack

Well-Known Member
k_c said:
Your program's EVs for the input composition/rules (10-10 v 7) are a little different in version 0.70 of your program as compared to version 0.80.

However, these EVs are greater than either CDP1 or CDP in every case. This is understandable since you are computing with a more optimal method, so your EVs should be >= the CDP1 and CDP values.

By the same token CDP EVs should always be >= CDP1 EVs, but surprisingly they are not.

I was getting ready to add CDP as a more optimal computing option to my CA but first I thought I'd try a few limited compositions to see how it compares to my present most optimal way of computing splits. The very first composition I tried was {1,1,1,1,1,1,1,1,1,12} with the results I posted.

I checked the latest version of Eric's CA, in which CDP is an option, and I get the same EVs with my CDP algorithm as he does, so that seems to confirm that the EVs I posted are right. I don't think I'm going to change my CA to include CDP after all.
Version 0.70 had a bug in it (a variable that was accumulating probabilities was not initialized), and version 0.80 introduced a bug because I wasn't using an updated function for DAS. :) Both bugs are fixed now in version 0.82.

http://code.google.com/p/blackjack-combinatorial-analyzer/downloads/list

I am learning the hard way why deeply nested code is not a good thing; debugging is tedious...

On a separate but interesting note, I wanted to bring up a point that you made me think about.
When we say optimal strategy, we mean calculating the EV of every possible/allowed playing decision, taking into account all the cards that are seen, and then choosing the one that gives the highest EV. However, that is only half the story for post-split optimal decisions!
For post-split decisions, to get the optimal strategy you need to look at the GLOBAL EV, in other words the EV of THE ROUND, and not just the LOCAL EV of THE SPLIT HAND. This is especially true for the first split hand, because each playing decision on the first split hand has an effect on what hands can be dealt on the second split hand. For the second split hand (assuming you don't have one card in a 3rd split hand), it is sufficient to look at the LOCAL best EV, because any playing decision on the second split hand has no effect on the cards dealt to the first split hand, since that hand has obviously been dealt already.

While I hate to admit it, :) my algorithm is sub-optimal: my StandFunc(), HitFunc(), and DoubleFunc() return their respective EVs taking into account all the cards seen, but they only calculate a LOCAL EV for the split hand being dealt and not the GLOBAL EV. On the other hand, my ResplitFunc() does return a GLOBAL EV.
Modifying my code to compute GLOBAL EVs is simple, but it will make the calculation super slow even on my i7 990X.
I just think it is interesting that for splits, including all the cards seen is not enough to nail them optimally. Nasty splits!
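Here is a sketch of the difference (hypothetical pseudocode; Action, evHand1(), and evHand2Given() are made-up names, not my CA's actual functions):
Code:
// Hypothetical sketch of LOCAL vs GLOBAL post-split optimization.
enum Action { STAND, HIT, DOUBLE, RESPLIT };

double evHand1(Action a);       // EV of action 'a' on the first split hand
double evHand2Given(Action a);  // expected EV of the second split hand,
                                // given the cards action 'a' consumes

void chooseFirstHandAction()
{
    const Action actions[] = { STAND, HIT, DOUBLE, RESPLIT };
    Action bestLocal = STAND, bestGlobal = STAND;
    double bestLocalEV = -1e9, bestGlobalEV = -1e9;

    for (Action a : actions) {
        double local  = evHand1(a);                    // split-hand EV only
        double global = evHand1(a) + evHand2Given(a);  // EV of the round
        if (local > bestLocalEV)   { bestLocalEV = local;   bestLocal = a; }
        if (global > bestGlobalEV) { bestGlobalEV = global; bestGlobal = a; }
    }
    // bestLocal and bestGlobal can differ: optimizing each split hand
    // in isolation is not the same as optimizing the whole round.
}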
 