Multiple Bets (Not BJ, but definitely theory and math)

Sonny

Well-Known Member
London Colin said:
Firstly, there seems to be a chicken-and-egg issue. Which should come first, the total bet to then be subdivided, or the individual bets to then be scaled in some way? It seems to me that if we calculate an overall total Kelly bet, based on the assumption of betting equal amounts on each sub-bet, then that total bet is invalidated as soon as we start resizing the sub-bets.
That is correct. After thinking about this some more I adopted your approach of determining the optimal ratio of Bet A to Bet B first, then using that to calculate the proper Kelly bet. I sent you a PM with the complete details including a revised solution to this problem. For those of you playing at home, I got rounded bet sizes of $338 for the 6:1 shot and $427 for the 7:1 shot for a 44.2%/55.8% split. Somehow I stumbled onto a fairly correct initial estimate even though my methodology was invalid.

-Sonny-
 

London Colin

Well-Known Member
A discovery

Apologies for reviving such an old thread, but I just recently stumbled upon some relevant resources, linked to from the wizardofodds site. They are the links to SBRForum.com at the foot of this page - http://wizardofodds.com/kelly

Unfortunately, the 'part III' article which is talked of in parts I and II does not seem to exist. But I did hunt around in the forum archive and found a number of interesting discussions of Kelly betting.

The Kelly Calculator purports to be able to compute the individual bets for a number of mutually exclusive or independent outcomes. I tried it with my dice example and it came up with a different result to Sonny's method:
$528.50 and $365.90.

I managed to track down a description of the algorithm being employed - http://www.sportsbookreview.com/forum/handicapper-think-tank/29624-simultaneous-event-kelly-calculator-beta.html

It took me several read-throughs, but I eventually got the gist of what the algorithm is doing (and was able to verify that it produces the same answer that the calculator is giving). But it's beyond my abilities to verify that the algorithm is correct, or to account for the difference from Sonny's result.

One interesting aspect is that negative EV bets may be included in the overall betting scheme. In the author's 'Example 1', only one of the four available bets has a positive EV, but three bets are made.

Sonny, would you mind taking a look at this and letting me know what you think? (I'd appreciate hearing anyone else's thoughts on this too.)

-----------------------------

For what it's worth, here's what I've been able to understand (or think I understand) about the algorithm -

Going back for a moment to the basic definition of the Kelly fraction for a single bet, one way to express the formula is
(p * Odds - 1) / (Odds -1)
where,
p = probability of winning
Odds = decimal odds of the payoff (e.g., 2.0 is even money)

Hence, for a bet on the roll of a die (p=1/6), paying odds of 6:1 (7.0), the Kelly fraction is (1/6 * 7 -1)/(7-1) = 0.027778


The algorithm seems to be based on an alternative method of doing this calculation: If you divide the probability of losing (1-p) by the implied probability of losing (1 - 1/Odds), the resulting quotient is a measure of who has the edge, 1.0 meaning perfectly fair odds, < 1.0 meaning the player has an edge. I don't know if this quantity has a particular name, so I'll just call it 'the quotient'.

It seems that you can arrive at the Kelly fraction by calculating
'prob of winning' - 'the quotient' * 'implied prob of winning'
E.g., for the dice:
quotient = (1-1/6) / (1-1/7) = 0.97222
kelly = 1/6 - 0.97222 * 1/7 = 0.027778
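
Just to convince myself that the two forms really are equivalent, here's a quick Python sketch (my own, purely illustrative):

Code:
# Single-bet Kelly fraction for a 1/6 shot paid at 6:1 (decimal odds 7.0),
# computed two ways.
p, odds = 1 / 6, 7.0

# Standard form: (p * Odds - 1) / (Odds - 1)
kelly_standard = (p * odds - 1) / (odds - 1)

# Quotient form: 'prob of winning' - 'the quotient' * 'implied prob of winning'
quotient = (1 - p) / (1 - 1 / odds)
kelly_quotient = p - quotient * (1 / odds)

print(kelly_standard, kelly_quotient)  # both 0.027777...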


What the algorithm seems to do is generate a set of bets which minimises the overall, cumulative quotient, and then use that overall quotient in the calculation of each of the individual bet sizes.
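
If I've followed that correctly, then for my dice example the running quotient is (1 - 1/6) / (1 - 1/8) = 0.952381 after the 7:1 bet alone, and
(1 - 1/6 - 1/6) / (1 - 1/8 - 1/7) = 0.666667 / 0.732143 = 0.910569
after both bets, so 0.910569 is the overall quotient. The individual fractions then come out as
1/6 - 0.910569 * 1/8 = 0.052846 of the bankroll on the 7:1 bet ($528.46 of $10,000)
1/6 - 0.910569 * 1/7 = 0.036585 on the 6:1 bet ($365.85)
which agrees with the calculator's answer, allowing for rounding.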


Step 4, the test for implied prob <1, only makes sense if every possible outcome has been specified. Otherwise, it seems we just continue in the same way as if that test had failed.
 

Sonny

Well-Known Member
I'll take a look at that article and see if I can make sense of it. The Kelly formula looks fine but I need to look at how he is weighting each bet.

One thing I notice right away is that his formula is for uncorrelated bets whereas my numbers were for correlated bets. In the dice example, my numbers assumed that losing one bet made it more likely to win the second bet. I was applying both bets to a single roll of a die while he is applying each bet to rolls of separate dice. I'm not sure which method is more appropriate to your question.

-Sonny-
 

London Colin

Well-Known Member
Sonny said:
I'll take a look at that article and see if I can make sense of it.
Thanks. There isn't really an 'article' concerning this, as far as I know. I found parts I and II to be good reads, but part III, which was to be about simultaneous events, does not seem to have been written. The only information I could find was the short forum post which I linked to.

Sonny said:
The Kelly formula looks fine but I need to look at how he is weighting each bet.

One thing I notice right away is that his formula is for uncorrelated bets whereas my numbers were for correlated bets. In the dice example, my numbers assumed that losing one bet made it more likely to win the second bet. I was applying both bets to a single roll of a die while he is applying each bet to rolls of separate dice.
I'm sure that is not his intention. This is supposed to be a method for handling mutually exclusive outcomes. His calculator app lets you choose between mutually exclusive and independent, so he presumably has a separate method for the independent case.


Sonny said:
I'm not sure which method is more appropriate to your question.
My contrived example was the single roll of a die, where simultaneous bets are obviously mutually exclusive.

In the real-world applications I have in mind, all the simultaneous bets available would be correlated to some degree, most being mutually exclusive, but some not.
 

rukus

Well-Known Member
Apologies, I don't have time to go back and read the whole thread, but I do recall blackjack avenger and I discussing a similar question (sort of) with Don S and company over at advantageplayer (I forget which subforum) regarding composition-dependent risk-averse insurance and how to consider insurance plus the main bet together to achieve an overall optimal risk-adjusted EV. The concept came from the book Beyond Counting (original), so take a read of his chapter on insurance if you have access. It may help you in answering your calculation question. Apologies if I'm late to the party on this thread.
 

London Colin

Well-Known Member
Thanks for the tip, Rukus. I don't have a copy, but I might use this as an excuse to justify getting one. :)

I don't think risk-averse insurance is directly comparable, because there you have already sized one bet and are now considering how much, if anything, to put on a second (insurance) bet, whereas what I'm trying to get to grips with is situations in which you have yet to make any bet. But the ideas behind it might still be helpful.
 

rukus

Well-Known Member
London Colin said:
Thanks for the tip, Rukus. I don't have a copy, but I might use this as an excuse to justify getting one. :)

I don't think risk-averse insurance is directly comparable, because there you have already sized one bet and are now considering how much, if anything, to put on a second (insurance) bet, whereas what I'm trying to get to grips with is situations in which you have yet to make any bet. But the ideas behind it might still be helpful.
Agreed, not directly comparable, but similar at a certain level, and the math may give you some ideas for how you might need to size the bets. Hope it helps. Check out the discussion on Don's forum for a summary of the BC idea in case you don't want to buy the book (though the book has the relatively straightforward math laid out, if I recall correctly).
 

London Colin

Well-Known Member
sim

I created a sim of my dice game in order to test the result of using different bet sizes on RoR.

As I understand it, the probability of doubling your bankroll before halving it should be 2/3 (67%). Using the bet sizes produced by the Kelly Calculator gave a 67.4% success rate, which is pretty close.
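
(If I've understood the theory correctly, that 2/3 figure comes from the fact that, in the continuous approximation, the reciprocal of a full-Kelly bettor's bankroll behaves like a fair game. Stopping at either double or half the starting bankroll then gives P * 1/2 + (1 - P) * 2 = 1, i.e. P = 2/3.)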
 

London Colin

Well-Known Member
Sonny said:
I'll take a look at that article and see if I can make sense of it. The Kelly formula looks fine but I need to look at how he is weighting each bet.
Did you reach any conclusions?
 

London Colin

Well-Known Member
Sonny said:
Not yet. I'm still sifting through those links. I can't find anything that describes the formula for multiple correlated bets. Everything seems to focus on the uncorrelated version:

http://www.sportsbookreview.com/forum/handicapper-think-tank/24829-simultaneous-bet-kelly-staking-simplest-case.html

I'll keep looking.

-Sonny-
When I was looking, all I was able to find was the step-by-step description of the method for mutually exclusive outcomes, contained in the link which I posted. I actually find that easier to comprehend than the more formal, mathematical formulae, as given for the uncorrelated version, which tend to go over my head.:) There doesn't seem to be a proof given in either case, though. (Not that I'd be likely to understand it in any case.)

I don't think he has a general formula/method for correlated bets; just the specific case of mutual exclusivity.

While I'd like to gain as much understanding of the whole topic as possible, my immediate focus was to try to determine whether the stated method for mutually exclusive outcomes works, and if so, how it works. In particular, I'm unsure how one would go about modifying this method for fractional Kelly betting.

As I mentioned a couple of posts ago, I conducted a test which seems to indicate that it does work (i.e. the prob of doubling BR before halving is close to the expected 67% value). I ran the same test using the numbers you had computed, and the result was about 72%.

Thanks for your help,
 

Sonny

Well-Known Member
We must be talking about different circumstances. Based on the dice game you described, I came up with (rounded) bet sizes of $338 and $427. That gives a total bet of $765, an EV of 25.97%, an ASR of 3.4 and a SD of about 1.83. With a $10k bankroll I get a RoR of 13.25% for that bet (technically the full bet should be $10,000 * 0.2597 / 3.4 = $763.823 but I rounded).
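
Here's a rough sketch of that calculation for anyone who wants to check it, taking the ASR to be the average squared result per unit bet:

Code:
from math import sqrt

def dice_stats(stake_6to1, stake_7to1, bankroll=10_000):
    """EV, SD and average squared result for one roll of the dice game."""
    unit = stake_6to1 + stake_7to1
    outcomes = [
        (4 / 6, -unit),                        # neither number hits
        (1 / 6, 6 * stake_6to1 - stake_7to1),  # the 6:1 number hits
        (1 / 6, 7 * stake_7to1 - stake_6to1),  # the 7:1 number hits
    ]
    ev = sum(p * x for p, x in outcomes)
    ex2 = sum(p * x * x for p, x in outcomes)
    sd = sqrt(ex2 - ev * ev)
    asr = ex2 / unit ** 2                      # average squared result, in units squared
    full_bet = bankroll * (ev / unit) / asr    # BR * EV% / ASR
    return ev, ev / unit, sd / unit, asr, full_bet

print(dice_stats(338, 427))
# EV ~ $198.67 (25.97%), SD ~ 1.83 units, ASR ~ 3.40, full bet ~ $764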

Maybe the game you're playing is different than the one I simulated? My version was for two +EV bets that are mutually exclusive but correlated.

-Sonny-
 

London Colin

Well-Known Member
Sonny said:
My version was for two +EV bets that are mutually exclusive but correlated.
I'm starting to wonder if the words correlated/uncorrelated mean what I thought they meant, as I can't quite understand what you mean by that.

What I thought they meant was -

Uncorrelated: Independent events. If you know the outcome of one event, that tells you nothing about the other. E.g., if you roll two dice (or the same die twice), and tell me the outcome of one roll, the six possibilities for the other remain 1/6 probabilities.

Correlated: Related events. Mutual exclusion being one such possible relationship. E.g., if you roll a single die and tell me that the outcome was not a particular number, then I know that the remaining possibilities now have a 1/5 probability.

But I guess the above can't be quite right, because what then does 'mutually exclusive but correlated' mean? I had thought that mutual exclusion was, by definition, a particular form of correlation.


At any rate, the particular scenario I simulated was simply this:

  • A single die is about to be rolled.
  • You may bet on one number which has an associated payoff of 6:1.
  • You may bet on another number which has an associated payoff of 7:1.
  • Once you have made both bets, the die is rolled and:
  • 4 times in 6, you lose both bets.
  • 1 time in 6, you win one bet and lose the other, earning 6:1 on the winning bet.
  • 1 time in 6, you win one bet and lose the other, earning 7:1 on the winning bet.
Repeat until rich. :)

You could transpose the above to a more real-world example: a horse race. You assess that two of the runners each have a 1/6 chance of winning the race. You are offered odds of 6:1 and 7:1 to back them to win. There can be only one winner. How much do you bet on each?
 

Sonny

Well-Known Member
It sounds like you understand the definitions just fine. Maybe I'm using them differently.

London Colin said:
...what then does 'mutually exclusive but correlated' mean?
The bets are mutually exclusive because they cannot both happen. Either one will occur or the other, or neither. The die will never land on both numbers.

The bets are correlated because the occurrence of one has an effect on the other. If you win bet #1 then you also lose bet #2. The amount of money you win is correlated to the other bet.

As you said, mutual exclusion is a form of correlation. My concern was that you were talking about two bets that were independent but not mutually exclusive. The formulas in those links seem to be for bets on different sports games, which are not mutually exclusive. If it is a parlay then it is correlated, otherwise not.

London Colin said:
  • A single die is about to be rolled.
  • You may bet on one number which has an associated payoff of 6:1.
  • You may bet on another number which has an associated payoff of 7:1.
  • Once you have made both bets, the die is rolled and:
  • 4 times in 6, you lose both bets.
  • 1 time in 6, you win one bet and lose the other, earning 6:1 on the winning bet.
  • 1 time in 6, you win one bet and lose the other, earning 7:1 on the winning bet.
What were the results? Did they correspond to my numbers from earlier? How are you calculating the RoR?

http://www.blackjackinfo.com/bb/showpost.php?p=157091&postcount=15

London Colin said:
I tried it with my dice example and it came up with a different result to Sonny's method: $528.50 and $365.90.
When I use those numbers I get an EV of $237.15 (26.51%), a SD of $1671.02 (1.87 units) and a RoR of >18%. :confused:
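
Same quick check as before, this time with the calculator's stakes:

Code:
stake_6to1, stake_7to1 = 365.90, 528.50
unit = stake_6to1 + stake_7to1
outcomes = [(4/6, -unit),
            (1/6, 6 * stake_6to1 - stake_7to1),
            (1/6, 7 * stake_7to1 - stake_6to1)]
ev = sum(p * x for p, x in outcomes)
sd = (sum(p * x * x for p, x in outcomes) - ev ** 2) ** 0.5
print(ev, ev / unit, sd, sd / unit)  # ~$237.15 (26.51%), ~$1671 (1.87 units)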

-Sonny-
 

London Colin

Well-Known Member
Sonny said:
It sounds like you understand the definitions just fine. Maybe I'm using them differently.

The bets are mutually exclusive because they cannot both happen. Either one will occur or the other, or neither. The die will never land on both numbers.

The bets are correlated because the occurrence of one has an effect on the other. If you win bet #1 then you also lose bet #2. The amount of money you win is correlated to the other bet.
That's a relief. Glad we are speaking roughly the same language, after all. It was just your use of the word 'but' that threw me. If they are mutually exclusive then they can't be anything other than correlated; 'mutually exclusive but correlated' is a kind of tautology, like saying 'a monkey but a mammal'.:)


Sonny said:
As you said, mutual exclusion is a form of correlation. My concern was that you were talking about two bets that were independent but not mutually exclusive. The formulas in those links seem to be for bets on different sports games, which are not mutually exclusive. If it is a parlay then it is correlated, otherwise not.
As I said, the only 'formula' I've been able to find for mutually exclusive bets is the set of step-by-step instructions (i.e. the algorithm). I'm wondering if the link I posted actually took you to the correct post, or just to the start of the thread, in which case you may not have seen the post I wanted you to see. So I'll quote it directly, just to be sure -

Anyway, here's a plain-English description of the algorithm to calculate Kelly stakes on mutually exclusive events.

  1. Sort all bets by edge, from highest to lowest.
  2. Calculate the fair implied probability for each bet. This is just the reciprocal of the decimal odds.
  3. Starting with the highest edge bet, calculate a running total of the implied probability and the actual probability. The running total for each bet includes the sum of the implied and actual probabilities for that bet and every bet with a higher edge.
  4. If the sum of all the implied probabilities is less than 1 (i.e., a true arb exists), then for each bet the stake will be the actual probability. If this is the case, we can stop here.
  5. If the sum of all the implied probabilities is greater than 1, then for each bet calculate the quotient (1 - sum of actual probabilities) / (1 - sum of implied probabilities).
  6. Find the smallest value of this quotient that’s greater than zero. If no quotient is greater than zero then no bets will be made.
  7. Then for each bet the stake will be the actual probability minus the minimum quotient from 6) above multiplied by the fair implied probability.
I fed the details of my dice game into his Kelly Calculator (including specifying 'Exclusive Outcomes' in the drop-down list at the top left), and it spat out an answer: $528.46 and $365.85.

I then went through the above algorithm by hand (but ignoring step 4, which only makes sense if you have a complete set of bets, covering every possible outcome). It gave me the same answer as the calculator (allowing for rounding errors).
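
In case it helps anyone else, here's my attempt at turning those steps into Python. (Step 4 is skipped for the reason given above, and the max(..., 0) at the end is my own guard against a stake coming out negative, which the description doesn't address.)

Code:
def mutually_exclusive_kelly(bets):
    """Kelly stakes (as fractions of bankroll) for mutually exclusive
    outcomes, following the quoted steps. `bets` is a list of
    (win probability, decimal odds) pairs."""
    # Step 1: sort by edge, highest first (edge = p * odds - 1)
    bets = sorted(bets, key=lambda b: b[0] * b[1] - 1, reverse=True)

    # Steps 2, 3 and 5: running totals of actual and implied (1/odds)
    # probabilities, and the quotient (1 - sum actual) / (1 - sum implied).
    # Quotients whose denominator would be zero or negative are skipped,
    # since step 6 would discard them anyway.
    quotients = []
    sum_p = sum_imp = 0.0
    for p, odds in bets:
        sum_p += p
        sum_imp += 1 / odds
        if sum_imp < 1:
            quotients.append((1 - sum_p) / (1 - sum_imp))

    # Step 6: smallest quotient greater than zero; if none, bet nothing
    positive = [q for q in quotients if q > 0]
    if not positive:
        return [(p, odds, 0.0) for p, odds in bets]
    q_min = min(positive)

    # Step 7: stake = actual probability - q_min * implied probability
    return [(p, odds, max(0.0, p - q_min / odds)) for p, odds in bets]

# My dice example: 1/6 shots at 7:1 (decimal 8.0) and 6:1 (decimal 7.0)
for p, odds, frac in mutually_exclusive_kelly([(1/6, 8.0), (1/6, 7.0)]):
    print(odds, round(frac * 10_000, 2))  # 8.0 528.46 and 7.0 365.85 on $10,000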


Sonny said:
What were the results? Did they correspond to my numbers from earlier? How are you calculating the RoR?

http://www.blackjackinfo.com/bb/showpost.php?p=157091&postcount=15



When I use those numbers I get an EV of $237.15 (26.51%), a SD of $1671.02 (1.87 units) and a RoR of >18%. :confused:

-Sonny-
So now I had two rival answers: ($365.85, $528.46) from the calculator, and your figures of ($338, $427).

Not having any understanding of how to check their validity analytically, I decided to use the brute force and ignorance approach. Using a given pair of bet sizes, my program simply keeps rolling the die and processing the bets until the bankroll is doubled or halved, then repeats the process a few million times to get the probability of successfully doubling the BR before halving it. As I understand it, this should be 66.67%.

I did initially start out counting the failures in a double-or-bust test, hoping to see a result of 13.5%, but I hit a snag. What should you do when your BR gets down to less than you need to make the two bets, but still greater than zero? i.e. at what point is 'ruin' actually reached?

I found it made quite a difference if I switched between two options: quit as soon as you don't have enough BR left, versus play one more round (notionally 'borrowing' the shortfall). So I changed to the double-or-half approach. Possibly a similar issue still lurks in this methodology, a product of the fact that the simulated BR inevitably changes in discrete steps, not in the smooth, analogue fashion which theory might require?

At any rate, the results I got by this method were -
Code:
$365.85, $528.46 : c. 67.4% chance of doubling before halving.
$338,    $427    : c. 72.2% chance of doubling before halving.
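
In case anyone wants to reproduce this, here's a stripped-down sketch of the double-or-half test (not my actual program). Note that the stakes are rescaled to the current bankroll each round, i.e. fixed fractions rather than fixed dollar amounts, which is what the full-Kelly 2/3 figure presupposes.

Code:
import random

def p_double_before_half(frac_6to1, frac_7to1, trials=100_000):
    """Estimate P(bankroll doubles before it halves) when, each round, the
    given fractions of the *current* bankroll are staked on a 6:1 number
    and on a different 7:1 number for a single die roll."""
    wins = 0
    for _ in range(trials):
        br = 1.0
        while 0.5 < br < 2.0:
            a, b = frac_6to1 * br, frac_7to1 * br
            roll = random.randint(1, 6)
            if roll == 1:      # the 6:1 number hits
                br += 6 * a - b
            elif roll == 2:    # the 7:1 number hits
                br += 7 * b - a
            else:              # lose both stakes
                br -= a + b
        wins += br >= 2.0
    return wins / trials

# Fractions of a $10,000 bankroll
print(p_double_before_half(365.85 / 10_000, 528.46 / 10_000))  # c. 0.674, per the figures above
print(p_double_before_half(338 / 10_000, 427 / 10_000))        # c. 0.722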
 