Voodoo ploppies are right, we can shove our math and sims :)

Gamblor

Well-Known Member
#1
Say somebody offers you a game. There are 2 envelopes: one has X money, the other has 2X that amount.

You randomly choose an envelope, open it up, and it contains $100. You're given the choice to take what's in the other envelope instead. Probability theory tells us that we should switch to the other envelope, because the EV is:

0.5 x $50 + 0.5 x $200 = $125

WTF? This completely defies common sense; you would think it should not matter. Taken to the extreme, let's say the envelopes have X or 2X money again. You randomly choose envelope A (and don't peek this time). The EV of envelope B is 1.25 times whatever is in A, so you take envelope B. But then, given the choice to take envelope A back, the same reasoning says its EV is 1.25 times whatever is in B, and so on ad infinitum.

This is the "two envelopes paradox": http://en.wikipedia.org/wiki/Two_envelopes_problem. There is no widely agreed-upon resolution to this problem. At the very least there is no simple explanation, which you would think there would be.
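
For anyone who wants to check the no-peek version with a sim, here's a rough sketch (the $100 scale and trial count are arbitrary choices, not part of the problem). Always switching and always keeping come out the same, which is exactly what makes the 1.25X argument so puzzling:

```python
# Rough sim of the no-peek game: one envelope gets x, the other 2x, you pick
# at random and either keep it or always switch. Scale and trial count are
# arbitrary choices for the sketch.
import random

def play(switch, x=100):
    envelopes = [x, 2 * x]
    random.shuffle(envelopes)
    chosen, other = envelopes
    return other if switch else chosen

trials = 1_000_000
keep_avg   = sum(play(switch=False) for _ in range(trials)) / trials
switch_avg = sum(play(switch=True)  for _ in range(trials)) / trials
print(f"keep:   {keep_avg:.2f}")    # both hover around 1.5 * x = 150
print(f"switch: {switch_avg:.2f}")
```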
 
#2
OK, to the first part, here it goes; it took me 30 seconds to figure this out... you have an envelope with $100, and another envelope that has either $50 or $200 in it. Of course you should switch envelopes, because you are risking $50 for a chance at $100, and it's a 50/50 proposition.

Let's flip a coin... heads you give me $100, tails I'll give you $50. Want to play? OK, now on to the second part, as soon as I can figure out what you are talking about.
 
#3
And after you take envelope B, the EV for switching back to envelope A is either +50 or -100, and you know which it is because you have seen the money in both envelopes... sorry, not much of a paradox from what I can see. Gonna read the Wikipedia article and see if there's anything tougher there.
 

Gamblor

Well-Known Member
#4
christopher1 said:
And after you take envelope B, the EV for switching back to envelope A is either +50 or -100, and you know which it is because you have seen the money in both envelopes... sorry, not much of a paradox from what I can see. Gonna read the Wikipedia article and see if there's anything tougher there.
You're not peeking in the 2nd example, so you don't know how much money is in either envelope. The article might explain it better than my abbreviated example.
 

Canceler

Well-Known Member
#5
It’s late, and I might not be thinking about this right, but…

If we’re not looking inside the envelopes, it seems to me the EV of picking an envelope is 1.5X, and stays that way, even if we contemplate switching to the other envelope.

P.S. I didn’t read the article.
 

Gamblor

Well-Known Member
#6
Canceler said:
It’s late, and I might not be thinking about this right, but…

If we’re not looking inside the envelopes, it seems to me the EV of picking an envelope is 1.5X, and stays that way, even if we contemplate switching to the other envelope.

P.S. I didn’t read the article.
Yes, it might be worth reading the article; I might have summarized things a little too much.

But basically, if you say X is the amount in the envelope you pick, then there is a 0.5 chance the other envelope has 0.5X and a 0.5 chance the other envelope has 2X. Which works out to:

0.5 * 0.5X + 0.5 * 2X = 1.25X

This is a little clearer explanation.

Determining the EV before you pick an envelope is a little tricky. You're right, it's 1.5X if X is defined as the lesser of the two amounts, but if you were to define X as the greater of the two amounts, then the EV is 0.75X of course (0.5 * 0.5X + 0.5 * X).
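
To make the two framings concrete, here's a tiny sketch of the same arithmetic (the $100 figure for the lesser amount is just an arbitrary example):

```python
# Same game described two ways; the $100 / $200 figures are arbitrary.
X = 100                                         # X = the lesser of the two amounts
ev_lesser_framing = 0.5 * X + 0.5 * (2 * X)     # 1.5 * X = 150

Y = 200                                         # Y = the greater of the two amounts
ev_greater_framing = 0.5 * (0.5 * Y) + 0.5 * Y  # 0.75 * Y = 150

print(ev_lesser_framing, ev_greater_framing)    # both 150.0 -- same game
```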
 
#7
Paradox solved

Your first envelope has an expectation of X. You switch envelopes one time, because the first switch has a positive EV of 1.25 * X, but any further switching produces no expected gain, because your envelope is now expected to hold 1.25 * X. There is no gain from switching again.
 

MangoJ

Well-Known Member
#8
To solve the problem, one essential piece of information is missing: the setting.

The game provider will not put an arbitrarily high amount of money into those envelopes. He must have a scale in mind; for psychological experiments at a university, for example, it is usually $15 per hour.

Once you estimate the scale: keep the envelope if its amount is above the scale, and switch if it is below the scale.
You will perform better than the proposed "always switch" strategy.
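
Something like this rough sketch shows the idea; the distribution of amounts and the cutoff are pure guesses on my part, just for illustration:

```python
# Rough check of the threshold idea. The distribution of the smaller amount
# and the $100 cutoff are made-up assumptions just for this sketch.
import random

def play(strategy, lo=10, hi=100, cutoff=100):
    x = random.uniform(lo, hi)        # smaller amount, on an assumed scale
    envelopes = [x, 2 * x]
    random.shuffle(envelopes)
    seen, other = envelopes
    if strategy == "always_switch":
        return other
    if strategy == "threshold":       # switch only if the peeked amount
        return other if seen < cutoff else seen   # looks below the scale
    return seen                       # "keep"

trials = 500_000
for s in ("keep", "always_switch", "threshold"):
    avg = sum(play(s) for _ in range(trials)) / trials
    print(s, round(avg, 2))           # "threshold" should come out ahead
```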
 
#9
The example you give as a paradox doesn't change the problem at all. You never know the amount in the second envelope, so the amount in the first is irrelevant. It is an illusion: X and $100 have the same relative value. If it wasn't a paradox the first time, it wasn't the second. Basic algebra; took me 10 seconds to solve. You now have either 2X or X/2.
 
#10
Gamblor said:
Say somebody offers you a game. There are 2 envelopes: one has X money, the other has 2X that amount.

You randomly choose an envelope, open it up, and it contains $100. You're given the choice to take what's in the other envelope instead. Probability theory tells us that we should switch to the other envelope, because the EV is:

0.5 x $50 + 0.5 x $200 = $125

WTF?
I had trouble with it too. zg
 

Gamblor

Well-Known Member
#11
zengrifter said:
I had trouble with it too. zg
The Monty Hall problem is not a paradox, and it's widely agreed that it's right to switch to the other door.

I had some issue with it too, but the best way to think about it is that you have a 1/3 chance of picking the car. The key consideration is that you have a 2/3 chance of initially picking wrong (a goat).

In the 1/3 case where you already picked the car, Monty can open either of the other doors. But in the 2/3 case where you initially picked a goat, Monty has just removed/shown the other goat, so the car is DEFINITELY behind the remaining door. Thus it's better to switch and take the 2/3 odds.
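
If anyone wants to check the 1/3 vs 2/3 numbers, here's a quick sim under the standard rules (the host knows where the car is and always opens a goat door you didn't pick):

```python
# Monty Hall sim: the host always opens a goat door that you didn't pick.
import random

def play(switch):
    doors = ["car", "goat", "goat"]
    random.shuffle(doors)
    pick = random.randrange(3)
    # Host opens a door that is neither your pick nor the car.
    opened = next(i for i in range(3) if i != pick and doors[i] != "car")
    if switch:
        pick = next(i for i in range(3) if i not in (pick, opened))
    return doors[pick] == "car"

trials = 200_000
print("stay:  ", sum(play(False) for _ in range(trials)) / trials)  # ~1/3
print("switch:", sum(play(True)  for _ in range(trials)) / trials)  # ~2/3
```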
 

Gamblor

Well-Known Member
#12
Maybe I didn't explain it as clearly as I could have, to avoid a long-ish post, but here is how it is formulated. With apparently valid premises, and apparently valid logical and mathematical deductions, it leads to the absurd conclusion that simply by switching envelopes, you increase EV. This is a paradox, and the question is: is there a flaw in this argument?


The switching argument: Now suppose you reason as follows:

1. I denote by A the amount in my selected envelope.
2. The probability that A is the smaller amount is 1/2, and that it is the larger amount is also 1/2.
3. The other envelope may contain either 2A or A/2.
4. If A is the smaller amount the other envelope contains 2A.
5. If A is the larger amount the other envelope contains A/2.
6. Thus the other envelope contains 2A with probability 1/2 and A/2 with probability 1/2.
7. So the expected value of the money in the other envelope is

1/2 * 2A + 1/2 * A/2 = 5/4 * A

8. This is greater than A, so I gain on average by swapping.
9. After the switch, I can denote that content by B and reason in exactly the same manner as above.
10. I will conclude that the most rational thing to do is to swap back again.
11. To be rational, I will thus end up swapping envelopes indefinitely.
12. As it seems more rational to open just any envelope than to swap indefinitely, we have a contradiction.
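
For comparison only (this is not offered as the resolution): writing the same expectations with X as the lesser amount, as in posts #5 and #6, gives no edge either way.

```python
# Same expectations with X = the lesser amount (as in #5 and #6), rather than
# A = "whatever happens to be in my envelope". Comparison point only.
X = 1.0                                  # any positive amount
ev_mine  = 0.5 * X + 0.5 * (2 * X)       # I hold X or 2X with equal chance
ev_other = 0.5 * (2 * X) + 0.5 * X       # the other holds whichever I don't
print(ev_mine, ev_other)                 # both 1.5 * X -- no gain from swapping
```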
 

MangoJ

Well-Known Member
#13
Gamblor said:

The switching argument: Now suppose you reason as follows:

1. I denote by A the amount in my selected envelope.
7. So the expected value of the money in the other envelope is

1/2 * 2A + 1/2 * A/2 = 5/4 * A

8. This is greater than A, so I gain on average by swapping.


I would formulate the paradox more clearly: if the optimal strategy were to "switch" regardless of the selected envelope's content, then this strategy would be equivalent to selecting the other envelope in the first place (without knowing its content).

However, both envelopes are equal, and hence switching envelopes regardless of content cannot be the (only) optimal strategy.

I would say the solution to the dilemma is: any strategy is as good as any other (if you don't have additional information). It doesn't matter which envelope you choose, or whether you switch or not.
 
#14
Gamblor said:
Maybe I didn't explain it as clearly as I could have, to avoid a long-ish post, but here is how it is formulated. With apparently valid premises, and apparently valid logical and mathematical deductions, it leads to the absurd conclusion that simply by switching envelopes, you increase EV. This is a paradox, and the question is: is there a flaw in this argument?


The switching argument: Now suppose you reason as follows:

1. I denote by A the amount in my selected envelope.
2. The probability that A is the smaller amount is 1/2, and that it is the larger amount is also 1/2.
3. The other envelope may contain either 2A or A/2.
4. If A is the smaller amount the other envelope contains 2A.
5. If A is the larger amount the other envelope contains A/2.
6. Thus the other envelope contains 2A with probability 1/2 and A/2 with probability 1/2.
7. So the expected value of the money in the other envelope is

1/2 * 2A + 1/2 * A/2 = 5/4 * A

8. This is greater than A, so I gain on average by swapping.
9. After the switch, I can denote that content by B and reason in exactly the same manner as above.
10. I will conclude that the most rational thing to do is to swap back again.
11. To be rational, I will thus end up swapping envelopes indefinitely.
12. As it seems more rational to open just any envelope than to swap indefinitely, we have a contradiction.
The flaw is in step 9. You could give either envelope the value X, and the other would then be the more desirable choice mathematically, with an average of 5X/4. But you don't get to throw away your initial assumption. If you had assigned X to the other envelope, you wouldn't switch away from your first envelope. Your likelihood of having chosen the right envelope still has not changed (50%), but the expected value of the mystery envelope is always larger than the value of the assumed envelope. That is the paradox: both statements are absolutely true.

As for the Monty Hall problem, let's call it the Deal or No Deal problem. You choose a case and have a 1/26 chance of choosing the $1,000,000 case. If the cases are revealed at random and the million is still in play at the end (only 2 cases remain), your case still has a 1/26 chance of having the million in it, while the other case has a 25/26 chance of having the million in it. Logic dictates that there is a 50% chance that either case has the million, but the odds that you chose the million never change. The chance you chose the right case is 1/26 at any time; the chance that one of the remaining cases has the million is (25/26)/N, where N is the number of other cases (not yours) still in play.

This doesn't seem to make sense until you factor in what happens when the million is revealed before you get to the last 2 cases. Then the probability that any of the remaining cases has the million is zero. If the cases are revealed in a non-random fashion (the revealer purposely reveals the cases he wants, knowing the contents), the whole thing breaks down.

Now do you understand why statistical logic is so confusing? Statistically speaking, this is the correct argument; a similar question was on my statistics final in college. I feel like my head is going to explode.
 
#15
What is the truth?

Wow, I really thought I would have some explaining to do. I chose an extreme example to show the kind of fallacy that statistical analysis can run into. Anyone who remembers my posts on this subject knows I was top of my class in statistics but often followed the wrong logic. The previous post was the right statistical logic to get full credit when answering the test question, but as card counters know, all unseen cards (or cases) are treated the same.

I would say that at the 2-case point in Deal or No Deal you have a 50% chance of holding the $1,000,000, and I would switch cases just in case the fact that my original choice had a 1/26 chance of being right has some memory; but clearly the line of reasoning that would be required to get the question right in statistics class is flawed. Here is the example I use to demonstrate the flaw.

Let's say you bought a lotto ticket. The winners in this drawing are not determined by watching the drawing and comparing numbers to your ticket, but by your ticket being scanned by a machine that shows you the result. When you purchased your ticket you had a 1/12,000,000 chance of winning. You figure you're busy and you probably lost anyway, so you leave the ticket in your wallet and forget about it.

A week later you are watching the news, and the lead story is that someone won the lotto, hasn't scanned their ticket yet, and there are only 2 tickets left to be scanned. Two statements are now true: you still have a 1/12,000,000 chance that you picked the right numbers, and you have a 1/2 chance that you hold the winning ticket. Which statement leads you to the truth about your situation? Statistics can be used to say just about anything.
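
For what it's worth, here is a rough sim of the random-reveal version of Deal or No Deal (26 cases as on the show; everything else is an assumption for the sketch). Counting only the games where the million survives to the final two cases, the fraction where it sits in your own case comes out around one half:

```python
# Rough sim: 26 cases, one has the million, 24 of the other 25 are opened at
# random. Only games where the million survives to the final two are counted.
import random

trials, survived, mine = 1_000_000, 0, 0
for _ in range(trials):
    cases = list(range(26))
    million = random.choice(cases)
    my_case = random.choice(cases)
    others = [c for c in cases if c != my_case]
    random.shuffle(others)
    opened = others[:24]              # 24 random reveals, leaving two cases
    if million not in opened:         # million is still in play
        survived += 1
        mine += (million == my_case)

print(mine / survived)                # ~0.5 of surviving games, it's your case
```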
 

MangoJ

Well-Known Member
#16
tthree said:
I would say that at the 2-case point in Deal or No Deal you have a 50% chance of holding the $1,000,000, and I would switch cases just in case the fact that my original choice had a 1/26 chance of being right has some memory; but clearly the line of reasoning that would be required to get the question right in statistics class is flawed. Here is the example I use to demonstrate the flaw.
Be careful with those examples. I don't know DonD well enough, but the right decision depends on the game rules. If the moderator has to keep the million-dollar box in play as long as possible, I would switch cases as late as possible.
If the moderator just opens random boxes, I would bail out on any reasonable offer (to reduce variance).
 

Gamblor

Well-Known Member
#19
Statistics is pretty straightforward for simple chance events (what's the chance you'll get heads on a coin flip, what's the chance you'll get a royal flush, etc.), but once it gets into the realm of decision theory (where more information is revealed and you have choices), some strange things happen. Either the brain has a hard time comprehending it, or it gets tricky to apply the logic and probability correctly (like Monty Hall), or it leads to strange, possibly paradoxical results (like the two envelopes).
 