Topic: Two envelopes 


A good friend of mine (a mathematics/logic professional) presented this puzzle to me about ten years ago. It is a well-known paradox. It may have been discussed here before; I can't remember. So here it is again: There are two identical envelopes lying on the table. You are told that they each contain a sum of money, and that one of them contains exactly twice as much money as the other. You cannot tell from the appearance or weight of the envelopes (or by any means at all) how much money is involved or which contains the greater sum. You are invited to select one of the envelopes and keep the contents. You pick up one of the envelopes. Before you open it, you are asked if you would like to swap the envelope you hold for the one on the table. What is the best course of action for you? Do you keep what you have, or do you swap?


I do know the reasoning that causes the discussion, so I guess I can present it here. Let's say the envelope you hold has X dollars in it. That means that the other envelope has 2X dollars or X/2 dollars in it with equal probability. Your expected winnings from switching would be 0.5*2X + 0.5*X/2 = 1.25X, while if you keep your envelope your expected winnings are X. So you are better off switching. That of course makes no sense, so what is the error in the reasoning? /Tomas
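The flawed computation is easy to state in code. A minimal sketch (the dollar amount is arbitrary; the suspect step is precisely that the same X stands in for both the 2X and the X/2 branches):

```python
def naive_switch_expectation(x):
    """The paradoxical reasoning: the other envelope supposedly
    holds 2x or x/2, each with probability 1/2."""
    return 0.5 * (2 * x) + 0.5 * (x / 2)

print(naive_switch_expectation(100))  # 125.0, i.e. 1.25X > X
```

Whatever x you plug in, the formula reports a 25% gain from switching, which is what the rest of the thread is arguing about.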


It just shows that probability sucks! You could equally reason that the average amount in the envelope that you hold is (1X+2X)/2 = 1.5X; since the expectation for the other envelope is only 1.25X, you would be better off keeping your envelope!


[quote] On 2006-03-16 02:41, TomasB wrote: I do know the reasoning that causes the discussion, so I guess I can present it here. Let's say the envelope you hold has X dollars in it. That means that the other envelope has 2X dollars or X/2 dollars in it with equal probability. Your expected winnings from switching would be 0.5*2X + 0.5*X/2 = 1.25X while if you keep your envelope your expected winnings is X. So you are better off switching. That of course makes no sense, so what is the error in the reasoning? /Tomas [/quote] This is a very deep paradox! There are many explanations on the web for it. The one I am sticking with is the fact that there is no PDF (probability distribution function) that supports the above condition. This is just on one leg; I will try to find a better explanation phrased by another. nir


For me the error is in saying that the expected value of the one you hold is X while also treating it as a 100% fact that it contains X. You can't use X in both ways, since the expectation of what you hold is not the same as what's actually in the envelope. (Throw and cover a die before looking at it. The expected value is 3.5, but I assure you that it is not showing that.) /Tomas


I would point suddenly behind my friend and shout, "What is that!!!!?" As he turns around, I'd pick up the other envelope and run off with both. :rotf: Greg 


[quote] On 2006-03-15 19:04, Steve Martin wrote: There are two identical envelopes lying on the table. You are told that they each contain a sum of money, and that one of them contains exactly twice as much money as the other. You cannot tell from the appearance or weight of the envelopes (or by any means at all) how much money is involved or which contains the greater sum. You are invited to select one of the envelopes and keep the contents. You pick up one of the envelopes. Before you open it, you are asked if you would like to swap the envelope you hold for the one on the table. What is the best course of action for you? Do you keep what you have, or do you swap? [/quote] The puzzle becomes much more interesting if you are allowed to open your envelope before deciding whether to switch. You know one envelope contains twice as much as the other. You open your envelope and see $20; now you know that the other envelope contains either $10 or $40, so if you switch you either lose $10 or gain $20. So obviously the odds are in your favour if you switch. Regardless of which envelope you are holding???


That's a wonderful and a bit annoying idea, John. So if you switch, your expected gain is 0.5*20 - 0.5*10 = 5 dollars, since you know with 100% certainty what amount you hold at the moment. That reasoning would always tell you to switch, and I can't really see where the error in the reasoning is, but there of course has to be one. /Tomas


Well, here's some info, but my math education isn't good enough to make a whole lot of sense out of it: http://consc.net/papers/envelope.html Is it possible to simplify this explanation? My limited understanding of this article is this: if the amount of money in each envelope is chosen from a restricted range, then it is not necessarily true that the chance of getting the higher amount is equal to the chance of getting the lower amount. For example, if we knew the amount of money in the envelope was chosen randomly from the numbers between 0 and 1000, then if one envelope held an amount around 100, there is actually a greater chance that the other envelope has a number around 200 than a number around 50. Hence, no paradox. This much I think I understand. But I don't get what happens when the range is not restricted. The argument in the article seems to imply that, once again, it is not necessarily true that the chance of expecting the higher number is the same as the chance of expecting the lower number. At least I think that's what the article is saying. But I don't understand the reasoning here. Anyone here able to write Probability for Dummies? (BTW I had the television on in the background when I was writing this. The moment I typed the word Probability, a person on the television simultaneously said Probability! What's the probability of that? Interesting to try to guesstimate.) Jack Shalom


[quote] On 2006-03-22 20:43, landmark wrote: ...if the amount of money in each envelope is chosen from a restricted range, then it is not necessarily true that the chance of getting the higher amount is equal to the chance of getting the lower amount... [/quote] That is exactly what I meant by saying that the PDF has to be fixed beforehand. N.


It sounds like the perfect problem to simulate. I do however think a simulation would prove them wrong. According to that statement you _should_ switch when you see 100 in the envelope, right? I don't have time to make any simulations now, but if someone does, it could look like this:
1. Choose a number from 0 to 1000 with equal probability and store it in A.
2. Put half of that number in B.
3. Flip a cyber-coin to decide if you look in A or B.
4. Increment a counter each time you see 100, and increment another counter if at the same time there is 200 in the other envelope.
5. Repeat millions of times and check the two counters.
If the second counter is more than half the first counter, we know that there is a higher probability of seeing 200 in the other envelope. /Tomas
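The five steps above can be sketched directly in Python (a rough version; the variable names and the fixed seed are my own choices, not part of the original description):

```python
import random

def simulate(trials=1_000_000, seed=1):
    rng = random.Random(seed)
    saw_100 = other_was_200 = 0
    for _ in range(trials):
        a = rng.randint(0, 1000)              # step 1: amount in A
        b = a / 2                             # step 2: half of it in B
        # step 3: flip a coin to decide which envelope you open
        mine, other = (a, b) if rng.random() < 0.5 else (b, a)
        if mine == 100:                       # step 4: count the 100s...
            saw_100 += 1
            if other == 200:                  # ...and how often the other holds 200
                other_was_200 += 1
    return saw_100, other_was_200

seen, with_200 = simulate()
print(seen, with_200, with_200 / seen)  # the ratio hovers around 0.5
```

Under this setup, seeing 100 can come either from A = 100 (other holds 50) or from A = 200 (other holds 200), and the two cases turn out equally likely, which matches the trial counts reported in the next post.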


10127447 trials gave 10149 envelopes with 100 in, and in 5069 of these cases the other envelope contained the 200. (Not sure how good the random number generator is though!) _________________________________________________________________ I think the solution to this problem lies in the fact that you can't lose; you haven't paid anything to play. If we change the deal to "After you have picked and exchanged envelopes, if you wish, you keep the envelope you are holding, but pay me the value of the lesser envelope", then it is pointless switching....? Or... if one envelope is empty, then it is pointless switching (which is the same thing really). I think the mathematical answer is going to be something along these lines: you have to assume that at the end of the game you should pay the ACTUAL VALUE for an envelope; in fairness the ACTUAL VALUE is the value of the total contents of the envelopes divided by 2. Even when you know what your envelope contains ($20), you don't know the ACTUAL VALUE of the envelope that you hold. Your envelope contains $20. If the other envelope contains $10, then the ACTUAL VALUE of each envelope would be $15; stay with what you have and you are $5 up, switch and you are $5 down: a 50/50 bet. Your envelope contains $20. If the other envelope contains $40, then the ACTUAL VALUE of each envelope would be $30; stay with what you have and you are $10 down, switch and you are $10 up: a 50/50 bet. Nevertheless, in the situation given, it still seems advantageous to switch?


[quote] On 2006-03-23 08:09, magicjohn2278 wrote: 10127447 trials gave 10149 envelopes with 100 in, and in 5069 of these cases the other envelope contained the 200. [/quote] So you would win by switching those times, and it's only 0.499 of the times according to the simulation. IF the reasoning in the article above was correct, it should have been noticeably larger than 0.5, but it isn't. My guess is that they were wrong. Thanks for doing the simulation, John! /Tomas


Adding some conditions to the original might make it paradoxical. As it stands, regardless of which you pick, or even if the person offering can tell which is which, they make the offer, hence nothing to compute. Just take one, and of course remember to say thank you before opening your gift envelope. Add in one or both conditions and you get something else. 


... so why is the expression "one contains TWICE as much as the other" so significant? If you are told one envelope contains MORE than the other, then switching looks like a 50/50 proposition. If you were told that one envelope contains 1.5 times the other, then it's a bad bet. If you are told one contains twice the amount, or three times the amount (or more), than the other, it becomes a good bet. Why? If you are told that one envelope contains MUCH MORE than the other, then you would be wary of switching, as you stand to lose quite a lot if you already hold the winning one, but the situation is very much the same? (By MUCH MORE, I think we could infer that one contains more than twice as much as the other.) What's going on?


Simplification: You are given permission to walk into a room and pick up one of two envelopes on a table. You pick up one envelope and hear a speaker in the room announce: "If it's Tuesday, the other envelope contains more money." And then you... Correct me if I'm missing something here; it seems to me that the added bit about either being offered or forced to switch envelopes is impertinent. Till you have some data to work from, there is nothing upon which to base an analysis. Is there some data I've overlooked in this analysis?


You are told that one envelope contains twice as much as the other, so if your envelope contains $20, then the other contains either $10 or $40. Switching appears to be a $10 bet at 2 to 1 odds: switch and risk losing $10 or gaining $20. ... But having done so, you are in exactly the same position...


Using my own logic: 2 identical envelopes, a 1/2 chance of getting the higher money. Knowing what is inside one of the envelopes doesn't affect it, since you have no idea if it's low or high, and that doesn't change the contents of the envelopes... you are still left with a 1/2 chance of getting either. Anything you add to it is going to be wrong imho....


If your analysis leads you to a paradox, at least one of our axioms or premises must be faulty. So pick them out one at a time and find out which led you into the trap. Above, I removed the "let x be the dollars in the envelope in my hand" and replaced it with "at the end, when I open my envelope and enjoy the money, I will not know whether the other one contains half or twice as much". I.e., limited information. That seemed to greatly simplify the analysis and remove the paradox.


To Simplify and Expand: What we have is 2 (X) envelopes with different amounts of money; we pick one, are shown the contents, and are then offered the chance to switch. Removing the double, let's say $1 and $100 (but we don't know whether there are 50 cents or $1000 in the other).
Let's say we always switch:
1/2 we pick $1, see it and switch > WIN $100
1/2 we pick $100, see it and switch > LOSE $1
Let's say we always stay:
1/2 we pick $1, see it and stay > LOSE $1
1/2 we pick $100, see it and stay > WIN $100
So 2 out of 4 we lose, 2 out of 4 we win. Extrapolated: we have 100 envelopes with varying amounts of money; we pick one, are shown the contents, and are given the option to switch to another single envelope.... If we know they are in a certain range, then obviously if you are under the average of the total you should switch.... However, because we have no idea of our envelope's value in relation to the others, you have the same chance of picking the highest on your first 1/100 try as on your second. This means that knowing the value of your envelope is MEANINGLESS to the problem, since you don't know the total range. I.e., you could already have the highest with a $5 if the rest are $1's, or you could have the highest with $1 if the rest are 50-cent pieces.... If any of this is doubted, just go through it again above, where it is seen it's always a 1/2 chance of winning or losing. Adding any more math or logic to a solved problem just creates problems.
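The four cases above can be checked exhaustively rather than argued; a small sketch (the $1/$100 amounts follow the post, the function name is mine):

```python
def expected_value(strategy, low=1, high=100):
    """Average winnings over the two equally likely first picks,
    for a fixed pair of envelope contents."""
    outcomes = []
    for mine, other in [(low, high), (high, low)]:
        outcomes.append(other if strategy == "switch" else mine)
    return sum(outcomes) / len(outcomes)

print(expected_value("switch"), expected_value("stay"))  # 50.5 50.5
```

Both strategies average the same, which is the heart of the "it can't matter" intuition: with the pair of amounts fixed in advance, switching only permutes the outcomes.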


Both Jonathan and Daegs seem to be missing the point. Jonathan, you say that the statement "let x be the dollars in the envelope in my hand" could be the thing creating the paradox, yet that statement is undoubtedly true if you in fact open the envelope and find x dollars. So the question is _why_ the paradox occurs from such a simple and true statement, if that in fact is the statement causing the paradox. Daegs, everyone _knows_ that it should not matter if you switch or not (if you do not have a clue about the distribution). Therefore there must be some error in the reasoning that tells you to switch, since the reasoning clearly shows that you _should_ switch. So the puzzle is to find the flaw in the reasoning that tells you to switch, _not_ to find a reasoning that shows that it doesn't matter if you switch. Again: we _know_ that it is true that it shouldn't matter if we switch. /Tomas


Congrats Daegs! I think you are right. In all fairness to Daegs, he does answer the given puzzle. The particular reasoning discussion was added by TomasB, and Mr. Martin has yet to comment on whether his puzzle was to be modified with TomasB's discussion. If we all agree with Daegs' gut-level reasoning as the solution to puzzle 1, and agree to logically dispute Tomas' flawed-logic puzzle (post #2) and MagicJohn's second post, I think we'll all get along. (Jonathan's right too.)


You guys are too deep for me! Switch it! Don't switch it! Either way, you are now richer by X amount of dollars! You're better off than when you entered the room! Go to the local bar and celebrate your good fortune! 


[quote] On 2006-03-24 23:18, Daegs wrote: Let's say we always switch: 1/2 we pick $1, see it and switch > WIN $100. 1/2 we pick $100, see it and switch > LOSE $1. Let's say we always stay: 1/2 we pick $1, see it and stay > LOSE $1. 1/2 we pick $100, see it and stay > WIN $100. So 2 out of 4 we lose, 2 out of 4 we win. If any of this is doubted just go through it again above, where it is seen it's always a 1/2 chance of winning or losing. Adding any more math or logic to a solved problem just creates problems. [/quote] So in 2 out of 4 cases you would lose $1, and in 2 out of 4 you would win $100; so you seem to have proved that switching is a good idea? Risking losing $1 on a 50/50 chance of winning $100 seems to be the best thing to do! But why?


Did anyone add a calculation to factor in what you would do if the envelope you opened contained an odd number (e.g. $1000.01)? I think the payoff odds should be kept separate from the odds of paying off (50/50). (I just reread John's post above. OOOOW, my head hurts.) Posted: Mar 25, 2006 9:23am - MagicJohn, you are also risking 50 cents if x = 50 cents. A bad bet on a different scale.


So I posed this question to some high school math teachers; the general consensus was this: there's no paradox. Both envelopes are good. Given the information you have, switch as much as you want. Stop switching when you want. Go home with some money. They were not troubled by this at all. I was troubled that they were not troubled. But maybe Jon is right: maybe there is not enough information yet to induce a paradox. Yet Jon, in your counterexample, you say a voice comes on and says "If it's Tuesday...". Well then, a rational person would try to figure out if it was Tuesday and decide on a switch based on that answer. If one had no idea what day it was, then a rational person would not switch, as there is only a 1/7 chance that it is Tuesday, etc. In the same way, the info that there is either double or half the money in the other envelope seems to point to switching, but perhaps [i]this[/i] is the flawed premise; perhaps this information isn't information at all. Perhaps in some way this information is similar to a statement like "the other envelope is twice as wide or half as wide." I don't really believe this, but perhaps this is a path to investigate. Jack Shalom


Oh right, I only skimmed the posts after the first... anyway: [quote]Let's say the envelope you hold has X dollars in it. That means that the other envelope has 2X dollars or X/2 dollars in it with equal probability. Your expected winnings of switching would be 0.5*2X + 0.5*X/2 = 1.25X while if you keep your envelope your expected winnings is X. So you are better off switching.[/quote] The error comes with "0.5*2X + 0.5*X/2 = 1.25X"... that is wrong. You shouldn't use X/2; use X and 2X for your math, since either way one envelope holds double the other. By comparing X/2 to 2X you are in effect doubling the bounds of the other envelope. So 1/2 of the time: 0.5*X + 0.5*2X = 1.5X. The other 1/2 of the time: 0.5*2X + 0.5*X = 1.5X. It's straight down the halfway mark.


A slightly different version of the game, just to confuse things. ;) I tell you that I have placed an amount, unknown to you, in an envelope that I give to you. I then flipped a coin to decide whether I should place half of that amount or double that amount in another envelope. I offer you the chance to switch. Would you? Here's the last time this was up for discussion here, by the way: http://www.themagiccafe.com/forums/viewtopic.php?topic=68360&forum=101 /Tomas
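This variant really is different from the original: the amount in your envelope is fixed first, and a fair coin then creates the second amount, so the 50/50 figures are genuine. A quick sketch (assuming $100 in the first envelope; the amount, trial count and seed are arbitrary choices of mine):

```python
import random

def average_other_envelope(x=100, trials=100_000, seed=3):
    """Average contents of the second envelope when a fair coin
    decides between doubling and halving the known amount x."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += 2 * x if rng.random() < 0.5 else x / 2
    return total / trials

print(average_other_envelope())  # close to 125, so here switching really does pay
```

In this version the 1.25X expectation is not a fallacy, which is exactly what makes the variant so confusing when set beside the original puzzle.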


Daegs and Jonathan, what it all boils down to is this: I give you a sum of money. If you wish, you may toss a coin. If it comes down heads, I double your money. If it comes down tails, you give me half of what you have. Do you toss the coin? Is it worth taking the bet?


Magicjohn, that's an interesting simplification, but I don't think there's any paradox there: flip the coin. Jack Shalom


Don't flip... personally I'd rather have $100 for sure than a very real possibility of $50 (even if I could win $200). The key here is that you don't know what's in the envelope.


Actually, Tomas' link to the prior discussion is worth reading. Arguments all the way around seem more cogent there :) Jack Shalom 


[quote] On 2006-03-25 19:52, Daegs wrote: Don't flip... personally I'd rather have $100 for sure than a very real possibility of $50 (even if I could win $200). The key here is that you don't know what's in the envelope. [/quote] But the point is that you CAN look in the envelope if you wish... even if you don't, you know that the other contains either double or half of what you are holding. So is it to your advantage to switch? ... Probably yes? Posted: Mar 25, 2006 8:23pm - Jack, the situation is exactly the same with the envelopes... If you switch, you either lose only half of what you have, or double it! ... but this cannot be...?


[quote] On 2006-03-25 16:13, magicjohn2278 wrote: Daegs and Jonathan, what it all boils down to is: I give you a sum of money. If you wish, you may toss a coin. If it comes down heads, I double your money. If it comes down tails, you give me half of what you have. Do you toss the coin? Is it worth taking the bet? [/quote] Yes, it's clearly worth taking the bet. On the flip of a coin, I'd take risking half to double my money any day. The paradox seems to have deepened. So is the above coin flip really analogous to switching the envelope? It can't be. Posted: Mar 25, 2006 11:50pm - [quote] On 2006-03-25 19:52, Daegs wrote: Don't flip... personally I'd rather have $100 for sure than a very real possibility of $50 (even if I could win $200). The key here is that you don't know what's in the envelope. [/quote] You would? Why? Unless you needed that $100 for a medical emergency, that's illogical thinking, particularly if you were given this option numerous times. You're risking $50 to win $100. Over time you'd make a killing opting to bet on the coin flip.


Double or half? Coin landed heads? Amount in envelope... nope, that is impertinent. Still... which data are pertinent? Till BOTH are opened, or there is sufficient data to compute expectations, there is nothing to calculate. Let x be the number of washing machines on the street... or what you will: still not pertinent data. If you require proof, put a few singles into envelope pairs and ask the homeless on the street to pick an envelope. Watch what happens. Listen to their common-sense answers. They wind up with a buck or two. And you might get free of an inefficient perspective. BTW the "if it's Tuesday" thing from an earlier post was an example of impertinent data. To make a Monty Hall problem out of this, add an envelope and have fun. :)


A bird in the hand is worth two in the bush.... I'd rather have a sure $100 than a chance at $200 (with a chance of losing $50)... but that's just me. (Yes, if you can do this over and over I'd be fine with going for it... but once, the risk is not worth it; I can't even go out to eat with $50....) Anyway, that is all different from the envelope problem. The problem is the faulty assumption that there are 3 values in play: X/2, X and 2X. [b]There is only X and 2X!!![/b] (or *only* X and X/2) There is not $50, $100 and $200 in play; there is only either $50 and $100, or only $100 and $200. It is bad logic, imho, to say that the other envelope contains either 2X or X/2... it doesn't; it contains only one of those. Let's say you have two envelopes, one with X and one with Y (where Y = 2X). You pick an envelope; there is a .5 chance that you picked X and a .5 chance that you picked Y. If you picked X then the other envelope contains Y; if you picked Y then the other envelope contains X... at no time does X/2 come into it. The other envelope contains ONLY the value it originally started with; it is not changing... EDIT: I give up, this truly is a paradox... you are right, I concede, there is no right answer.... Look at it this way: 1: Your envelope contains X. By switching you either lose X/2 or gain X, so the amount you gain is greater than the amount you lose. 2: The amounts in the envelopes are Y and 2Y; by switching you either lose Y or gain Y, so the amount you gain is the same as the amount you lose. But the envelope you open can be either case, as it's just a constant. Again I give up; this truly is a paradox.


Daegs.. don't give up so easily! I was thinking about this this morning and am beginning to think there may be no paradox here, just bad maths! I am given an envelope, and given the chance to switch it for one containing half or double the one I hold, with 50/50 probability. I think my options are: keep what I have and gain nothing; switch what I have and lose half; switch what I have and double my money. Most people would see that as a reasonable bet and go for the switch. But am I confusing odds and probability here? After all, the bet can only be made once; if we switch twice, there is a 100% probability that nothing has changed. We all know that switching will only give one of two results: you either win or lose some cash. So are the possibilities above a fair representation of the options? Possibly not. Perhaps the options are these (actual figures for illustration only):
Do nothing (holding the $20)... No change
Do nothing (holding the $40)... No change
Switch (holding the $20)....... Gain $20
Switch (holding the $40)....... Lose $20
A 50/50 chance of winning or losing $20 now. So where did my favourable gamble go??? ?


[quote] On 2006-03-26 06:22, magicjohn2278 wrote: Perhaps the options are these (actual figures for illustration only): Do nothing (holding the $20)... No change. Do nothing (holding the $40)... No change. Switch (holding the $20)... Gain $20. Switch (holding the $40)... Lose $20. A 50/50 chance of winning or losing $20 now. So where did my favourable gamble go??? ? [/quote] The above information seems overly simplified. It seems to assume that if you're staring down the $20, the other envelope must contain $40, or that if you're staring down the $40, the other envelope must contain $20. What about the chance of the other envelope containing $80? Or, if you're staring down the $20, of the other envelope containing $10? Now I'm further confused.


The first step in solving a word problem is to clearly state your working premise. In this case: let x be the value in dollars of the currently selected envelope. Then the value in dollars of the money in the other envelope is EITHER x/2 or 2x, and you have insufficient data to determine which. The rest of the analysis appears to be wheels spinning without contact with the ground, and no useful conclusions are reached. The part I find amusing is that folks forget to say "thanks" for the gift of what is inside the selected envelope.


Alternatively: you hold either X or 2X.
Do nothing (holding X)..... No change
Do nothing (holding 2X).... No change
Switch (holding X)......... Gain X
Switch (holding 2X)........ Lose X


I think you've expressed there my problem with the puzzle. I don't know the full explanation for the paradox (which I believe is pretty complex stuff), but it has always seemed to me that if you say "I hold X and therefore the other one contains either 2X or X/2", you are actually describing two different situations: envelope 1: X, envelope 2: 2X; or envelope 1: X, envelope 2: X/2; whereas in reality only one of these situations exists. I think there is a counterargument to this line of thinking, but I can't remember what it is.


But still, both of these statements are true AND fit the problem exactly, though they result in different outcomes, which is a paradox: 1: Your envelope contains X. By switching you either lose X/2 or gain X, so the amount you gain is greater than the amount you lose. 2: The amounts in the envelopes are Y and 2Y; by switching you either lose Y or gain Y, so the amount you gain is the same as the amount you lose.


Maybe if we divided the problem in half we might get a better answer. You walk into one room and have a choice of two envelopes. You take that envelope into another room and have a choice of exchanging it for an envelope that is on the table there. Does this make for less of a paradox?


That depends on whether we know the new envelope is relational to the envelope we have. I *believe* the problem comes from believing that we have more information about our envelope than we do, by assigning it X when we don't know if it's X or 2X (or X or X/2). It's assuming that it's X and then basing the other envelope off it that is the problem, and yet it seems foolish not to be able to assign a variable name to our envelope's money, especially when we know it's a constant. The more you think about it and come up with explanations, the clearer it is that this actually *IS* a paradox. The only way to make this not a paradox would be to explain how we could have an X amount in our envelope and NOT state that the other envelope has either X/2 or 2X inside it; however, given that we know X and know the other envelope holds either half or double, that is impossible.... so it's a paradox. I believe a whole new branch of mathematics would need to be invented to explain this away... I predict this will remain a paradox for at least a couple hundred years....


[quote] On 2006-03-27 00:27, Daegs wrote: I believe a whole new branch of mathematics would need to be invented to explain this away... I predict this will remain a paradox for at least a couple hundred years.... [/quote] [b]Wrong![/b] Read on.... No paradox... just Tomas's faulty maths! :rotf: In his calculation of the expected outcome, where X is the value of the envelope that you start off with: (0.5*2X)+(0.5*X/2) = 1.25X. Because the expected value 1.25X is greater than X, we assume that this shows that it is more favourable to switch. It doesn't! The maths is flawed! The expression (0.5*X/2) = 0.25X is in error. It gives a positive expectation of winning on a bet at 1 to 2 odds (and you don't get your stake back). That is to say, for each bet of 2 units, if you win, you only get paid 1 unit, which obviously has a negative expectation; that is the equivalent of the X/2 case, but the negative outcome isn't reflected in the calculation. It shows a 0.25X positive outcome. ... And before you say, "Well, now the calculation shows that it is less favourable to switch, so that is just another paradox!", it doesn't! It is just the maths that is at fault! ... Pleased that I cleared that one up! And as for this essay: The Two-Envelope Paradox: A Complete Analysis? David J. Chalmers, Department of Philosophy, University of Arizona, Tucson, AZ 85721. Well! :rolleyes: Posted: Mar 27, 2006 6:17am - The fundamental error in the calculation is saying that if my envelope contains X, then the other contains EITHER X/2 or 2X. If I hold the envelope with the higher amount, then the other [b]never[/b] contains 2X; likewise, if I have the lower amount, the other [b]never[/b] contains X/2.


Well, as I said above, that is what I have always had a problem with in this puzzle. But I have been told that this argument does not get to the root of the paradox.


Perhaps: probability of making a profit by switching = possible gain minus possible loss: P = ((2X) - X) - (X/2) = 0.5X ?


As has already been mentioned here, the paradox is resolved by noting that the probabilities of there being 2X or 0.5X in the other envelope are not 50%/50%. The thing is that you may not know what they are, but it's irrelevant.
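This point can be illustrated with any concrete prior. A sketch assuming the smaller amount is uniform on 1..500 dollars (my choice of prior, not part of the puzzle): once you condition on the amount you actually see, the 50/50 split disappears; for instance, any amount over 500 must be the larger of the pair.

```python
import random

def chance_other_is_larger(trials=200_000, seed=11):
    """For a bounded prior (smaller amount uniform on 1..500, the other
    its double), estimate P(other envelope is larger | you saw > 500)."""
    rng = random.Random(seed)
    seen_big = other_larger = 0
    for _ in range(trials):
        s = rng.randint(1, 500)
        pair = (s, 2 * s)
        mine, other = pair if rng.random() < 0.5 else (pair[1], pair[0])
        if mine > 500:
            seen_big += 1
            if other > mine:
                other_larger += 1
    return other_larger / seen_big

print(chance_other_is_larger())  # 0.0: over 500 you always hold the larger amount
```

The same conditioning effect, in milder form, skews the odds for every observable amount under a bounded prior, which is why the flat "2X or X/2 with equal probability" assumption cannot hold across the board.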


Anyone here want to do the tableaux demonstration? false(it is to your advantage to switch envelopes); false(the other envelope has more money inside); and break it down from there? What one expects from this tree structure are the suppositions and presuppositions which might make a "world" where the original statement in parentheses could be "true".


[quote] On 2006-03-25 16:13, magicjohn2278 wrote: Daegs and Jonathan, what it all boils down to is: I give you a sum of money. If you wish, you may toss a coin. If it comes down heads, I double your money. If it comes down tails, you give me half of what you have. Do you toss the coin? Is it worth taking the bet? [/quote] This is exactly the reasoning that leads to the paradox. In the above situation the probability of the coin coming down tails is 50%, while with the envelopes that is not the case. This apparent similarity tricks us into admitting that the paradox exists, while it does not. In the above situation a rational player should take the bet, but this situation is not identical to the one after choosing the first envelope.


Ok, going back to the "paradox" (Thanks Tomas). Let's say there are X in the chosen envelope. Then there is a 50% chance that there is 2X in the other envelope and a 50% chance that there is X/2 in the other envelope. The expected value (calculated according to the definition of expected value) is 0.5 * 2X + 0.5 * X/2 = 1.25X, which means that you are better off switching and taking the other envelope. This can't be true, so the puzzle is to find the error. That's "all". /Tomas Bad Maths! The Expected Value has to be your expected winnings LESS your expected losses. This is fine in the first half of the calculation: you start with X and end with 2X. But the second half is wrong. You are "winning" X/2 for a stake of 1X, which you don't get back, so the expected value of this bet is your "winnings" X/2 less your loss of X/2. And let's face it, if you bet 1 unit at 1:2 odds often enough, eventually you will end up with nothing! So the Expected Value is: (0.5 * 2X) + (0.5 * (X/2 - X/2)) = 1X How's that?
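This disagreement over the expected value is easy to put to a brute-force test. Below is a minimal Python sketch (the amounts 100 and 200 are arbitrary assumptions, not from the thread) comparing always keeping the first envelope against always switching:

```python
import random

def play(n_trials=100_000, switch=False, small=100):
    """Average payout when one envelope holds `small` and the other 2*`small`."""
    total = 0
    for _ in range(n_trials):
        envelopes = [small, 2 * small]
        random.shuffle(envelopes)      # the player's pick is random
        chosen, other = envelopes
        total += other if switch else chosen
    return total / n_trials

random.seed(1)
stay = play(switch=False)
swap = play(switch=True)
print(stay, swap)  # both hover around 150, i.e. 1.5 * small
```

Both strategies average the same 1.5 times the smaller amount, which matches the intuition everyone in the thread shares; the puzzle is only in locating the flaw in the 1.25X argument.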


Sergey, you say that if I hold an envelope with X dollars, it is not a 50-50 bet that the other envelope contains less or more than X dollars. Why do you say that? You haven't explained why you think that. I don't agree with you on that point (unless you are referring to the consideration in my footnote 1 below). Here is what I think: Let's say the envelope you are holding contains 20 dollars. Then there are two possible scenarios: 1. One envelope contains 20 and the other contains 10. 2. One envelope contains 20 and the other contains 40. That cannot be disputed, as it is stated in the original question. The thing is, you don't know which scenario actually applies. IF WE ASSUME (*** see footnote 1) that each scenario is equally likely, then without question it is just as likely that the other envelope (the one you don't hold) contains 40 as it is that it contains 10. Therefore, if you swap, you stand to either gain 20 (going from 20 to 40) or lose 10 (going from 20 to 10) with equal probability. If we define a good bet as one in which you stand to gain more (if you win the bet) than you lose (if you lose the bet), then clearly in this case it is a good bet to swap envelopes. So you swap, and now hold the other envelope. You know that this contains either 10 or 40, and that both are equally likely (which is true). And you know that the other envelope contains 20. You are now invited to swap again. If you swap, you stand to either gain 10 (going from 10 to 20) or lose 20 (going from 40 to 20) with equal probability. In swapping, you would therefore stand to lose more than you would win, so clearly it is a good bet to KEEP the envelope you now have. So we know that the best thing to do is to choose one envelope and then swap ONCE. But because it is just as likely that you would initially choose one envelope as the other, this strategy could leave you holding EITHER envelope after the swap (+++ see footnote 2).
Therefore, what we are really saying is that it is no better to choose one envelope or the other, and no better (given the opportunity) to swap or not. I believe this solves the apparent paradox (at least, in non-mathematical language). ==== *** footnote 1: As we know that there is not an unlimited amount of money in the world, then clearly it is not, strictly speaking, always true to say that it is equally likely that the other envelope contains more money or less money than the one you hold. e.g. If you know the one you hold contains $1000,000,000,000,000,000,000 (or some appropriately huge sum) this may affect the probability that the other contains twice that amount. +++ footnote 2: Let me clarify that point: Let's say the envelope you initially choose contains 10 dollars. BECAUSE YOU DO NOT KNOW WHAT THE AMOUNTS INVOLVED ARE, there are two possible scenarios: 1. One envelope contains 10 and the other contains 5. 2. One envelope contains 10 and the other contains 20. ... and therefore all the above argument applies equally as well to this situation as it does to the 10, 20, 40 situation. In the above, I have used the figures 20, 40 and 10 for clarity. We could just as easily use X, 2X and X/2. The argument holds for any value of X.


Steve, the problem with defining the flaw in the logic is trying to define everything in relation to the envelope you have (X). In the cases where you swap, in half of them X is the bigger value and in the other half X is the lesser. The only time you win is when X is the lesser, and you win another X. On the other hand, you lose only in the cases where X is the greater value, and here you only lose half of X, but in these cases X is worth double what it was in the first case, so you are actually losing exactly the same! ...but try expressing that as a mathematical equation!


For some reason I've been having images of Schrödinger's cat hiding in one envelope and Alice's Cheshire Cat hiding in the other. Is that about right?


... no, there is a male dog in one and a dog of indeterminate sex in the other! 


All that is fine and dandy, but please explain these two statements then: 1: Your envelope contains X. By switching you either lose X/2 or gain X, so the amount you gain is greater than the amount you lose. 2: The amounts in the Envelopes are Y and 2Y, by switching you either lose Y or gain Y, so the amount you gain is the same as the amount you lose. They both have true premises, yet have contradictory results.... a paradox. 


Daegs, Well, exactly - that is indeed the paradox. My post above is an attempt to explain it in terms of what is really going on in the process of selecting and swapping envelopes. It demonstrates that, despite the apparent paradox, there is no advantage to choosing either envelope, or choosing to swap or not (which, of course, we all know instinctively anyway). But as I say, it is a non-mathematical explanation. I believe the mathematical explanation goes rather deeper - way beyond our considerations (so far) of X's and Y's.


Daegs, Statement 2 is almost correct. 2: The amounts in the envelopes are Y and 2Y; by switching, the amount that you GAIN would be: (Y - 2Y) = -Y or (2Y - Y) = +Y. You already have one envelope containing either Y or 2Y and have to give it away to get the other. If you give 2Y your "gain" is -1Y, so you have 1Y... if you give 1Y your gain is +1Y, so you have 2Y. (As you said, the amount you gain is equal to the amount that you lose.) Statement 1 is only half way there. 1: Your envelope contains X. By switching you either lose X/2 or gain X, so the amount you gain is greater than the amount you lose. Wrong! The amount you gain (profit) is the same as the amount you lose! If you switch... it costs the contents of your envelope to play: -1X. Half the time you win.... How much? Your winnings are 2X... but you still owe the contents of YOUR envelope to play (-1X)... so your expected profit is +1X. If you switch... it costs your envelope to play: -1X. Half the time you lose.... How much? Your "winnings" are half of X... but if you have 1X and win half of X, you have also LOST half of X. (You started with 1X and now have X/2. Winning half of X AND losing half of X is the same as not winning or losing anything... AND you haven't yet paid to play! (-1X)) .... so your expected profit = X/2 (win) - X/2 (loss) - 1X (bet) = -1X. So the expected gain by switching is 0!


John - your maths is all over the place :) If I give you £1 to keep and tell you you can then swap it for the coin in my hand (which you have not seen), which I tell you is either £2 or 50p... If you swap and I had a 50p coin, you LOSE 50p. If you swap and I had a £2 coin, you GAIN £1. As each of the above is equally likely, your EXPECTED gain in pence is (100 + (-50)) / 2 = 25p. This is not the same as an expected gain of zero, as you claim.


Ok... so let's say there are 3 envelopes: $50, $100 and $150. (This obviously is a worse deal than $50, $100, $200.) Would not the offer to switch here (to $150 or $50) be even? By your own formulas: $150 - $0 - $100 = $50 and $50 - $50 - $100 = -$100. Using your formula, that means that a $100 to $50-or-$150 switch is actually unfavorable, but it's simple to see that: if you take $100 away, you start with 0 and have +$50 and -$50. We all know that if you flip a coin for even money, you'll come out even in the long run, but by your math it says you're a loser.... I have to disagree with what you've posted; it is illogical!


[quote] On 2006-03-27 15:27, Steve Martin wrote: John - your maths is all over the place :) (What, mine too!?) If you swap and I had a 50p coin, you LOSE 50p. If you swap and I had a £2 coin, you GAIN £1. As each of the above is equally likely, your EXPECTED gain in pence is (100 + (-50)) /2 = 25p [/quote] You have to offset my expected gain against my expected loss. Have a pound, win £2 = £1 gain. ..But then what happens? With your calculation, there is no way you can describe the 50% chance of me ending up with half what I started off with as a GAIN! If that was my gain, what was my loss?


Gains should not be put in the same pot of stew as losses. That's what screws up the logic. Andrei 


John, The expected gain (as I said) is: (100 + (-50))/2. The 100 is a positive gain. The -50 is a negative gain (i.e. a loss). There is no problem in describing them both as gains - it is just that one of them is a negative gain.
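Steve's 25p figure can be checked with a quick simulation (a sketch under his stated 50/50 assumption about the hidden coin):

```python
import random

random.seed(0)
n = 100_000
# Hold £1 (100p); the offered coin is £2 (200p) or 50p with equal probability.
total_gain = sum(random.choice([200, 50]) - 100 for _ in range(n))
avg_gain = total_gain / n
print(avg_gain)  # settles near (100 + (-50)) / 2 = 25 pence
```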


If I place a £1 bet that pays 1/2 my stake every time I win, I am going to win 50p every time I win... AND I am going to lose 50p every time I win. So my expectation of ever making a profit has to be zero. So I have a £1 coin... My PROFIT or LOSS if I swap... You have a £2 coin: I GET £2 and LOSE my original £1. (+£1 PROFIT) You have a 50p coin: I GET 50p and LOSE 50p and LOSE my original £1. (-£1 LOSS) As each of the above is equally likely, my EXPECTED gain in pence is (100 + (50 - 50 - 100)) /2 = 0p ...Comments...?


"You have a £2 coin, I GET £2 and LOSE my original £1. (+£1 PROFIT)" Yes. "You have a 50p coin, I GET 50p and LOSE 50p and LOSE my £1. (£1 LOSS)" No. Why are you saying "and LOSE 50p" in the second statement? All you are getting is 50p, and all you are losing is £1. That's a net loss of 50p. 


If you bet £1 on the toss of my double-headed coin and I give you 50p on heads but take your stake, you are in a win-win situation: every time you play you "win" 50p. But what is your expectation of making a profit?


Why do I feel like I have to type my posts 3 times before someone addresses them, lol... Would you all agree that if you pay $1 to win $1.50, or win $.50, you are getting even odds? I give him a dollar and half the time I get $1.50 back, and half the time I get $.50 back... So over time I'll even out and I'm basically paying $1 to get $1. So can we agree that that case is even money? And if that is even money, then how can the same bet earning $.50 MORE on the win ($2 paid instead of $1.50) not be ABOVE even money and a good bet? The only way you can prove that a $1 -> $.50-or-$2.00 scenario is equal would be to prove either that it is the same as a $1 -> $.50-or-$1.50 scenario, or that the $1 -> $.50-or-$1.50 scenario is actually a losing proposition....
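Daegs's two bets can be compared directly. A small sketch (stakes and payouts as stated in the post; the helper function is illustrative only):

```python
import random

random.seed(0)

def avg_profit(payoffs, stake=1.0, n=100_000):
    """Average net profit of repeatedly staking $1 for one of two equally likely payoffs."""
    return sum(random.choice(payoffs) - stake for _ in range(n)) / n

even_bet = avg_profit([0.50, 1.50])    # pay $1, get $0.50 or $1.50 back
better_bet = avg_profit([0.50, 2.00])  # same bet, but the win pays $2.00
print(even_bet, better_bet)  # roughly 0.00 and 0.25
```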


Does it help to imagine this as done with three boxes. One has a quarter, one a half dollar and one a silver dollar. Just before we start, we take two of the three boxes and give them to the host. The host enters the room and greets the guest with two boxes and an offer. The guest accepts one box as a gift, then the host makes the announcement about the possible relative values of the contents of the boxes and offers an opportunity to switch boxes. Does this help vanish the paradox for you? 


I agree with Tomas: the paradox isn't in whether it's really a 50/50... it's the fact that you can state, in multiple ways and with completely true statements and logic, how it's NOT 50/50...


Daegs - yes, you and I are in agreement. Your post at 9.22pm on the 27th is fine. It agrees with what I have said, and what Tomas said on page 1 of this thread. In the original problem, as posed, it can be shown that to swap IS a good bet, because the expected gain is 0.25X (where X is what you originally hold). (John is trying to convince us, in a very strange way, that it is an even bet... I simply do not agree, mathematically, with what he is saying, as he is counting the loss of 50p twice in his calculation.) As we have already said several times, the apparent paradox is that although swapping can legitimately be shown to be a good bet, we instinctively know that to do so is nonsensical, since we could have chosen either envelope in the first place.


Daegs... I haven't been ignoring your post, I've been trying to come up with a convincing argument that a (-X/2 profit) is a win of ½ and a loss of ½ at the same time, with a net return of 0... (Still thinking about it.) Your scenario with three envelopes is fair enough, and your argument seems sound. But in that scenario I know that the difference between the envelopes is $.50, and I know that I hold $1, so switching will indeed be an even bet. In the two-envelope scenario, even though it appears to be more favourable, it isn't (as we all know). The difference is that the only time that I can gain anything is when I hold the lower-value envelope (and I am going to gain the value of the lower envelope), and the only time I lose is when I hold the higher (and I am going to lose the value of the lower envelope) - an even bet. But this doesn't wash with the "lose half or double your envelope" paradox... and it makes it very difficult to relate your expected win or loss to the contents of your envelope "X".


Steve Martin, it's not just 'instinctively' that we can make the clear, definite statement that a swap brings in no extra profits in the long run. The question here is why our APPARENTLY sound logic leads us to the conclusion that a swap IS profitable. I have the answer, but, while it's not sophisticated, it is difficult to word. English is my second language. I will do my best to word it properly and then I will post it. Andrei


"The question here is why does our [b]apparently[/b] sound logic lead us to the conclusion that a swap [b]is[/b] profitable." Absolutely. As I said, I have given an analysis of it in plain English, rather than using mathematical language. The paradox is a paradox even if you only count our instinctive knowledge (along with the expected mathematical gain). As I (and others) have said before, there is a deeper mathematical treatment possible to address the issue. None of us has gone anywhere near it. There are many web pages that address it. I look forward to hearing your thoughts. Posted: Mar 28, 2006 8:10am  John, You say: "I’ve been try to come up with a convincing argument that a ( – X/2 profit) is a win of ½ and a loss of ½ at the same time with a net return of 0" How can a negative profit possibly be the same as a breakeven situation? If you can prove that, please come and do my accounts for me. ;) 


Hi Steve.... The problem that I am having is disproving Tomas's original argument that your expected return by switching is (0.5 * 2X) + (0.5 * X/2) = 1.25X, so it is favourable to switch. There is clearly a mistake in the calculation. The premise is that 50% of the time, when you switch for the X/2 envelope, you will show a gain of 0.25X. Now I could just argue that Tomas made a mistake, i.e. his calculation should have represented switching for the X/2 envelope as a potential loss, and so should have been (0.5 * 2X) + (0.5 * -X/2) = 0.75X.... Which only creates another paradox, in that now I have "proved" that it is unfavourable to switch! So which calculation is right? (The answer is obviously neither! Or possibly a bit of both!) So I think that there is an argument (somewhere) for saying that switching X for X/2 (50% of the time) results in a gain of 0.25X and a loss of -0.25X at the same time! (Showing a net gain of 0.) (If this can be demonstrated, it eliminates the paradox.) ... but I'm finding it rather hard to justify!!


There is nothing wrong (and no mistake) with the maths of the equation: (0.5 * 2X) + (0.5 * X/2) = 1.25X which implies that it is favourable to switch. Solving the paradox is not about disproving that this equation holds. 


And just to be clear: the above is the expected value of the other envelope, calculated according to the definition of expected value. If you want the expected value of your _gain_, it is expressed as: 0.5 * (2X - X) + 0.5 * (X/2 - X) = 0.25X which is positive. This of course is not a correct reasoning either (but the expression is absolutely correct IF there is a 50/50 chance of 2X and X/2 in the other envelope), so the puzzle is to find what's wrong in the reasoning. There were some good explanations in the old Two Envelopes thread. /Tomas


The reasoning derails because even if you STICK with your choices, all the time, you will still have a 1.25X profit, because X is not a constant but a probabilistically determined variable. Totally counterintuitive. I'm working on a written-out explanation. I might have to use some advanced maths for it, contrary to my initial beliefs. Andrei EDIT: Or, better said, you will always have an X profit, even if you switch. Calculating the average as (X/2 + 2X)/2 is mathematically unfounded in this particular case, because X is always changing.


Andrei...you open your envelope and find X in it. Let's say you open and find 100 dollars. Put that instead of X in the expressions for the expected value of the gain or the expected value of the other envelope. Thanks John for suggesting that you actually check what X is. It makes the problem so much better. /Tomas 


[quote] On 2006-03-28 11:10, TomasB wrote: It makes the problem so much better. [/quote] ... I think you mean "It makes the problem much more difficult!"


I'll try to keep this as maths-free as possible. Keep in mind that this is a simplified version of the whole thing, but studying it carefully MIGHT help you realize that there is really no paradox to begin with, because the premise, and the deductions, are poorly made. A guy walks into a room and sees two envelopes. He knows how the game works, and he thinks that if he switches all the time, he'll beat it. So he'll switch all the time. For simplicity's sake, I will assume that the envelopes contain either 50 and 100, or 100 and 200. You will see why in a second. Further, we must assume that the guy's choice will be perfectly random. He will not bias one or the other of his sides (left, right). Let us also assume that he is given 50 picks from the 50/100 combination and 50 picks from the 100/200 combination. Again, this would not be the case in real life, but to simulate real life we'd need a number of variables equal to double the number of picks, which would be hard to put succinctly. Chances are, from his first 50 picks, he will pick the 50 envelope 25 times and the 100 envelope 25 times. From his next 50 picks, he will pick the 100 envelope 25 times and the 200 envelope 25 times. Proportionally, we have: 50: 25 picks; 100: 50 picks; 200: 25 picks. But let's not forget that he SWITCHED THEM ALL. 50 turns into 100. 200 also turns into 100. Half of his 100s turn to 50s and half turn to 200s. So: 50 -> 100: 25 picks; 100 -> 50: 25 picks; 100 -> 200: 25 picks; 200 -> 100: 25 picks. By adding them all up, we come right back down to our initial distribution; the only difference is that they've been switched around more, so the guy wasted more time, but ended up with the same money. If you're thinking "Well, that's all pokey. But where's the error in the 1.25X statement?", the short answer is that if you assume that X will always be the same, and calculate your average this way, then you are assuming that the guy will always pick the envelope with the same amount.
You're turning a random choice into a choice which biases a specific result. (The reason for that, and the diabolic nature of this little puzzle, is that you're biasing it with the intention of making it easier to calculate.) I hope I was somewhat clear. Let me know if there's anything I should expand on. Andrei
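Andrei's bookkeeping can be written out as a short sketch (the 50/100/200 values and the idealized 25/50/25 split are his assumptions from the post above):

```python
# Idealized 100 picks before switching: value -> number of picks
before = {50: 25, 100: 50, 200: 25}

after = {}
after[100] = before[50] + before[200]  # 50 -> 100 and 200 -> 100
after[50] = before[100] // 2           # half the 100s came from 50/100 pairs
after[200] = before[100] // 2          # the other half from 100/200 pairs

total_before = sum(value * count for value, count in before.items())
total_after = sum(value * count for value, count in after.items())
print(total_before, total_after)  # 11250 11250: switching moved no money
```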


Andrei - it seems to me to be the perfect answer. Thanks!


That's not really explaining where the error in the paradox is, since John's suggestion is valid. Let's say you open the envelope and find 100 dollars. You do _not_ find 200 dollars and you do _not_ find 50 dollars. Let's focus on only the times you play and open the envelope and find 100 dollars. Will there be a 50/50 shot of there being 50 dollars or 200 dollars in the other envelope? John's simulations seemed to show that there actually is, and that has the strange result that the expected value of the other envelope is 125 dollars. Again, I _know_ how to show that it should not matter if you switch, but I can't put my finger on the error in the above reasoning. Let's apply this to the scenario Andrei wrote of. We focus ONLY on the times X=100. In 25 of the cases we switch and find 50, according to him. In the other 25 cases we switch and find 200. The average of all those 50 trials when we actually find 100 will be (25*50 + 25*200)/50 = 125. /Tomas


Like I said, if you're picking out the times where you find 100 dollars in it, then you can't go ahead and extrapolate that over extended periods of time. The mistake is that you focus ONLY on the times X=100, because for those times, sure, you're better off switching. But if the picks are random, you have to take into account what happens when you ALSO pick the other alternatives. Otherwise, like I said, the choice is not 'random' any more. Andrei


But Andrei, what if we don't *know* that 50, 100 and 200 are the values? That is a key element of the problem: we don't know what is in the envelopes, except that one is double/half the other. So if we open up the envelope with 50, the other could be 25 or 100, and if we open up 200, the other could be 100 or 400.... The key element of the problem is that you don't know what's in the other envelope.


If the values are all different, then the math becomes very complex but blurts out the same result at the end. Remember, 'we' are gods in our little experiment, and while we know what the values are, THE PLAYER DOESN'T. Again, we know because we were keeping track, and we are omniscient, but the player didn't, and he exhibited perfectly rational behavior, i.e. either all switch or all stay. Andrei 


Just musing... the opposite of doubling something is halving it. ... the opposite of halving something is to double it.... so why aren't 2*X and X/2 equal and opposite? .. beats me! 


[quote] On 2006-03-28 13:05, Andrei wrote: ..the times X=100, because for those times, sure, you're better off switching. [/quote] Ok, this is interesting. I just got you to agree that the times we find 100 dollars in the envelope we are better off switching. Read my post again that made you say that, and read 44 dollars instead of 100. All of a sudden you will agree that the times we find 44 dollars in the envelope we should switch. Read that post again but imagine that I wrote 230 dollars instead of 100 dollars. You will now agree that we gain by switching the times we find 230 dollars in the envelope. I find it really hard to point out the error in the reasoning once you have agreed that we gain by switching the times we find 100 dollars. Once you agree to that, you are trapped in the reasoning that tells you to switch whatever you find in the envelope. It's strange indeed. /Tomas


Aha! The monkey holds the coconut! The monkey is trapped trying to compute gains and losses based upon an unknown and random variable. If this monkey lets go, he has nothing. Starting the story again, he takes an envelope and has on average 1.5X, meaning half the trials X and half the trials 2X. But wait, what about exchanging envelopes? Well, the monkey puts down the envelope, so again has nothing. Then he gets either X or 2X again. This monkey is satisfied and is going for a banana. :)


Tomas, Again, you're better off switching if you hold 100 and you KNOW that the other envelope has 200, which never occurs in the real scenario because you never actually know what's in the other envelope. That was the error I was trying to point out. So, in other words, you must generate a plausible scenario where you distribute the picks properly. If you don't, and you say "let's analyze only the X=100 scenario", you're making the mistake I'm about to (intentionally) make with the following analogy: I want to prove that when you flip a coin and catch it, you can ALWAYS get it to fall tails if you flip it round once more (just one full rotation). To analyze whether this is true or not, I will only look at the cases where it falls heads. Obviously, by turning it round, it will fall tails. Thus, my hypothesis has been proven. Obviously, it hasn't, but unfortunately, in the two-envelope mystery, it is not as "obvious" that the very same mistake is being made. Whenever you choose to fix one value, you are annulling the 'randomness'. If you say "then assume it's 44, then assume it's 230", well then that's okay, but you're just giving out a predetermined set of fixed values, upon which you CANNOT calculate the (X/2 + 2X)/2 average, because there exists no average to speak of, as you've selected no variables which have random probabilities. You're just giving out a set of constants. Andrei


Andrei, it is ok to speak of expected values and conditional probabilities in probability theory. Do you agree that if you find 100 in the envelope, you will have 50 in the other envelope half of the time and 200 the other half of the time? You did state that yourself, as that happened 25 times each in 50 trials. /Tomas


TomasB - I hope I can be clearer now. Let's assume you play 100 rounds of this game, and you find 100 in your envelope every time. Yes, half the time the other envelope will be 50, half the time the other envelope will be 200. However, and here is the absolutely crucial point, which explains the whole apparent paradox (please try to grasp what I'm saying, and I know it will take some effort because I feel myself limited when it comes to clarifying my point, and I know it doesn't come through perfectly): If you play 100 rounds of this game with those types of envelopes, you will NOT always draw the one with 100 in it. That is what explains everything. You are generalizing an ungeneralizable situation. Andrei


Andrei - I think I understand what you are saying. All along we have been saying: if you find X in the envelope, then the other envelope contains either 2X or X/2. Well, that is true... but the fact is it can't be both; it has to be one or the other, depending on the actual state of affairs. VIEW A - If the envelope you hold contains $100, then there are two possibilities: 1. $200 and $100 2. $100 and $50 The "X" you are talking about having found is, in fact, in case 1: $100 (when it could equally have been $200), and in case 2: $100 (when it could equally have been $50). VIEW B - Now... supposing we hold an envelope with $100 in it. By the same logic as above, there COULD have been two possible sets of envelopes as follows: 1. $100 and $50 2. $50 and $25 ... and so if that were the case, we could NOT say that it was equally likely that the other envelope contains $50 as $200, since $200 actually plays no part in the scenario. All we can say is that on this occasion, we happened to choose the envelope that contained the larger amount. If we had chosen one with $50 in it (i.e. the "middle" value of the three values), we would have been in an equivalent situation to the one described in VIEW A. What the above means is that, having chosen an envelope, you cannot automatically include in your subsequent deliberations a pair of envelopes that does NOT, in fact, apply (which is what we HAVE been doing in that equation for the expected gain).


[quote] On 2006-03-28 15:44, Andrei wrote: ... If you play 100 rounds of this game with those types of envelopes, you will NOT always draw the one with 100 in it. That is what explains everything. You are generalizing an ungeneralizable situation. [/quote] So let's say that you play the game N times, where N is a huge number, after which you have found 100 dollars in your envelope exactly 100 times. Will you have found 200 dollars in the other envelope 50 of those times? John's simulations showed that that is the case. So in probabilistic language: the conditional probability, given that 100 dollars is found, of the other envelope having 200 dollars is 0.5, and the conditional probability of the second envelope having 50 dollars is 0.5. Please write your own program to verify whether that is the case. If that shows to be true, then I think you can change the parameters of the program to only count the times you find 50 dollars in the envelope. You will probably find that 50% of those times you find 25 dollars in the other envelope, and the other 50% of the times you find 100 dollars in the other envelope. I'll repeat John's numbers: "10127447 trials gave 10149 envelopes with 100 in, and in 5069 of these cases the other envelope contained the 200." That _actually_ means that if he never switched the times he got 100, his average would be 100. If he always switched his 100, his average would be 124.92 dollars. Anyone else care to simulate?
;) Since I know the definitions of expected value and conditional probability I can't see any flaw in the way I'm using them, yet I know there has to be a flaw. But what is it? Wonders, /Tomas 
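Taking Tomas up on the invitation: here is a sketch of such a simulation. The prior has to be made explicit somewhere; assuming (arbitrarily) that the smaller amount is 50 or 100 with equal probability reproduces John's numbers, and shows that the 125-dollar conditional average follows from that flat-prior assumption:

```python
import random

random.seed(42)
found_100 = 0
other_200 = 0
switch_total = 0
for _ in range(400_000):
    small = random.choice([50, 100])  # assumed flat prior on the smaller amount
    pair = [small, 2 * small]
    random.shuffle(pair)
    mine, other = pair
    if mine == 100:                   # condition on finding 100 dollars
        found_100 += 1
        switch_total += other
        if other == 200:
            other_200 += 1

print(other_200 / found_100)     # near 0.5, as in John's run
print(switch_total / found_100)  # near 125: the conditional "1.25X" value
```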


I _THINK_ what's going on is this: if we allow the initial probability distribution to be such that all positive values are equally probable, i.e., an unbounded flat distribution, then, near as I can tell, for any envelope, prior to peeking in it, the expected value is infinite. But that's greater than any possible expected value. When you actually peek inside one envelope, you see a finite value, and then that shifts around the probabilities for the second envelope to only two possibilities. I _think_, in these circumstances, maaaaaaaaybe, switching envelopes may actually be the correct thing and non-paradoxical. Maybe. Given a bounded, or otherwise well-behaved, prior distribution ("well-behaved" being defined here as "the expected value is not larger than every possible value you could get"), I suspect that the wacky paradoxy stuff goes bye-bye. Actually, a flat unbounded distribution is pretty weird anyway; it would be infinitesimal everywhere. It probably could only be defined in terms of limits anyway. (Sort of like an anti-Dirac-delta.)
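That suggestion can be tested: with a bounded prior, the conditional gain from switching depends on what you see, and the advantage disappears on average. A sketch, assuming (arbitrarily) that the smaller amount is uniform on {1, 2, 4, 8}:

```python
import random

random.seed(7)
gains_by_seen = {}  # observed amount -> list of gains from switching
for _ in range(400_000):
    small = random.choice([1, 2, 4, 8])
    pair = [small, 2 * small]
    random.shuffle(pair)
    seen, other = pair
    gains_by_seen.setdefault(seen, []).append(other - seen)

for seen in sorted(gains_by_seen):
    avg = sum(gains_by_seen[seen]) / len(gains_by_seen[seen])
    print(seen, round(avg, 2))  # switching looks good for middle values...

all_gains = [g for gains in gains_by_seen.values() for g in gains]
print(sum(all_gains) / len(all_gains))  # ...but averages out to about 0
```

Seeing the maximum possible value (16) makes switching a guaranteed loss, which exactly cancels the apparent advantage at the middle values; an unbounded flat prior has no such top value, which is where the paradox hides.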


Tomas, Can you calculate the expected value like this? Half the time the total value of the envelopes (T) is 3X and you hold X .... (1/3 of T) Half the time the total value of the envelopes (t) is 1.5X and you hold X .... (2/3 of t) So: Expected gain = 0.5 * (2/3 T - 1/3 T) = 1/6 of the total value, plus 0.5 * (1/3 t - 2/3 t) = -1/6 of the total value .. so your expected gain is 0 .... no, I didn't think so! The problem is, we are using 3 envelopes and there are only 2! Your expected gain should be 0.5 * (2X - X) = +0.5X ... when you had the smaller envelope, plus 0.5 * (X - 2X) = -0.5X .. when you had the larger! .. which equals 0.


To all: The problem is not to prove that it's 50/50; the problem is to [b]disprove[/b] the calcs that show it's not... By the way, I just thought of a deviously evil problem: 1: You are given two envelopes and told one is double the value of the other. 2: You are allowed to open one and see the value. 3: You are allowed to switch [b]and[/b] open that envelope to see if you got the bigger or smaller. 4: The first envelope is then replaced with another that you are told contains either double or half of the value in your current envelope. 5: Repeat from step 3. What is the best strategy in [b]this[/b] game? (I believe the same paradox applies, except that while the low end will only approach 0, the high end approaches infinity, so you are always better off switching... This [b]combines[/b] the St. Petersburg puzzle with the envelope problem... very very evil, yet without the problem of zeroing right away as with St. Petersburg, whereas this is gradual and changes things.) Interesting, since we know that if it truly IS 50/50, then a typical run (starting from 100) might go: W-L-W-L and thus 200-100-200-100, or L-W-W-L and thus 50-100-200-100, or W-L-L-W and thus 200-100-50-100. So it's obvious that it's 50/50, yet the chance of growing exponentially vs. only dropping by lower and lower amounts seems to make switching at all times a better strategy... yet it should be 50/50?
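Daegs's iterated variant can be sketched as well (assuming a genuine 50/50 at each step and a starting amount of 100; both are assumptions for illustration):

```python
import random

random.seed(11)

def always_switch(rounds=20, start=100.0):
    """Each round the fresh envelope holds double or half the current
    amount with equal probability, and we always take it."""
    amount = start
    for _ in range(rounds):
        amount *= random.choice([2.0, 0.5])
    return amount

results = sorted(always_switch() for _ in range(50_000))
mean = sum(results) / len(results)
median = results[len(results) // 2]
print(mean)    # large: the expectation grows like start * 1.25**rounds
print(median)  # but the typical (median) outcome stays near the start
```

This is the St. Petersburg flavour Daegs points at: the mean is dragged upward by rare exponential wins while the median player gains nothing.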


Daegs, on page 1 you already disproved the equation that is in the second post in the topic (an equation that TomasB pointed out was invalid in that post). There is another way to refute that equation that applies to all the other problems here. The faulty logic is excluding possibilities. This is done by using a choice to start in the middle of a problem. This is not valid in itself. One has to consider the probabilities related to that choice in the context of the entire problem. In other words, possibilities are not properly enumerated because of starting in a predetermined state. For the envelopes, once a choice is made, while there is a 50% chance of ending in one of two states, there are four sequences to get to any end state. The choice makes it easy to miss that there are four possibilities, not two or three. These are: 1. You choose envelope A (X), you do not switch, obtaining X. 2. You choose envelope A (X), you switch, obtaining 2X. 3. You choose envelope B (2X), you do not switch, obtaining 2X. 4. You choose envelope B (2X), you switch, obtaining X. Thus, the equation in the second post in the topic is incorrect because it posits two possibilities. There are four. TomasB wrote: [quote] I'll repeat John's numbers: "10127447 trials gave 10149 envelopes with 100 in, and in 5069 of these cases the other envelope contained the 200." That _actually_ means that if he never switched the times he got 100, his average would be 100. If he always switched his 100 his average would be 124.92 dollars. Anyone else care to simulate? ;) Since I know the definitions of expected value and conditional probability I can't see any flaw in the way I'm using them, yet I know there has to be a flaw. But what is it? [/quote] The presumption that if he never switched, he'd get 100 isn't correct. One would expect that half the time he would choose the envelope with 200 before not switching. (Note, the presumption is no knowledge of amounts retained from trial to trial.)
Or, it's that there are four possibilities, not two or three. If we presume a perfectly rational subject, they'll realize that the two choices they have are arbitrary. Thus each has an equal probability of 0.25 = 0.5 times 0.5.
1. 0.25 * X
2. 0.25 * 2X
3. 0.25 * 2X
4. 0.25 * X
Sum = (0.25 * 6)X = 1.5X. Thus the expected value is 1.5X. Also, the expected value is the same whether the subject switches envelopes or not. This makes sense, since it is the average of X and 2X. Daegs, your last problem could be solved by enumerating all the possibilities. Most (all?) probability paradoxes I've seen use various tricks to hide some possibilities.


[quote] On 2006-03-28 22:32, Bill Hallahan wrote: The presumption that if he never switched, he'd get 100 isn't correct.[/quote] If you open the envelope and find 100 and do _not_ switch, surely you can have nothing but 100? That's what the simulation was all about. You only studied the cases where the random envelope you picked actually had 100 in it, because you opened and checked it. John, expected value is the sum of all possible outcomes weighted by their respective probabilities of happening. That's the definition. To calculate it you first need to decide _what_ you want to calculate the expected value of. You can't just say "expected value"; you should say "the expected value of the other envelope", or maybe "the expected gain if you switch", or "the expected sum of both envelopes". Otherwise it's impossible to know what you are talking about and trying to calculate. Still, it's not about finding a way to show that it should not matter if you switch; you should find _what_ the flaw in the reasoning that tells you to switch is, _not_ find that it _is_ flawed, because we already know that. /Tomas


[quote]Daegs, your last problem could be solved by enumerating all the possibilities. [/quote] Actually you'll run yourself into a corner if you try that... It's basically the St. Petersburg problem, only more evil... (St. Petersburg basically states that in a game of flipping a coin, you lose on tails but your winnings are doubled each time you get heads... the math works out such that, since the expectation goes to infinity for all heads, you should bet [b]any[/b] amount of money on the game no matter how slim the chance is that you win, due to never being able to spend infinity when you can gain infinity.) It requires some math to prove otherwise. I'd wager that this hybrid is even more evil, since you are still going to infinity, but as you go lower you only lose fractions of cents and never reach 0... so it's even better to switch. So just on a 50/50 chance alone it would be in your best interest to switch, but due to the envelope problem it becomes even more so (as you can easily see that losing twice from $100 leaves only $25 but winning twice gives $400). As far as simulating... I'd need a better computer, as I get wildly varying results. Currently, with the average of 10k runs with 100 switches each: if you run the simulation with 1/2 the time getting .5 and 1/2 the time getting double, then your range of winnings jumps to around $95 - $3,355,549,005 (the highest average of 10,000 runs of 100 switches I saw). Most of the time it's a couple hundred thousand or a couple mil. If you change the number of switches to 10 (with the half/double structure) you get a couple of hundred on average (obviously this lowers the range). Well, according to the simulation, it's practically always a good choice to switch... in fact I'd pay a couple thousand just to play. :)
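For anyone who wants to reproduce Daegs's wildly varying averages, here is a minimal sketch of the half-or-double game (the starting value, run count, and fixed seed are assumptions for illustration, not his exact setup):

```python
import random

def play(start=100.0, switches=100):
    """Switch repeatedly; each switch halves or doubles with equal chance."""
    value = start
    for _ in range(switches):
        value = value * 2 if random.random() < 0.5 else value / 2
    return value

random.seed(1)  # fixed seed so the run is repeatable
runs = [play() for _ in range(10_000)]
mean = sum(runs) / len(runs)
median = sorted(runs)[len(runs) // 2]
# The arithmetic mean is dominated by a few astronomically lucky runs,
# while the median run ends up near the $100 it started with - which is
# exactly why the sample averages refuse to settle down.
```

The huge gap between the mean and the median is the St. Petersburg flavour Daegs describes: the expectation per switch is 1.25 times the current value, yet a typical run goes nowhere.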


I've found a way to point out the _exact_ flaw. It is reasonable to assume that the envelopes are filled as follows. In one envelope an amount Y is put which is uniformly distributed from 0 to Z. We do not know Z, but it is enough to know that such an upper boundary must exist. In the other envelope Y/2 is always put. Statements: 1. If I open my envelope and find X, there is either X/2 or 2X in the other envelope. (That is absolutely true.) 2. There is a 50/50 chance of each of these two cases. (Not true. THAT IS THE FLAW. The probabilities are dependent on X in relation to Z.) So what is the breakpoint for when the probability goes from 50/50 to 100/0? At X > Z/2 there is probability 1 of the other envelope having X/2 and probability 0 of it having 2X. At X <= Z/2 we have the 50/50 case. So what is P(X <= Z/2) / P(X > Z/2)? There is only ONE way to get to each X that is bigger than Z/2, while there are twice as many ways to get to an X that is less than or equal to Z/2, since you could get it in an envelope either by the selected Y being X or by the selected Y being 2X. Twice as many ways. That means that the relative probability of X <= Z/2 is 2/3 while the relative probability of X > Z/2 is only 1/3. Now we are ready to write the expected value of the other envelope when we find X in the envelope we selected: 2/3 * (0.5 * X/2 + 0.5 * 2X) + 1/3 * (1 * X/2 + 0 * 2X) = X. That expression clearly shows the flaw in assuming 0.5 probability of finding X/2 and 2X in the other envelope. Note that I have not specified the boundary Z to get this result - I have only said that it exists. *sigh of relief* /Tomas
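Tomas's bottom line - that switching gains nothing on average under his filling scheme - can be checked by Monte Carlo. A sketch under his stated assumptions (Y uniform on (0, Z), Y/2 in the other envelope; Z is set to 1 here since only its existence matters):

```python
import random

random.seed(0)
n = 200_000
held_total = other_total = 0.0
for _ in range(n):
    y = random.uniform(0.0, 1.0)   # Y uniform on (0, Z), with Z = 1
    envelopes = [y, y / 2]         # the other envelope always gets Y/2
    random.shuffle(envelopes)      # you pick one of the two at random
    held, other = envelopes
    held_total += held
    other_total += other
# Averaged over all games, E[held] equals E[other]: switching gains nothing,
# even though for any observed X the other envelope holds X/2 or 2X.
```

Both averages come out near 3/8 (the mean of Y and Y/2, halved between the two picks), and their difference vanishes - in line with the `= X` conclusion above, and against the naive 1.25X argument.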


Thomas, nicely put. That's precisely it, much clearer than my own explanation. Andrei 


A side effect is that _if_ you can estimate Z (the highest possible amount he is willing to put in an envelope) and you discover something less than half of that estimate in your envelope, your expected gain is positive if you switch to the other envelope. /Tomas


Yeah. The initial problem set no such limits, though. Andrei 


Ah, but can this tackle the problem of identically true statements and conflicting conclusions (thus there still is a paradox)? Referring of course to: [quote] On 2006-03-26 16:19, Daegs wrote: 1: Your envelope contains X. By switching you either lose X/2 or gain X, so the amount you gain is greater than the amount you lose. 2: The amounts in the envelopes are Y and 2Y; by switching you either lose Y or gain Y, so the amount you gain is the same as the amount you lose. [/quote] OTOH, it seems that there isn't a boundary in effect... if we can pick *any* number and know there is a possibility of 2X in that envelope, then that really means up to infinity. It seems that assuming there is a boundary can help things, but there is still the problem of the statements, or if you try to tackle it unbounded... Also, it seems you are saying it IS 50/50 as long as X <= Z/2... so why not restrict the values of X to within Z/2? Would that not force it back to 50/50 and thus the paradox? (Or would Z then become Z/2 as we are restricting choices and thus screw us up?) Anyway, it's way too late and I've been working on mind problems all day (unrelated to this), so I'll look at your solution in depth tomorrow... but still I'd like to see this applied to the statement part, because that has me really stumped. (Or explain why all my averages on the sim of constantly changing the new envelope don't even out but instead show a great increase.)


So the logic and maths shows us that it is more favourable to switch. The question is why, and what's wrong? What it boils down to is that we are confusing numbers, "things" and functions, and using the wrong scale to evaluate the potential gain or loss. We usually see numbers and things on a linear scale: -5..-4..-3..-2..-1..0..1..2..3..4..5..6..7..8..9..10..11... The numbers really have no significance! They are just there to identify the points on the scale; it is the spaces between the points that we are interested in. If we are at point 2 and we move (add) 4 spaces, we end up at point 6. The spaces between the numbers are equal and have a value of 1. To move right, we add 1; to move left, we subtract 1. But to solve this puzzle we should be looking at rises and falls on an exponential scale: 1/8..1/4..1/2..1..2..4..8..16..32..64... The spaces between the numbers are equal: multiply by 2 to move right and by 1/2 to move left. Looking at our exponential scale, getting double your original number is exactly the opposite of losing half, and the loss is equal to the gain. (Remember we are counting spaces to evaluate potential loss or gain.) Obviously, if you have X and double it you now have 2X; if you have 2X and halve it then you have X, so your gain by doubling is identical to your loss by halving. Your starting point on the scale must be the point that you are at at the time. (Conversely, if we have X and halve it, we have X/2; if we double X/2 we have X again, loss equal to gain.) So in our problem, if we switch and double, our potential loss is the same as switching and getting half. So there is no advantage in switching. "Ahh...", I hear you say, "that's rubbish! Even on your exponential scale, if you have 2 and double you have 4, if you halve you have 1; how can losing 1 be the same as gaining 2?" But that is looking at numbers linearly. On the exponential scale the difference between 2 and 1 is the same as the difference between 2 and 4.
We should be looking at the spacing of the numbers, not the numbers themselves. (The numbers are there only to identify the points. Perhaps I should have used letters?) So why doesn't it work with numbers on a linear scale? Because it's not a linear question, it's an exponential one. We know the value of X (our starting point), so why isn't the difference between losing half and gaining 1 the same? (It is, in our exponential scale spaces.) We (I) can't convert between points on our exponential scale and points on our usual linear scale. They are two different animals! Here is a choice that we think we can evaluate on our linear scale: You have X, you can switch for X+1 or X-1; is it favourable to switch? Obviously it doesn't matter. Using our linear scale, potential loss and gain are equal. But having decided not to bother, I now tell you that the things are cubes of gold with a side length of X... So how is it that you can now "prove" mathematically (erroneously), using your linear scale, that it is more favourable to switch? What changed? If the X's are cubes, you should be using an N^3 scale. So, the "solution" to the paradox is that we can prove that it is more favourable to switch only by using an incorrect basis for the maths. I anxiously await your scorn!


If it was unclear, although I implied it twice: lim[Z -> infinity] (P(X <= Z/2) / P(X > Z/2)) = 2, since P(X <= Z/2) / P(X > Z/2) = 2 for ALL Z > 0. The error in "1: Your envelope contains X. By switching you either lose X/2 or gain X, so the amount you gain is greater than the amount you lose." is in the wording "so the". It just doesn't follow from "By switching you either lose X/2 or gain X". You have to weigh both those cases by their probabilities of happening. If finding X/2 has a probability of 0.9 and finding 2X has a probability of 0.1, the expected value of that envelope would be 0.9 * X/2 + 0.1 * 2X = 0.65X, so that clearly shows that to answer whether you gain on average by switching you _need_ to state the actual probabilities for ALL outcomes. /Tomas


.. but Tomas, the probability of switching for more is exactly equal to the probability that you hold the lesser envelope... which in turn is exactly equal to the probability that you hold the greater. To get anywhere on this tack you need to prove that your original selection was not a 50/50 deal.... and I don't think you can. 


Not sure what you mean, John. The original selection is not a 50/50 deal. There are so many possible amounts that can be in that envelope before it is checked. But if you mean that there is a 50/50 deal of picking either envelope, that of course has to be assumed and that's not a bad assumption. The choice _is_ random and you do not favour picking any envelope over the other. /Tomas 


The probability of picking either envelope A or B is 0.5 each. You have a 50/50 chance of holding the greater or lesser amount. Therefore the probability that the other envelope contains more is 0.5 and the probability that it contains less is 0.5 .. and you can't alter that! (Unless you can prove a tendency to pick A on more or fewer occasions than B.)


The probabilities you speak of, John, are _not_ equal. I showed in the earlier post that they are 2/3 for X/2 and 1/3 for 2X being in the other envelope, and that was because the conditional probabilities were dependent on X. Over all possible games there is _not_ an equal probability of every possible value in the envelope. That's what does it. /Tomas


Just been back to read that - won't pretend to understand it.. "It is reasonable to assume that the envelopes are filled as follows. In one envelope an amount Y is put which is uniformly distributed from 0 to Z. We do not know Z, but it is enough to know that such an upper boundary must exist. In the other envelope Y/2 is always put." However, why is it reasonable to assume that the envelopes are filled like that...? It is just as reasonable to assume that Y is put in one envelope and 2Y is put in the other (with Y being at most Z/3). Which should disprove your argument. (I can't disprove it because I don't understand it!)


.... and what happens to your argument if we limit Z to say $300.... ... or stipulate that one envelope always contains $50....? 


Say envelope A contains the greater amount, B the lesser. If we accept that the total value of the envelopes is finite (and we do, neither can contain infinity), does that affect our chances of picking A over B? No. The contents of the envelopes are irrelevant when we come to pick them - we don't know what they are. Say the maximum total contents of the envelopes is $300 and, for simplicity, that the contents in $ are always an integer. The chances of picking an envelope with $100 in it are double the chances of picking one with $200 or $1. But in one case the $100 is in envelope A, and in the other, in envelope B. And that is what we are trying to establish - the probability of picking A over B. For every possible A envelope, there is a corresponding B envelope (containing half of A). So for all the possible values of A, there is an identical number of possible values for B. Thus the chances of picking one or the other are 50/50. .......................................................... Which, going back to my post about exponential scales, raises an interesting question: if A=2B and B=A/2, both are integers and A+B <= 300, how many possible values of A and B are there? B must be in the range 1 to 100 = 100 values. A must be in the range 2 to 200 = ? If you actually calculate the values of A, given that they run from 2 to 200, you have to use an exponential scale of x2 - use a linear scale and you would deduce that there could be 200-2=198 possible values for A. There aren't; there are only 100. Values for A increase by 2 every time, not 1.
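John's count at the end can be verified by brute-force enumeration (integer amounts, combined total at most $300, matching his setup):

```python
# Enumerate every integer envelope pair (A, B) with A = 2B and A + B <= 300.
pairs = [(2 * b, b) for b in range(1, 301) if 3 * b <= 300]
# B runs 1..100, and A takes only the even values 2, 4, ..., 200:
# 100 pairs, not the 199 a naive linear count of the range 2..200 would give.
print(len(pairs))  # 100
```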


John, the explanation that Tomas (and Andrei) has given is a bit of a tricky one to get one's head round, but I am confident that they are right. I have not fully "clocked" it yet; I need to sit down and think about it. Aside from the maths involved, I think that ultimately this is the sort of thing that some sort of diagram will clarify. If we can create a diagram and some really simple language to describe it, then we'll have a better chance of grasping it (and - which is the proof of the pudding - explaining it to someone else). I'll come back when I've looked at it some more!


I'll rephrase what you wrote a bit, then you can put in 300 instead of Z if you want. Say the maximum total contents of the envelopes is Z. If one envelope has X > 2/3*Z, the probability of the other having X/2 is 1 and the probability of it having 2X is 0. Otherwise the total contents would be above Z. If on the other hand either envelope has X <= 2/3*Z, there is a 50/50 chance of there being X/2 and 2X in the other. So what are the relative probabilities of X being above and below 2/3*Z? Well, that clearly divides it into 2/3 and 1/3. You will by all rights argue that the X I speak of is in one envelope and not necessarily in the envelope I pick. So let's say it's picked only half of the time. There will still be twice as many below 2/3*Z as there are above, since you cut both in half with that, so the relative probabilities are still 2/3 and 1/3. Expected value of the other envelope with your reasoning of filling the envelopes: 2/3 * (0.5 * X/2 + 0.5 * 2X) + 1/3 * (1 * X/2 + 0 * 2X) = X /Tomas


Tomas - if Z is the maximum total of both envelopes, surely it is impossible for X to ever be *greater* than 2/3 of Z. Equal to or less than, yes, but not greater. ?? Also, what happens if we say that the maximum total of both envelopes is infinity? i.e. each envelope contains a number written on a piece of paper, one of which is twice the other? Is that, then, a significantly different puzzle to the one we are considering (maybe it is)? ??


Oops, I knew I should have stopped writing in this thread, Steve. ;) Before I can answer, I need John to clarify how he'd write the simulation that makes sure that the total contents are below 300. It makes all the difference. As for what happens when Z approaches infinity: I showed that, with the reasoning in my earlier post, the relative probabilities were constant regardless of Z, hence you can let Z slide up to infinity and the ratio between those probabilities will still be 2. That's the only way I can see how there is not a 50/50 chance of there being X/2 or 2X in the other envelope on average. /Tomas


Also, we're using a simplistic arithmetic average, when the distribution is anything but simplistic. How about using a geometric average? That would certainly work, but is there any logic behind it? Andrei 


The simulation will be obliged to constrain both numbers so that neither is ever more than 2/3 of Z. I see what you mean about extending Z up to infinity using the same reasoning. 


I still think the error is using the wrong numeric "scale" to evaluate the potential gain or loss by switching. (Scale probably isn't the right word - there is probably a mathematical expression for it.) The linear scale you are using to argue that if you switch then you will either gain X or lose X/2 is the wrong scale. This scale only applies when values move up with addition and move down with subtraction, i.e. +Y or -Y. Unfortunately, the contents of our envelopes don't change on this scale. Their value increases by multiplication and decreases by division by a factor of 2. If you use the wrong (linear) scale to assess whether switching will improve or reduce your current position, you will get the wrong answer. You are using the "plus and minus" scale and wrongly concluding that gaining 1 is better than losing ½ because 1 is more than ½. Which is wrong. If you start with 1, you switch and are lucky, so you gain 1, you have 2; switch again and lose half, you have 1. So once you showed a gain of 1, and once you showed a loss of ½, yet you have the same as you started with! So, illogical as it seems, the value of gaining 1 is identical to the loss of ½. The scale we should be thinking about is "multiplication and division by 2", and the question you should be asking is "what is the probability of doubling X by switching compared to the chances of halving it?" The answer: they are equal, so the expected net movement is 0.


It is probably more correct for me to ask, "If the value of X changes according to a function and you switch what are the chances of applying the function to X compared to the chances of applying the inverse function to X." 


John, I understand what you are saying, but the bottom line in all this is how much of an INCREASE (plus) or DECREASE (minus) you make with your money. When you double your money, you are simply adding its value to what you had. When you halve your money, you are subtracting half its value from what you had. So it is ultimately a matter of making (adding) and losing (subtracting) money. I don't think this lies at the heart of the problem. If Tomas can answer my question about the 2/3 of Z, he will satisfy me that his solution is correct! 


Steve, almost right, but assuming that you do switch, the final outcome was predetermined when you selected the first envelope (X). If you selected the larger amount you lose X/2; if you selected the smaller you gain X - but unfortunately (mathematically speaking) losing X/2 is equal to gaining X - there is no point switching.


John - X/2 does not equal X :) Tomas - if Z is the maximum total of both envelopes, surely it is impossible for X to ever be *greater* than 2/3 of Z. Equal or less than, yes, but not greater. What ya say to that, ya little rascal?!


Hehehe, I already told you that was an error. This is how it should sound. I just need a good way to make the random variables Y and Y/2 so their sum is always less than Z. That means that Y+Y/2 <= Z, which means that 3/2*Y <= Z, which means that Y <= 2/3*Z. Let's just form W = 2/3*Z and follow the reasoning in my post: choose a Y randomly that is less than W and put it in one envelope. Put Y/2 in the other. The sum is now automatically random but less than Z. We are back in the same case I described, where there are twice as many cases where we find X <= W/2 as cases where we find it above W/2. Ok? ;) /Tomas


[quote] On 2006-03-29 10:37, Steve Martin wrote: John - X/2 does not equal X :) [/quote] Steve, in this case, if you have X and switch for X/2, your loss is equal to your gain had you had X and switched for 2X. -1/2 = +1


John, please don't say that, because I'd gladly pay a dollar to flip a coin where you pay me 2 dollars if it shows heads and 50 cents if it shows tails. /Tomas


Ah, yep - sorry, Tomas, I hadn't appreciated from your earlier reply that it was an error... I see it now. We set a combined maximum total of Z. For every value we choose (for the first envelope) that is between 1/3 of Z and 2/3 of Z, there is a second value (for the other envelope) that is between 0 and 1/3 of Z. AND, for every value we choose (for the first envelope) between 0 and 1/3 of Z, there is a second value (for the other envelope) that is between 0 and 1/3 of Z. THEREFORE, we generate twice as many possible values between 0 and 1/3 of Z as we do between 1/3 of Z and 2/3 of Z. And that is the key which gives rise to the revised expectation equation: 2/3 * (0.5 * X/2 + 0.5 * 2X) + 1/3 * (1 * X/2 + 0 * 2X) = X. You little beauty! Full credits to Tomas and Andrei for being utterly brilliant :) - I now understand this!


[quote] On 20060329 10:55, TomasB wrote: John, please don't say that, because I'd gladly pay a dollar to flip a coin and you pay me 2 dollars if it shows heads and you pay me 50 cents if it shows tails. /Tomas [/quote] Not the same situation.. I'll play if you give me half what you have when I win and I double what you have if I lose.  My expectation of losing is zero! .. as losing double is the inverse of winning half! They are the same! 


[quote] On 2006-03-29 11:20, magicjohn2278 wrote: [quote] On 2006-03-29 10:55, TomasB wrote: John, please don't say that, because I'd gladly pay a dollar to flip a coin where you pay me 2 dollars if it shows heads and 50 cents if it shows tails. /Tomas [/quote] Not the same situation.. I'll play if you give me half what you have when I win and I double what you have if I lose. My expectation of losing is zero! .. as losing double is the inverse of winning half! They are the same! [/quote] Ok, I hold one of my own dollars. When the coin shows tails I pay you 50 cents, and when the coin shows heads you pay me what? Please explain the rules. /Tomas


Right, you start off with say $1000. If I win, you give me half what you have; if I lose, I will double whatever you have. If on the first go I lose, I give you $1000, you now have $2000; on the next turn, if I lose, I give you $2000.... But if on the first go I win, you give me just $500, and if I win again, you give me just $250. I can't offer better than that! You are a sure winner!


I see. Funny game, just a random walk where I can never lose more than 1000 dollars but can gain an infinite amount. I can in other words decide that I want to stop playing when I'm above 1,000,000 dollars. Since it's a random walk, that will eventually happen, and the game does not have a stop because my money can only be halved but never gone completely. Not sure why you posted it here though. In your game you can never find 687 dollars at any stage, for example, since you just move in discrete steps that are very uneven in size... which is very different from the envelope game. /Tomas


No it isn't it is the same. You can start with any amount you wish 687 dollars or whatever... the rules won't change. So what is your expected gain over say 100 games? (or to make it easier, 10.) 


[quote] On 20060329 12:31, TomasB wrote: Not sure why you posted it here though. ........for example, since you just move in discrete steps that are very uneven in size...which is very different from the envelope game. /Tomas [/quote] The envelopes vary in identical steps, <...1/2...1...2...> 


Expected gain or no expected gain, he'll just play and play and play until his earnings exceed an amount favourable to him, assuming, of course, that he can do this within his life span. It takes time to flip coins, after all. Andrei 


The point I am trying to make is that if you have envelope X and switch, you will either double or halve what you have, with equal probability. If X = 10, it seems worth the risk of losing 5 against the possibility of gaining 10. But it only looks that way because we imagine that the other envelope might contain either 5 or 20. It is the wrong way of looking at it. It contains either half or double what we hold. We are imagining there is a potential range of envelopes X/2...X...2X (which there isn't). We say that doubling X is better than losing half of X because we convert X/2 and 2X to numbers that we are used to dealing with. We think we know that halving what you have has a smaller negative effect on our expected gains than the positive effect of doubling what you have. It doesn't - the effects are equal and opposite. If I can show this, then our original thinking is proved wrong. It's too simple to use our two-envelope model for this... you would just say "well, what did you expect!" when I show you the answer, so I have to make the game a little more complicated! But it IS the same game. I give you an envelope containing X; if you wish, I will switch it for another which contains either half or double what you hold... then I offer you the same deal, over and over again. "Winning" is switching for the envelope containing double the value of your current envelope. "Losing" is switching for the envelope containing half the value of your current envelope. We are going to win two games, then lose two games. Obviously, as we know that when we lose we only lose 1/2 of what we have, and when we win we double what we have, we will show a profit at the end. Your starting envelope contains say $40; you switch and win an envelope with $80 in it. You switch and win an envelope containing $160. Two wins. From this position, you are going to lose 2 games. Switch and lose: you now have $80. Switch again, lose, and your envelope contains $40.
We won two games, we lost two games, and we have exactly the same as we started with. What went wrong?
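John's four-round example works out the same for any starting amount and any ordering of the wins and losses, because multiplication commutes; a one-line check:

```python
# Two doublings and two halvings, in any order, cancel exactly.
value = 40.0
for factor in (2, 2, 0.5, 0.5):   # W, W, L, L - reorder however you like
    value *= factor
print(value)  # 40.0: back where we started despite "winning more than losing"
```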


Here is a summary of the solution in words (credit to Tomas and Andrei for getting us to this point!). If you take every possible pair of envelopes, every value that is equal to or less than one-third of the maximum combined total of both envelopes will appear twice as often as every value that is greater than one-third of that maximum. Therefore, when you choose one of the envelopes, it is twice as likely that you hold a value which can go either up or down on a swap as it is that you hold a value which can only go down. Hence, when you swap your envelope containing X, you expect to get: 2/3 * (0.5 * X/2 + 0.5 * 2X) + 1/3 * (1 * X/2 + 0 * 2X) = X, and therefore there is no advantage in swapping.
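The summary's expectation formula can be confirmed numerically for any X (X = 100 here is an arbitrary choice; the 2/3 and 1/3 weights are the thread's claimed split, taken as given):

```python
def expected_other(x):
    """The thread's weighted expectation for the unopened envelope."""
    # 2/3 of the time the 50/50 case applies; 1/3 of the time only X/2 is possible
    return (2 / 3) * (0.5 * x / 2 + 0.5 * 2 * x) + (1 / 3) * (1 * x / 2 + 0 * 2 * x)

print(expected_other(100))  # equals 100 (up to float rounding): no gain from swapping
```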


I suggest this is a good point on which to end this topic. Spinoff topics, please start a new thread. Thanks to everyone who contributed... it was a good one. 


For every possible value of the greater envelope, there is a corresponding "lesser" envelope (and vice versa); the chances of picking either are 50/50. Give me any value of the greater envelope and I will tell you the contents of its corresponding pair. (But not infinity!)


But this time it actually was better to switch. Oh well I guess half a million dollars is pretty good. But I tried to give you a chance. I thought for sure you'd get the hint when I offered you the chance to switch. Oh well.... (under breath) loser. 


Magicjohn, to say that the chances are 50/50, is to miss the point of what we've been discussing. You are twice as likely to pick values in the middle range of given envelopes, than you are to pick values in the extremes. Thus, if you switch a value which is not in the middle range, chances are you will get something in the middle range, rather than something even more extreme. That is the whole idea of the paradox. You THINK it's 50/50, but in reality, it isn't. Andrei 


I know that it is coming up to April 1st, but surely no one believes this! Just because you are more likely to pick a low value for X doesn't mean that when you switch you are less likely to go down! If Z = 300: values 1 to 100 can appear twice (in either envelope) - switch and go up or down. Values 100 to 200 only appear once (in either envelope) - switch and go down. Consider each envelope:
Values for Envelope A: 100 to 200, 1 to 100, 1 to 100
Values for Envelope B: 1 to 100, 1 to 100, 100 to 200
Match up the values with the corresponding envelope:
Value Envelope A ........ Value Envelope B
100 to 200 .............. 1 to 100 (A is high, B is low)
1 to 100 ................ 1 to 100 (A and B could be high or low)
1 to 100 ................ 100 to 200 (A is low, B is high)
So it is more likely that when we pick a value for X it will fall in the range 1 to 100, but nevertheless, when we switch, the probability of switching for more or less is 0.5 each.


"1 to 100 ………… 100 to 200 (A is low, B is high)" How do you figure that if one envelope has 5 (it's in the 1 to 100 range) the other envelope can hold between 100 and 200? It seems like that can never happen, yet you have it as a case there. /Tomas 


John, please reread Tomas' post above with the middle third bit and the probability dispersion of the envelopes. Andrei 


What I am saying is that when the value for B is in the middle third, then the value for A has to fall into the low third. I.e., if there are 100 instances where B is in the middle third, there are 100 instances where the value of A falls in the low third. And in every case, if you pick A and switch, you will switch for more. (OK, I've slipped up here: I should be referring to the cases where B is in the top third; then A must fall in the bottom third (of the finite total). But the maths will still work out the same!) If you want me to clarify then I will. 


Clarification (and corrections!)

Let's label the "thirds" of Z as Z1, Z2 and Z3 (Z1 is the bottom third, Z3 the top third). The argument is:
- If you pick X from a value in Z3, switch and you can only go down. True.
- If you pick X from Z1 or Z2, then you can go either up or down. True.
- As there are more values that fall into Z1 or Z2 than in Z3, you are more likely to pick one of these. True.
- You are twice as likely to pick a value for X that can go up or down than one that will go down. Not quite true!

There are two envelopes:
- Values from Z3 can appear once, in either envelope. Picking these values and switching will result in a loss.
- There are an equal number of corresponding values from Z1 that can appear once, in either envelope. Picking these values and switching always results in a gain (the corresponding value from Z3).
- There are an equal number of values from Z1 and Z2 that can appear in either envelope. Picking these values will result in a gain 50% of the time and a loss 50% of the time.

There is a 1/3 chance that we will pick a Z3 value, switch, and always lose. There is a 2/3 chance that we will pick a Z1 or Z2 value, switch, and can win or lose. But when we switch our Z1 or Z2 value, we will only lose 1/4 of the time:
- 1/3 of the time we pick Z3, switch and lose.
- 1/3 of the time we pick a Z1 value that results in a switch and gain.
- 1/3 * 1/2 of the time we pick Z1 or Z2, switch and gain.
- 1/3 * 1/2 of the time we pick Z1 or Z2, switch and lose.

Expectation of switching: -(1/3 * 1) + (1/3 * 1) + (1/3 * 1/2) - (1/3 * 1/2) = 0 
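The case arithmetic above can also be verified by brute-force enumeration instead of thirds bookkeeping. A sketch, again assuming the hypothetical Z = 300 setup where the smaller amount takes each value 1 to 100 with equal probability:

```python
from fractions import Fraction

# Enumerate every equally likely (held, other) pair: the smaller amount is
# any of 1..100, and each envelope is held with probability 1/2.
gain = Fraction(0)
cases = 0
for x in range(1, 101):                      # smaller amount
    for held, other in [(x, 2 * x), (2 * x, x)]:
        gain += other - held                 # gain from switching in this case
        cases += 1

print(gain / cases)  # 0: the expected gain from always switching is exactly zero
```

Each smaller amount x contributes one case gaining x and one case losing x, so the total cancels: the same zero expectation the case-by-case argument arrives at.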


[quote] On 20060330 06:20, magicjohn2278 wrote: Clarification (and corrections!) Let's label the "thirds" of Z as Z1, Z2 and Z3 (Z1 is the bottom third, Z3 the top third). The argument is that if you pick X from a value in Z3, switch and you can only go down. True. [/quote] Sure, that's true, but didn't you miss some cases? Can you, for example, select the biggest value in Z2 and still double it without getting out of range? Your reasoning is all over the place, so I can't make heads or tails of it.

The easiest way to think about it is not where you can go from each value, but how you can _reach_ each value. Focus on a value and reason about how many ways that value can be placed in an envelope. Either you selected double that value to be placed in an envelope and this is half of that selected value, or you decided directly to place this value in an envelope (and placed half of it in the other envelope). If the value you are focusing on is too big, you can only select to place it in an envelope in one way: by selecting it immediately. There is no double value that can be selected that would generate this value in an envelope. /Tomas 
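Tomas's "count the ways you can reach a value" idea can be made concrete. A sketch under the same hypothetical prior (smaller amount uniform on 1 to 100), computing for an observed amount v the probability that the other envelope holds more:

```python
from fractions import Fraction

def prob_other_is_larger(v, max_small=100):
    # Ways to "reach" v in the held envelope:
    #  - v was chosen as the smaller amount (possible iff v <= max_small),
    #    in which case the other envelope holds 2v;
    #  - v is double a chosen smaller amount (possible iff v is even and
    #    v // 2 <= max_small), in which case the other envelope holds v // 2.
    as_small = 1 if v <= max_small else 0
    as_large = 1 if v % 2 == 0 and v // 2 <= max_small else 0
    total = as_small + as_large
    if total == 0:
        raise ValueError("v cannot occur under this prior")
    return Fraction(as_small, total)

print(prob_other_is_larger(5))    # 1: 5 is odd, so it can only be the smaller amount
print(prob_other_is_larger(60))   # 1/2: 60 could be the smaller (pair 60, 120) or the larger (pair 30, 60)
print(prob_other_is_larger(150))  # 0: 150 is too big to be the smaller amount
```

This is exactly Tomas's point: for amounts too big to be "reached" as a doubled value's half, switching is a guaranteed loss, so the naive "2X or X/2 with equal probability" step cannot hold for every v.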


There is only one value in Z2 that you can double without going out of range, and that is Z/3. Perhaps the range for Z2 should start at (Z/3)+0.000001.. I don't understand your reasoning in this matter either! ... However, I think we are all getting a little bored with this and it's probably time to move on to something else! :) 