 The Magic Cafe Forum Index » » Puzzle me this... » » Two envelopes (0 Likes)

Steve Martin Inner circle 1119 Posts Posted: Mar 28, 2006 10:55 am

Andrei - it seems to me to be the perfect answer. Thanks!

Any man who reads too much and uses his own brain too little falls into lazy habits of thinking. Albert Einstein

TomasB Inner circle Sweden 1143 Posts Posted: Mar 28, 2006 11:56 am

That's not really explaining where the error in the paradox is, since John's suggestion is valid. Let's say you open the envelope and find 100 dollars. You do _not_ find 200 dollars and you do _not_ find 50 dollars. Let's focus on only the times you play and open the envelope and find 100 dollars. Will there be a 50/50 shot of there being 50 dollars or 200 dollars in the other envelope? John's simulations seemed to show that it actually is, and that has the strange result that the expected value of the other envelope is 125 dollars. Again, I _know_ how to show that it should not matter whether you switch, but I can't put my finger on the error in the above reasoning.

Let's apply this to the scenario Andrei wrote of. We focus ONLY on the times X=100. In 25 of the cases we switch and find 50, according to him. In the other 25 cases we switch and find 200. The average of all those 50 trials where we actually find 100 will be (25*50 + 25*200)/50 = 125.

/Tomas

Andrei Veteran user Romania 353 Posts Posted: Mar 28, 2006 12:05 pm

Like I said, if you're picking the times where you find 100 dollars in it, then you can't go ahead and stat it out over extended periods of time. The mistake is that you focus ONLY on the times X=100, because for those times, sure, you're better off switching. But, if the picks are random, you have to take into account what happens when you ALSO pick the other alternatives. Otherwise, like I said, the choice is not 'random' any more.

Andrei

Daegs Inner circle USA 4283 Posts Posted: Mar 28, 2006 12:34 pm

But Andrei, what if we don't *know* that 50, 100 and 200 are the values?
That is a key element to the problem: we don't know what is in the envelopes except that they are double/half each other. So if we open up the envelope with 50, it could be 25 or 100, just as if we open up 200, we could have 100 or 400.... The key element to the problem is that you don't know what's in the other envelope.

Andrei Veteran user Romania 353 Posts Posted: Mar 28, 2006 12:41 pm

If the values are all different, then the math becomes very complex but blurts out the same result at the end. Remember, 'we' are gods in our little experiment, and while we know what the values are, THE PLAYER DOESN'T. Again, we know because we were keeping track, and we are omniscient, but the player didn't, and he exhibited perfectly rational behavior, i.e. either always switch or always stay.

Andrei

magicjohn2278 Special user Isle of Man UK 538 Posts Posted: Mar 28, 2006 12:45 pm

Just musing... the opposite of doubling something is halving it. ... the opposite of halving something is to double it.... so why aren't 2*X and X/2 equal and opposite? .. beats me!

TomasB Inner circle Sweden 1143 Posts Posted: Mar 28, 2006 01:19 pm

Quote: On 2006-03-28 13:05, Andrei wrote: ..the times X=100, because for those times, sure, you're better off switching.

Ok, this is interesting. I just got you to agree that the times we find 100 dollars in the envelope we are better off switching. Read my post again that made you say that, and read 44 dollars instead of 100. All of a sudden you will agree that the times we find 44 dollars in the envelope we should switch. Read that post again but imagine that I wrote 230 dollars instead of 100 dollars. You will now agree that we gain by switching the times we find 230 dollars in the envelope.

I find it really hard to point out the error in the reasoning once you have agreed that we gain by switching the times we find 100 dollars. Once you agree to that, you are trapped in the reasoning that tells you to switch whatever you find in the envelope.
It's strange indeed.

/Tomas

Jonathan Townsend Eternal Order Ossining, NY 27096 Posts Posted: Mar 28, 2006 01:47 pm

Aha! The monkey holds the coconut! The monkey is trapped trying to compute gains and losses based upon an unknown and random variable. If this monkey lets go, he has nothing. Starting the story again, he takes an envelope and has on average (1.5)X, meaning half the trials X and half the trials 2X. But wait, what about exchanging envelopes? Well, the monkey puts down the envelope, so again has nothing. Then gets either X or 2X again. This monkey is satisfied and is going for a banana. :)

...to all the coins I've dropped here

Andrei Veteran user Romania 353 Posts Posted: Mar 28, 2006 01:53 pm

Tomas,

Again, you're better off switching if you hold 100 and you KNOW that the other envelope has 200, which never occurs in the real scenario because you never actually know what's in the other envelope. That was the error I was trying to point out. So, in other words, you must generate a plausible scenario where you distribute the picks properly. If you don't, and you say, "let's analyze only the X=100 scenario", you're making the mistake I'm about to (intentionally) make with the following analogy:

I want to prove that when you flip a coin and catch it, you can ALWAYS get it to fall tails if you flip it round once more (just one full rotation). To analyze if this is true or not, I will only look at the cases where it falls heads. Obviously, by turning it round, it will fall tails. Thus, my hypothesis has been proven. Obviously, it hasn't, but unfortunately, in the 2 envelopes mystery, it is not as "obvious" that the very same mistake is being made. Whenever you choose to fix one value, you are annulling the 'randomness'.
If you say "then assume it's 44, then assume it's 233", well then that's okay, but you're just giving out a predetermined set of fixed values, upon which you CANNOT calculate the (X/2+2X)/2 average, because there exists no average to speak of, as you've selected no variables which have random probabilities. You're just giving out a set of constants.

Andrei

TomasB Inner circle Sweden 1143 Posts Posted: Mar 28, 2006 02:38 pm

Andrei, it is ok to speak of expected values and conditional probabilities in probability theory. Do you agree that if you find 100 in the envelope you will have 50 in the other envelope half of the time and 200 the other half of the time? You did state that yourself, as that happened 25 times each in 50 trials.

/Tomas

Andrei Veteran user Romania 353 Posts Posted: Mar 28, 2006 02:44 pm

TomasB - I hope I can be clearer now. Let's assume you play 100 rounds of this game, and you find 100 in your envelope every time. Yes, half the time the other envelope will be 50, half the time the other envelope will be 200. However, and here is the absolutely crucial point, which explains the whole apparent paradox (please try to grasp what I'm saying, and I know it will take some effort because I feel myself limited when it comes to clarifying my point, and I know it doesn't come through perfectly): If you play 100 rounds of this game with those types of envelopes, you will NOT always draw the one with 100 in it. That is what explains everything. You are generalizing an ungeneralizable situation.

Andrei

Steve Martin Inner circle 1119 Posts Posted: Mar 28, 2006 03:03 pm

Andrei - I think I understand what you are saying. All along we have been saying: if you find X in the envelope, then the other envelope contains either 2X or X/2. Well, that is true... but the fact is it can't be both - it has to be one or the other, depending on the actual state of affairs.
VIEW A

If the envelope you hold contains $100, then there are two possibilities:

1. $200 and $100
2. $100 and $50

The "X" you are talking about having found is, in fact, in case 1: $100 (when it could equally have been $200), and in case 2: $100 (when it could equally have been $50).

VIEW B

Now... supposing we hold an envelope with $100 in it. By the same logic as above, there COULD have been two possible sets of envelopes as follows:

1. $100 and $50
2. $50 and $25

... and so if that were the case, we could NOT say that it was equally likely that the other envelope contains $50 as $200, since $200 actually plays no part in the scenario. All we can say is that on this occasion, we happened to choose the envelope that contained the larger amount. If we had chosen one with $50 in it (i.e. the "middle" value of the three values), we would have been in an equivalent situation to the one described in VIEW A.

What the above means is that, having chosen an envelope, you cannot automatically include in your subsequent deliberations a pair of envelopes that does NOT, in fact, apply (which is what we HAVE been doing in that equation for the expected gain).

TomasB Inner circle Sweden 1143 Posts Posted: Mar 28, 2006 03:36 pm

Quote: On 2006-03-28 15:44, Andrei wrote: TomasB - I hope I can be clearer now. Let's assume you play 100 rounds of this game, and you find 100 in your envelope every time. Yes, half the time the other envelope will be 50, half the time the other envelope will be 200.
However, and here is the absolutely crucial point, which explains the whole apparent paradox (please try to grasp what I'm saying, and I know it will take some effort because I feel myself limited when it comes to clarifying my point, and I know it doesn't come through perfectly): If you play 100 rounds of this game with those types of envelopes, you will NOT always draw the one with 100 in it. That is what explains everything. You are generalizing an ungeneralizable situation. Andrei

So let's say that you play the game N times, where N is a huge number, after which you have found 100 dollars in your envelope exactly 100 times. Will you have found 200 dollars in the other envelope 50 of those times? John's simulations showed that that is the case. So in probabilistic language: the conditional probability, given that 100 dollars is found, of the other envelope having 200 dollars is 0.5, and the conditional probability of the other envelope having 50 dollars is 0.5.

Please write your own program to verify whether that is the case. If that turns out to be true, then I think you can change the parameters of the program to only count the times you find 50 dollars in the envelope. You will probably find that 50% of those times you find 25 dollars in the other envelope, and the other 50% of the times you find 100 dollars in the other envelope.

I'll repeat John's numbers: "10127447 trials gave 10149 envelopes with 100 in, and in 5069 of these cases the other envelope contained the 200." That _actually_ means that if he never switched the times he got 100, his average would be 100. If he always switched when he had 100, his average would be 124.92 dollars. Anyone else care to simulate?

Since I know the definitions of expected value and conditional probability I can't see any flaw in the way I'm using them, yet I know there has to be a flaw. But what is it?
Wonders,

/Tomas

Psy-Kosh Regular user Michigan 135 Posts Posted: Mar 28, 2006 04:35 pm

I _THINK_ what's going on is this: if we allow the initial probability distribution to be such that all positive values are equally probable, i.e., an unbounded flat distribution, then, near as I can tell, for any envelope, prior to peeking in it, the expected value is infinite. But that's greater than any possible value you could actually find. When you actually peek inside one envelope, you see a finite value, and that shifts the probabilities for the second envelope to only two possibilities. I _think_, in these circumstances, maaaaaaaaybe, switching envelopes may actually be the correct thing and nonparadoxical. Maybe.

Given a bounded, or otherwise well-behaved, prior distribution ("well behaved" being defined here as "the expected value is not larger than every possible value you could get"), I suspect that the wacky paradoxy stuff goes bye-bye. Actually, a flat unbounded distribution is pretty weird anyway: it would be infinitesimal everywhere. It probably could only be defined in terms of limits. (Sort of like an anti-Dirac-delta.)

magicjohn2278 Special user Isle of Man UK 538 Posts Posted: Mar 28, 2006 05:03 pm

Tomas,

Can you calculate the expected value like this?

Half the time the total value of the envelopes (T) is 3X and you hold X .... (1/3 T)
Half the time the total value of the envelopes (t) is 1.5X and you hold X .... (2/3 t)

So:

Expected gain = 0.5 * (2/3 T - 1/3 T) = 1/6 of the total value
plus 0.5 * (1/3 t - 2/3 t) = -1/6 of the total value

.. so your expected gain is 0.

.... no, I didn't think so! The problem is, we are using 3 envelopes and there are only 2!

Your expected gain should be

0.5 * (2X - X) = 0.5X ... when you had the smaller envelope
plus 0.5 * (X - 2X) = -0.5X ... when you had the larger!

.. which equals 0.
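[Editor's note: TomasB's request a few posts up ("Anyone else care to simulate?") can be met with a short sketch in Python. The uniform 1..200 range for the smaller amount is an assumption, chosen only so that 100 can turn up as either the smaller or the larger amount of its pair:]

```python
import random

rng = random.Random(1)
others = []                      # other-envelope amounts on rounds where we found 100
for _ in range(2_000_000):
    y = rng.randrange(1, 201)    # smaller amount; the 1..200 range is an assumption
    # one envelope gets y, the other 2y; open one of the two at random
    mine, other = (y, 2 * y) if rng.random() < 0.5 else (2 * y, y)
    if mine == 100:
        others.append(other)

p_200 = sum(o == 200 for o in others) / len(others)
avg_if_switch = sum(others) / len(others)
print(len(others), p_200, avg_if_switch)   # p_200 close to 0.5, average close to 125
```

This reproduces John's numbers: conditioned on finding 100, the other envelope holds 200 about half the time, and always switching on those rounds averages about 125.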
Daegs Inner circle USA 4283 Posts Posted: Mar 28, 2006 07:04 pm

To all: The problem is not to prove that it's 50/50; the problem is to disprove the calcs that show it's not...

By the way, I just thought of a deviously evil problem:

1: You are given two envelopes and told one is double the value of the other.
2: You are allowed to open one and see the value.
3: You are allowed to switch and open that envelope to see if you got the bigger or smaller.
4: The first envelope is then replaced with another that you are told contains either double or half of the value in your current envelope.
5: Repeat from step 3.

What is the best strategy in this game? (I believe the same paradox applies, except that while the low end will only approach 0, the high end approaches infinity, so you are always better off switching... This combines the St. Petersburg puzzle with the envelope problem... very very evil, yet without the problem of zeroing out right away as in St. Petersburg, whereas this is gradual and changes things.)

Interesting, since we know if it truly IS 50/50, then a typical run might go:

WLWL and thus 200-100-200-100
or LWWL and thus 50-100-200-100
or WLLW and thus 200-100-50-100

So it's obvious that it's 50/50, yet the chance of growing exponentially vs. only dropping by lower and lower amounts seems to make switching at all times a better strategy... yet it should be 50/50?

Bill Hallahan Inner circle New Hampshire 3220 Posts Posted: Mar 28, 2006 09:32 pm

Daegs, on page 1 you already disproved the equation that is in the second post in the topic (an equation that TomasB pointed out was invalid in that post). There is another way to refute that equation that applies to all the other problems here. The faulty logic is excluding possibilities. This is done by using a choice to start in the middle of a problem. This is not valid in itself. One has to consider the probabilities related to that choice in the context of the entire problem.
In other words, possibilities are not properly enumerated because of starting in a predetermined state.

For the envelopes, once a choice is made, while there is a 50% chance of ending in one of two states, there are four sequences to get to any end state. The choice makes it easy to miss that there are four possibilities, not two or three. These are:

1. You choose envelope A (X), you do not switch, obtaining X.
2. You choose envelope A (X), you switch, obtaining 2X.
3. You choose envelope B (2X), you do not switch, obtaining 2X.
4. You choose envelope B (2X), you switch, obtaining X.

Thus, the equation in the second post in the topic is incorrect because it posits two possibilities. There are four.

TomasB wrote:
Quote: I'll repeat John's numbers: "10127447 trials gave 10149 envelopes with 100 in, and in 5069 of these cases the other envelope contained the 200." That _actually_ means that if he never switched the times he got 100, his average would be 100. If he always switched his 100 his average would be 124.92 dollars. Anyone else care to simulate? Since I know the definitions of expected value and conditional probability I can't see any flaw in the way I'm using them, yet I know there has to be a flaw. But what is it?

The presumption that if he never switched, he'd get 100 isn't correct. One would expect that half the time he would choose the envelope with 200 before not switching. (Note, the presumption is that no knowledge of amounts is retained from trial to trial.) Or, it's that there are four possibilities, not two or three. If we presume a perfectly rational subject, they'll realize that the two choices they have are arbitrary. Thus each has an equal probability of 0.25 = 0.5 times 0.5.

1. 0.25 * X
2. 0.25 * 2X
3. 0.25 * 2X
4. 0.25 * X

The sum is (0.25 * 6)X, so the expected value is 1.5X. Also, the expected value is the same whether the subject switches envelopes or not. This makes sense, since it is the average of X and 2X.
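[Editor's note: Bill's four-way enumeration can be written down directly. A small sketch, with X set to 1 without loss of generality:]

```python
from fractions import Fraction

X = Fraction(1)                  # the smaller amount; 1 without loss of generality
# (probability, amount obtained) for the four equally likely sequences above:
outcomes = [
    (Fraction(1, 4), X),         # choose A (X), do not switch
    (Fraction(1, 4), 2 * X),     # choose A (X), switch
    (Fraction(1, 4), 2 * X),     # choose B (2X), do not switch
    (Fraction(1, 4), X),         # choose B (2X), switch
]
expected = sum(p * v for p, v in outcomes)
print(expected)                  # 3/2, i.e. 1.5X
```

Restricting the sum to the two "switch" rows, or to the two "do not switch" rows, and renormalizing gives the same 1.5X, matching Bill's point that switching changes nothing.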
Daegs, your last problem could be solved by enumerating all the possibilities. Most (all?) probability paradoxes I've seen use various tricks to hide some possibilities.

Humans make life so interesting. Do you know that in a universe so full of wonders, they have managed to create boredom. Quite astonishing. - The character of 'Death' in the movie "Hogswatch"

TomasB Inner circle Sweden 1143 Posts Posted: Mar 28, 2006 10:59 pm

Quote: On 2006-03-28 22:32, Bill Hallahan wrote: The presumption that if he never switched, he'd get 100 isn't correct.

If you open the envelope and find 100 and do _not_ switch, surely you can have nothing but 100? That's what the simulation was all about. You only studied the cases where the random envelope you picked actually had 100 in it, because you opened and checked it.

John, expected value is the sum of all possible outcomes weighted by their respective probabilities of happening. That's the definition. To calculate it you first need to decide _what_ you want to calculate the expected value of. You can't just say "expected value"; you should say "the expected value of the other envelope", or maybe "the expected gain if you switch", or "the expected sum of both envelopes". Otherwise it's impossible to know what you are talking about and trying to calculate.

Still, it's not about finding a way to show that it should not matter if you switch; you should find _what_ the flaw in the reasoning that tells you to switch is, _not_ find that it _is_ flawed, because we already know that.

/Tomas

Daegs Inner circle USA 4283 Posts Posted: Mar 29, 2006 01:21 am

Quote: Daegs, your last problem could be solved by enumerating all the possibilities.

Actually you'll run yourself into a corner if you try that... It's basically the St. Petersburg problem, only more evil... (St. Petersburg basically states that in a game of flipping a coin, you lose on tails but your winnings are doubled if you get heads...
the math works out such that, since it goes to infinity for all heads, you should bet any amount of money on the game no matter how slim the chance is that you win, due to never being able to spend infinity when you can gain infinity). It requires some math to prove otherwise.

I'd wager that this hybrid is even more evil, since you are still going to infinity, but as you go lower, you only lose fractions of cents and never reach 0... so it's even better to switch. So just on a 50/50 chance alone it would be in your best interest to switch; however, due to the envelope problem it becomes even more so (as you can easily see, losing twice from $100 leaves only $25, but winning twice gives $400).

As far as simulating... I'd need a better computer, as I get wildly varying results. Currently, with the average of 10k runs with 100 switches each: if you run the simulation with 1/2 the time getting .5 and 1/2 the time getting double, then your range of winnings jumps to around $95 - $3,355,549,005 (the highest average of 10,000 runs of 100 switches I saw). Most of the time it's a couple hundred thousand or a couple mil. If you change the number of switches to 10 (with the half/double structure) you get a couple of hundred on average (obviously this lowers the range).

Well, according to the simulation, it's practically always a good chance to switch... in fact I'd pay a couple thousand just to play.

TomasB Inner circle Sweden 1143 Posts Posted: Mar 29, 2006 03:19 am

I've found a way to point out the _exact_ flaw.

It is reasonable to assume that the envelopes are filled as follows: in one envelope an amount Y is put, which is uniformly distributed from 0 to Z. We do not know Z, but it is enough to know that such an upper bound must exist. In the other envelope Y/2 is always put.

Statements:

1. If I open my envelope and find X, there is either X/2 or 2X in the other envelope. (That is absolutely true.)
2. There is a 50/50 chance of each of these two cases.
(Not true. THAT IS THE FLAW. The probabilities are dependent on X in relation to Z.)

So what is the breakpoint for when the probability goes from 50/50 to 100/0? At X > Z/2 there is probability 1 of the other envelope having X/2 and probability 0 of the other envelope having 2X. At X <= Z/2 we have the 50/50 case.

So what is P(X <= Z/2) / P(X > Z/2)? There is only ONE way to get to each X that is bigger than Z/2, while there are twice as many ways to get to an X that is less than or equal to Z/2, since you could get it in an envelope either by the selected Y being X or by the selected Y being 2X. Twice as many ways. That means that the relative probability of having X <= Z/2 is 2/3, while the relative probability of X > Z/2 is only 1/3.

Now we are ready to write the expected value of the other envelope when we find X in the envelope we selected:

2/3 * (0.5 * X/2 + 0.5 * 2X) + 1/3 * (1 * X/2 + 0 * 2X) = X

That expression clearly shows the flaw in assuming 0.5 probability of finding X/2 and 2X in the other envelope. Note that I have not specified the boundary Z to get this result - I have only said that it exists.

*sigh of relief*

/Tomas
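[Editor's note: TomasB's key claim, that the probabilities depend on X in relation to Z, can be checked numerically. A sketch under assumed parameters (even-integer amounts so that halving stays exact; Z = 1000 and Z = 150 are arbitrary choices): given that we find 100, the other envelope holds 200 about half the time when Z is comfortably above 200, and never when Z < 200.]

```python
import random

def p_double_given_found(found, Z, trials, rng):
    """Estimate P(other envelope holds 2*found | we opened `found`).
    Y is drawn uniformly from the even amounts 2..Z, the envelopes
    hold (Y, Y/2), and we open one of the two at random."""
    hits = doubles = 0
    for _ in range(trials):
        y = 2 * rng.randrange(1, Z // 2 + 1)   # even amount in 2..Z
        mine, other = (y, y // 2) if rng.random() < 0.5 else (y // 2, y)
        if mine == found:
            hits += 1
            doubles += (other == 2 * found)
    return doubles / hits

rng = random.Random(3)
# Z well above 2*found: the naive 50/50 assumption really does hold here...
p_hi = p_double_given_found(100, Z=1000, trials=2_000_000, rng=rng)
# ...but with Z < 2*found the other envelope can only hold the half amount.
p_lo = p_double_given_found(100, Z=150, trials=200_000, rng=rng)
print(p_hi, p_lo)   # p_hi close to 0.5, p_lo exactly 0.0
```

So statement 1 is always true, while statement 2 silently assumes 2X can still fit under the bound, which is exactly where the switching argument breaks.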
All content & postings Copyright © 2001-2020 Steve Brooks. All Rights Reserved.