The Magic Café
Probability question in "Penny For Your Thoughts"

Scott Cram
Up in "Penny For Your Thoughts", there's a thread called Bank Night Probablities.

The central question is the true probabilities of Richard Osterlind's "Bank Night" presentation, as shown on Mind Mysteries 1. I haven't seen the routine myself (and can't find a video of it online anywhere), so I can't answer the question. There seems to be a debate about whether the "Monty Hall Paradox" applies to the probabilities.

If anyone here has seen it, and would like to comment on it, it would probably help a great deal.
ddyment
I looked at the presentation (after first answering the question incorrectly!), and it's definitely the Monty Hall Problem (more traditionally called the Three Prisoners Problem; the more recent name comes from a Marilyn Vos Savant column in Parade magazine some years back).

In Monty (or the Prisoners), there are only three objects from which to choose; in Osterlind's routine, there are five. But the mechanism is the same: the target subject gets to choose an envelope (a door); the person who knows the location of the object eliminates all but one of the alternatives; the target subject is offered the option of switching her original choice with the remaining alternative before the envelope (door) is opened (which, if the game were fair, would yield an 80% probability of winning, rather than her original 20%).
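As a sanity check on the fair-game claim above, here is a minimal Python sketch (my own, not anything from the routine) of a participant who always switches after the person who knows the prize's location eliminates every losing alternative but one; the envelope counts are just illustrative:

```python
import random

def switch_wins(n_envelopes, trials=100_000):
    """Estimate how often switching wins when the knowing party
    eliminates every losing envelope except one (fair-game assumption)."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(n_envelopes)
        choice = random.randrange(n_envelopes)   # participant's free choice
        # All envelopes except the participant's and one other are opened;
        # the prize is never among those opened.
        if prize != choice:
            remaining_other = prize
        else:
            remaining_other = random.choice(
                [e for e in range(n_envelopes) if e != choice])
        wins += (remaining_other == prize)       # the participant switches
    return wins / trials

for n in (3, 5, 100):
    print(n, round(switch_wins(n), 3))   # roughly 0.667, 0.8, 0.99
```

The simulated switch-win rate tracks (n-1)/n, i.e. 80% with five envelopes, exactly as described above.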
"Calculated Thoughts" now available at The Deceptionary :: Elegant, Literate, Contemporary Mentalism ... and More
M. H. Goodman
Yes, I agree; this is definitely comparable to the Monty Hall Problem. In his Bank Night routine, Richard has five envelopes, one of which he says contains a $100 bill. Four members of the audience each select an envelope, leaving Richard with an envelope for himself. Three of the four participants are then directed to open their envelopes, which are found to contain pieces of blank paper. This leaves Richard and a single audience member with unopened envelopes.

If the effect was presented as one in which Richard had no idea which envelope contained the money, and it was purely a matter of chance that the three eliminated envelopes happened to be ones containing paper, then both Richard and the remaining participant would each have a 50% chance of holding the envelope that contained the money.

However, this is not the scenario that is presented in the routine; the envelopes are clearly numbered, and Richard states that he himself put the bank note into one of them. In other words, he knows which envelope contains the money, and when he eliminates three of the envelopes, he does so in the knowledge that they are ones which contain blank paper.

Now, if one assumes that each participant has a fair choice (and we’ll come back to this point in a moment), then there is an 80% chance that the money will end up with one of the four participants. It therefore follows that, since Richard eliminates only envelopes which he knows do not contain the money, there is an 80% chance that the remaining participant has the winning envelope.

But as I noted earlier, this reasoning is based on the assumption that each participant has a fair choice. The subtext of the routine, however, is that Richard is somehow able to influence their choices. Depending on how adept he is at doing so, the probability of the remaining participant having the money could be anywhere between (and including) 0% and 80%. When viewed like this, one could justifiably claim that maybe the participant has only a 50% chance after all.
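To see how the fair-choice assumption drives the 80% figure, and how influence over the choices could push it downward, here is a minimal Python sketch of the routine as described above; the p_influence knob is purely hypothetical and simply steers every participant away from the money envelope:

```python
import random

def bank_night(trials=100_000, p_influence=0.0):
    """Simulate: four spectators each take one of five envelopes, the
    performer keeps the fifth, then opens three spectator envelopes he
    knows contain blank paper.  Returns how often the remaining
    (unopened) spectator holds the money."""
    wins = 0
    for _ in range(trials):
        envelopes = list(range(5))
        money = random.randrange(5)
        if random.random() < p_influence:
            # Hypothetical influence: nobody picks the money envelope.
            picks = random.sample([e for e in envelopes if e != money], 4)
        else:
            picks = random.sample(envelopes, 4)   # fair free choices
        # Open three spectator envelopes that are known to be blank.
        blanks = [e for e in picks if e != money]
        opened = random.sample(blanks, 3)
        remaining = [e for e in picks if e not in opened][0]
        wins += (remaining == money)
    return wins / trials

print(bank_night())                  # about 0.80 with fair choices
print(bank_night(p_influence=1.0))   # 0.0 if choices are fully controlled
```

Intermediate values of p_influence land anywhere between those two extremes, which is the 0% to 80% range mentioned above.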
Platt
I haven't seen the effect but it sounds like the Monty Hall Paradox (which isn't a true paradox). With just 5 envelopes it's very hard for most people to work out the true odds in their head. Even harder to work out with 3 (as in the case of monty hall). 50/50 seems to be the odds. However if you used many envelopes, say 100, it would become very obvious that there's a 99% chance the spectator's envelope has the bills and a 1% chance Richard's envelope has them.
ddyment
Platt's response is a good indication of how easily this problem confuses people.

In fact, the odds that are associated with the participant's envelope do not change from what they are initially. So in the five-envelope case, there is a 20% chance that the participant's envelope holds the reward; in the hundred-envelope case, the participant's envelope has a 1% chance of being lucky. Nothing the entertainer can do (short of exposing all the envelopes) will change this. What changes are the odds of the other remaining envelope (the one not exposed by the person who knows where the prize is). In the five-envelope case, this envelope has an 80% probability of being the winner; in the hundred-envelope case, a 99% chance.

So it's always advantageous (in a fair game) for the participant to exchange her original choice with the remaining unexposed one.
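In symbols, assuming a fair initial choice among n envelopes and a performer who knowingly eliminates n - 2 losing envelopes:

\[
P(\text{keep original envelope and win}) = \frac{1}{n}, \qquad
P(\text{switch and win}) = 1 - \frac{1}{n} = \frac{n-1}{n},
\]

so n = 5 gives 20% versus 80%, and n = 100 gives 1% versus 99%.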
"Calculated Thoughts" now available at The Deceptionary :: Elegant, Literate, Contemporary Mentalism ... and More
Platt
Quote:
On 2008-12-04 23:39, ddyment wrote:
Platt's response is a good indication of how easily this problem confuses people.

In fact, the odds that are associated with the participant's envelope do not change from what they are initially. So in the five-envelope case, there is a 20% chance that the participant's envelope holds the reward; in the hundred-envelope case, the participant's envelope has a 1% chance of being lucky. Nothing the entertainer can do (short of exposing all the envelopes) will change this. What changes are the odds of the other remaining envelope (the one not exposed by the person who knows where the prize is). In the five-envelope case, this envelope has an 80% probability of being the winner; in the hundred-envelope case, a 99% chance.

So it's always advantageous (in a fair game) for the participant to exchange her original choice with the remaining unexposed one.


Were you suggesting my response indicates I'm confused? If so, I'm curious why you thought that. I'm fascinated by the Monty Hall Paradox and actually understand it quite well. Perhaps I've misinterpreted what you said.
ddyment
I felt that Platt's description illustrated how confusing all this can be because he wrote, "... there's a 99% chance the spectator's envelope has the bills and a 1% chance Richard's envelope has them." In fact, there's a 99% chance that Richard's envelope would hold the money, and only a 1% chance that the spectator's does so.
"Calculated Thoughts" now available at The Deceptionary :: Elegant, Literate, Contemporary Mentalism ... and More
Platt
Quote:
On 2008-12-08 17:26, ddyment wrote:
I felt that Platt's description illustrated how confusing all this can be because he wrote, "... there's a 99% chance the spectator's envelope has the bills and a 1% chance Richard's envelope has them." In fact, there's a 99% chance that Richard's envelope would hold the money, and only a 1% chance that the spectator's does so.


Ahh, I actually don't know the effect. But assuming Richard is the equivalent of Monty Hall, that of course would be the case. The spectator obviously doesn't have a 99% chance of grabbing the 1 envelope in 100. And that's my point with using 100 unknowns. To best illustrate the true odds behind the MH paradox, you should increase the number of doors/envelopes. With 3 doors it's extremely difficult to deduce that you're trading in a 1 in 3 chance for a 2 in 3 chance. With 100 it's quite obvious that you're trading in the 1 in 100 odds for 99 in 100 odds.
S2000magician
Quote:
On 2008-12-04 23:39, ddyment wrote:
Platt's response is a good indication of how easily this problem confuses people.

In fact, the odds that are associated with the participant's envelope do not change from what they are initially. So in the five-envelope case, there is a 20% chance that the participant's envelope holds the reward; in the hundred-envelope case, the participant's envelope has a 1% chance of being lucky. Nothing the entertainer can do (short of exposing all the envelopes) will change this. What changes are the odds of the other remaining envelope (the one not exposed by the person who knows where the prize is). In the five-envelope case, this envelope has an 80% probability of being the winner; in the hundred-envelope case, a 99% chance.

So it's always advantageous (in a fair game) for the participant to exchange her original choice with the remaining unexposed one.

Unfortunately, this analysis, too, is flawed. The key is the sentence, "Nothing the entertainer can do . . . will change this."

There is, in fact, something the entertainer can do to change this: he gets to decide which spectator to address.

If one of the spectators has the notes, the entertainer is restricted in his choice: he must choose exactly those three spectators who don't have the notes and ask them to open their envelopes. However, if the entertainer has the notes, he has free choice over which spectators will open their envelopes and which spectator will not: four possibilities instead of only one. In that case - assuming that the entertainer is random in his choice of spectators to eliminate - there is only a 25% chance that a particular spectator will be left; in the restricted case there is a 100% chance of that particular spectator being left.

You're correct about there being only a 20% chance of the bills being in the remaining envelope, you just pointed to the wrong envelope: there's a 20% chance that they will be in the entertainer's envelope, and an 80% chance that they will be in the (remaining) spectator's envelope.

In contract bridge, this sort of situation is called restricted choice.


Posted: Dec 24, 2008 11:35pm
--------------------------------
Quote:
On 2008-12-08 17:26, ddyment wrote:
I felt that Platt's description illustrated how confusing all this can be because he wrote, "... there's a 99% chance the spectator's envelope has the bills and a 1% chance Richard's envelope has them." In fact, there's a 99% chance that Richard's envelope would hold the money, and only a 1% chance that the spectator's does so.

Nope: Platt had it right, assuming each spectator gets a free choice and the entertainer gets the (random) remaining envelope.
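Spelling out that restricted-choice argument for a specific spectator: let M be the event that her envelope holds the money and L the event that she is the one left unopened (these labels are mine, just for the computation), and assume the performer picks uniformly among the four spectators when he himself holds the money. Then

\[
P(M \mid L)
= \frac{P(L \mid M)\,P(M)}{P(L)}
= \frac{1 \cdot \tfrac{1}{5}}{1 \cdot \tfrac{1}{5} + \tfrac{1}{4} \cdot \tfrac{1}{5} + 0 \cdot \tfrac{3}{5}}
= \frac{\tfrac{1}{5}}{\tfrac{1}{4}}
= \frac{4}{5},
\]

where the three denominator terms cover the money being with that spectator, with the performer, or with one of the other three spectators. So the remaining spectator's envelope wins 80% of the time and the performer's only 20%.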
therntier
Quote:
On 2008-12-04 23:39, ddyment wrote:


So it's always advantageous (in a fair game) for the participant to exchange her original choice with the remaining unexposed one.


Just to be totally correct, there is an often overlooked property of these problems. It is only beneficial to switch if the person exposing the choices knows where the prize is. Otherwise, the probabilities are quite different.
Chris K
Knowledge doesn't change probabilities. Could you explain this some more? The only thing I can assume you are referring to is that the exposed choice cannot be the prize. If the prize is exposed (and out of play) the probability does, in fact, change but knowledge of the location, as long as the prize is not exposed, changes nothing.

Probability is not a function of knowledge.
therntier
I believe that knowledge does change the probability. You outlined a very good case in which it changes.

Imagine this scenario: Three people, without knowledge of where the prize is, choose one of three doors. You chose door 1. Door 2 is exposed and seen to have no prize. Do you want to switch? If you say yes, how about the person who chose door 3? Does he want to switch as well?

Knowledge does indeed matter.
narcoleptic_insomniac
Quote:
On 2009-01-17 20:04, therntier wrote:
I believe that knowledge does change the probability. You outlined a very good case in which it changes... Knowledge does indeed matter.


Yes, "knowledge" does matter in probability theory and can be dealt with via conditional probabilities.

Quote:
On 2009-01-17 20:04, therntier wrote:
Imagine this scenario: Three people, without knowledge of where the prize is, choose one of three doors. You chose door 1. Door 2 is exposed and seen to have no prize. Do you want to switch? If you say yes, how about the person who chose door 3? Does he want to switch as well?


This problem is similar to the Monty Hall problem, except it involves more than one player and it's unclear whether or not the host must first open a door without a prize (as in the MH problem)...
therntier
Quote:
On 2009-01-18 03:44, narcoleptic_insomniac wrote:
This problem is similar to the Monty Hall problem, except it involves more than one player and it's unclear whether or not the host must first open a door without a prize (as in the MH problem)...

My point is that, without knowledge, the host cannot guarantee that the door he opens does not have the prize. If he opens one and, by chance, it is empty, the problem has now changed.
narcoleptic_insomniac
Quote:
On 2009-01-21 17:11, therntier wrote:

My point is that, without knowledge, the host cannot guarantee that the door he opens does not have the prize. If he opens one and, by chance, it is empty, the problem has now changed.


Hmmm, when you put it that way the issue seems more philosophical than mathematical...

Anyways, I was just mentioning that in probability theory we deal with this kind of situation using conditional probabilities (e.g., let A be the event that the host opens door 2 and it shows no prize; let B be the event that door 1 contains the prize; find P(B|A), etc.).
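Carrying out that computation with the events A and B defined above (you picked door 1), for a host who opens door 2 blindly:

\[
P(B \mid A) = \frac{P(A \cap B)}{P(A)} = \frac{1/3}{2/3} = \frac{1}{2},
\]

so there is no gain from switching. For the standard Monty Hall host, who never opens your door or the prize door and flips a coin when both of the other doors are empty,

\[
P(B \mid A) = \frac{\tfrac{1}{3}\cdot\tfrac{1}{2}}{\tfrac{1}{3}\cdot\tfrac{1}{2} + \tfrac{1}{3}\cdot 0 + \tfrac{1}{3}\cdot 1} = \frac{1}{3},
\]

and switching to door 3 wins with probability 2/3.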
LobowolfXXX
Quote:
On 2009-01-09 15:41, Lemniscate wrote:
Knowledge doesn't change probabilities. Could you explain this some more? The only thing I can assume you are referring to is that the exposed choice cannot be the prize. If the prize is exposed (and out of play) the probability does, in fact, change but knowledge of the location, as long as the prize is not exposed, changes nothing.

Probability is not a function of knowledge.


For another example of the relevance of "knowledge," consider these two similar situations: Let's say you roll two dice, one that you can see, and the other that you can't see. We agree beforehand that on the first roll where the die you can see is a 6, we'll bet on whether the other die is also a 6. So you roll 9 times without seeing a 6 on the visible die; on the 10th roll, you get a 6. What are the odds that the hidden die is also a 6? 5-1 against.

Now let's say you roll two dice and you can't see either one, but I can (let's further say just for the sake of this demonstration that I'm honest). Just for fun, let's make them different colors; one is red, and the other is green. We agree that as soon as at least one of the dice is a 6, I'll show it to you, and we'll bet on whether the other die is also a 6. So you roll 9 times, and each time I tell you, "Nope, no 6's." On the tenth roll, I say, "Ah ha!" I produce the green die, which is, in fact, a 6! Bear in mind that according to the conditions of our contest, I'm guaranteed to stop you the FIRST time at least one of the dice is a 6. What are the odds the red die is also a 6? 10-1 against.
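A minimal Python sketch of the two betting scenarios (it conditions scenario two on "at least one six", which gives the 10-1 figure when the die that gets shown is chosen evenhandedly whenever both are sixes):

```python
import random

def scenario_one(trials=200_000):
    """Bet fires on the first roll where the visible die shows a 6;
    estimate the chance the hidden die is also a 6."""
    hits = total = 0
    while total < trials:
        visible, hidden = random.randint(1, 6), random.randint(1, 6)
        if visible == 6:
            total += 1
            hits += (hidden == 6)
    return hits / total            # about 1/6  (5-1 against)

def scenario_two(trials=200_000):
    """Honest observer stops you the first time at least one die is a 6;
    estimate the chance the other die is also a 6."""
    hits = total = 0
    while total < trials:
        red, green = random.randint(1, 6), random.randint(1, 6)
        if red == 6 or green == 6:
            total += 1
            hits += (red == 6 and green == 6)
    return hits / total            # about 1/11 (10-1 against)

print(round(scenario_one(), 3), round(scenario_two(), 3))
```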
"Torture doesn't work" lol
Guess they forgot to tell Bill Buckley.

"...as we reason and love, we are able to hope. And hope enables us to resist those things that would enslave us."
LobowolfXXX
Bayes's Theorem, which (oversimplified) compares the relative odds of two different events, is a simple way of thinking about the way in which knowledge (for all intents and purposes) "changes" probabilities. Let's say the odds are 40 million to 1 against my winning the Lottery (and, again, I'm unfailingly honest). I call you up and say, "Do you think I won the Lotto yesterday?" You say No; the odds are 40 million to 1 against it.

Let's further say the odds of a 40-year old bridge playing magician/lawyer being diagnosed with eye cancer at a random physical exam are 80 million to 1 against. You know that I bought a Lotto ticket yesterday, and also went to the doctor for a random physical exam. I call you up and say, "Hey, I either won the Lotto yesterday, or I got diagnosed with eye cancer. Do you think I won the Lotto yesterday?" You say Yes; the odds are 2-1 in favor of it.
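In odds form, using the round numbers above and taking the phone call to mean exactly one of the two events happened:

\[
\frac{P(\text{won the Lotto})}{P(\text{eye-cancer diagnosis})}
\approx \frac{1/40{,}000{,}000}{1/80{,}000{,}000} = 2,
\]

so the posterior odds are 2-1 in favor of the Lotto win, even though both events are wildly improbable on their own.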
"Torture doesn't work" lol
Guess they forgot to tell Bill Buckley.

"...as we reason and love, we are able to hope. And hope enables us to resist those things that would enslave us."
S2000magician
Quote:
On 2009-01-30 17:25, LobowolfXXX wrote:
For another example of the relevance of "knowledge," consider these two similar situations: Let's say you roll two dice, one that you can see, and the other that you can't see. We agree before hand that on the first roll where the die you can see is a 6, we'll bet on whether the other die is also a 6. So you roll 9 times without seeing a 6 on the visible die; on the 10th roll, you get a 6. What are the odds that the hidden die is also a 6? 5-1 against.

Now let's say you roll two dice and you can't see either one, but I can (let's further say just for the sake of this demonstration that I'm honest). Just for fun, let's make them different colors; one is red, and the other is green. We agree that as soon as at least one of the dice is a 6, I'll show it to you, and we'll bet on whether the other die is also a 6. So you roll 9 times, and each time I tell you, "Nope, no 6's." On the tenth roll, I say, "Ah ha!" I produce the green die, which is, in fact, a 6! Bear in mind that according to the conditions of our contest, I'm guaranteed to stop you the FIRST time at least one of the dice is a 6. What are the odds the red die is also a 6? 10-1 against.

In the second example you're omitting one crucial bit of information required to compute the odds at 10:1 against: how you decide which die to show when both show sixes. Your calculation of 10:1 against is correct only when you will produce the red die and the green die with equal probability when they both show sixes. If you are known to have a strong bias toward green - you will always show the green die in preference to the red when you have the choice - then the odds in the second case are 5:1 against (i.e., the probability is 1/6). If you are known to have a strong bias toward red - you will always show the red die in preference to the green when you have a choice - then the odds in the second case are (infinity):1 against (i.e., the probability is 0).
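One way to see all three cases at once: write q for the probability that the green die is shown when both dice are sixes (my notation, not from the post). The green die is produced with probability 5/36 + q/36, and

\[
P(\text{red} = 6 \mid \text{green shown}) = \frac{q/36}{5/36 + q/36} = \frac{q}{5+q},
\]

which gives 1/11 (10:1 against) for q = 1/2, 1/6 (5:1 against) for q = 1, and 0 for q = 0, matching the three cases above.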
LobowolfXXX
Good point; my example carried the unspoken assumption that I would choose either die with equal probability when double sixes come up. If that's the case, then it's 10-1 against.
"Torture doesn't work" lol
Guess they forgot to tell Bill Buckley.

"...as we reason and love, we are able to hope. And hope enables us to resist those things that would enslave us."
S2000magician
Quote:
On 2009-01-30 17:49, LobowolfXXX wrote:
Good point; my example carried the unspoken assumption that I would choose either die with equal probability when double sixes come up. If that's the case, then it's 10-1 against.

In bridge if you're following to a trick and you hold, say, both the queen and the jack in the suit led, you can play either one with equal effect (on that trick). Many beginners in that situation will always play the jack - the lower card. If this tendency is recognized by their opponents it can be used profitably: when both the queen and the jack are missing and that player contributes the queen, his opponent knows with certainty that he does not also hold the jack. The beginner learns to play the queen and jack randomly (with equal probability) in those situations; he learns quicker if he's playing for money.