wu :: forums (http://www.ocf.berkeley.edu/~wwu/cgi-bin/yabb/YaBB.cgi)
riddles >> medium >> Boxing Clever
(Message started by: THUDandBLUNDER on Jun 19th, 2004, 7:34am)

Title: Boxing Clever
Post by THUDandBLUNDER on Jun 19th, 2004, 7:34am
Assume I flip a fair coin n times until it comes up Heads. Now I take two large boxes, in one of which I place $3^n and in the other I place $3^(n+1). I then give you the two closed, seemingly identical boxes and allow you to open one box and count the money. At this point you can either take the money in the box you chose or take the unknown amount of money in the other box. Is there a strategy which you may employ in order to maximize your expected gain?

For example, if you open a box and see $3 you should always switch because there will definitely be $9 in the other box. However, if you open a box and see $9, switching might get you $3 or might get you $27.
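The setup is easy to experiment with. Here is a minimal sketch of the box-filling process (Python, with hypothetical helper names; not part of the original post):

```python
import random

def fill_boxes():
    """Flip a fair coin until it comes up heads; with n flips total,
    the boxes hold $3^n and $3^(n+1)."""
    n = 1
    while random.random() < 0.5:  # tails: keep flipping
        n += 1
    return [3 ** n, 3 ** (n + 1)]

boxes = fill_boxes()
random.shuffle(boxes)   # the two closed boxes look identical
opened, other = boxes   # you open one and count the money
```

With probability 1/2 the coin comes up heads immediately (n = 1) and the boxes hold $3 and $9.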


Title: Re: Boxing Clever
Post by rmsgrey on Jun 19th, 2004, 7:45am
It looks like the best strategy is ::[hide]always switch[/hide]:: though I'd want to go away and do some serious modelling before locking that in as my final answer...

Title: Re: Boxing Clever
Post by Grimbal on Jun 19th, 2004, 10:01am
I'd really like to play that game, since the expected prize is infinite!

Funnily, it seems it is always good to switch.  And this, whatever you see in the box.  So in fact, you can decide to switch before even looking inside.  And since it is always good to switch, it is still better to switch twice, which is the same as not switching.  Hm....  ???

Title: Re: Boxing Clever
Post by towr on Jun 20th, 2004, 7:11am
Switching only really seems to make sense if you count $3 in the first box, because that gives you information about what's in the other box. Otherwise either one could be the 'high' box.

Title: Re: Boxing Clever
Post by THUDandBLUNDER on Jun 20th, 2004, 7:31am
Deja vu, Grimbal?  

Hey, maybe this puzzle should be in Easy.


Quote:
Funnily, it seems it is always good to switch.
But trouble is, if you switch you will always be disappointed.   ???

(Welcome back, towr. Gee, that must be a really tough problem that Icarus is stuck on!)


Title: Re: Boxing Clever
Post by Leon on Jun 20th, 2004, 8:01am

on 06/20/04 at 07:11:18, towr wrote:
Switching only really seems to make sense if you count $3 in the first box, because that gives you information about what's in the other box. Otherwise either one could be the 'high' box.


If the box you pick is anything but $3, it's a 2/3 chance that you did pick the high box.

The reason I think so is:

If you pick a box with $27 in it, then the other box has either $9 (coin was flipped twice (T H), making $27 the +1 box) or $81 (coin was flipped 3 times (T T H), making $81 the +1 box).

Since the chance of the flips having been T H is 1/4, and the chance of the flips having been T T H is 1/8, then there is a 2:1 chance that the $27 you picked is the +1 box ( 1/4 : 1/8 ).
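Leon's 2:1 odds drop straight out of the coin-flip probabilities (an illustrative check, not from the thread; the 1/2 factor for which box of the pair you happened to open cancels):

```python
# Seeing $27 = 3^3 means either the pair was ($9, $27), from flips T H,
# or the pair was ($27, $81), from flips T T H.
p_n2 = 0.5 ** 2                  # probability the sequence was T H
p_n3 = 0.5 ** 3                  # probability the sequence was T T H
p_high = p_n2 / (p_n2 + p_n3)    # chance the $27 you hold is the high (+1) box
print(p_high)                    # 2/3: the $27 is the high box two times out of three
```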

The part I can't figure out is the "maximize your expected gain" and the comment "you will always be disappointed", since even with the 2/3 chance that you have the highest amount already, the expectation of the 1/3 chance of gaining ($27 to $81) less the 2/3 chance of losing ($27 to $9) is still positive.

Personally, if it was higher than $243 (or $729 maybe), I'd stay since there is only a 6.25% chance that the $243 box is the $3^n box (and therefore lower than the other box). But I suppose that technically speaks more to my utility of having $243 in hand vs. losing $162 or gaining $486 than the "maximization" of the gain.




Title: Re: Boxing Clever
Post by towr on Jun 20th, 2004, 8:11am

on 06/20/04 at 07:31:28, THUDandBLUNDER wrote:
(Welcome back, towr. Gee, that must be a really tough problem that Icarus is stuck on!)
welcome back? I don't think I've missed a day here for months :P Well, maybe at weekends, but not recently.

Who I'm really wondering about is James Fingas; he's been missing since New Year...

Title: Re: Boxing Clever
Post by towr on Jun 20th, 2004, 8:19am

on 06/20/04 at 08:01:58, Leon wrote:
If the box you pick is anything but $3, it's a 2/3 chance that you did pick the high box.
There are only two boxes. The chance you picked the high one when it isn't a 3 is infinitesimally (basically not) higher than picking the low one.
Does knowing what is in one of the boxes give you any information about what's in the other, given you have a pair of preselected boxes?

Title: Re: Boxing Clever
Post by Leon on Jun 20th, 2004, 8:30am

on 06/20/04 at 08:19:15, towr wrote:
The chance you picked the high one when it isn't a 3 is infinitesimally (basically not) higher than picking the low one.


Simplifying a bit, I read: The chance you picked the +1 box is no higher than picking the low one.

By this you mean that it's 50 - 50? I agree with that statement before you have opened it.


on 06/20/04 at 08:19:15, towr wrote:
Does knowing what is in one of the boxes give you any information about what's in the other, given you have a pair of preselected boxes?


Given that the $ put into the boxes was methodically determined, I think so. Why isn't it more likely the one you picked is the +1 box? If it's not, why did T&B say you'll always be disappointed by switching?



Title: Re: Boxing Clever
Post by towr on Jun 20th, 2004, 8:52am
There is a problem with both 'always switch' and 'never switch', because if either is true, then it doesn't matter what's actually in the box; it doesn't help to see it.
'Never switch' has the added problem that you obviously should switch if you get $3 in the first box.

I'm still not quite sure what the difference is with the other thread..

Title: Re: Boxing Clever
Post by Grimbal on Jun 20th, 2004, 3:33pm
The difference is that in the other thread the possible amounts and the probability of these amounts being in the boxes is not specified.

The probability that you have the high box is not 50% any more after you have looked inside, because the amount inside gives you some information.  If you see $3, you know it is not the high box.  If you see any other number 3^n, that number could be the lower of the pair n or the higher of the pair n-1, but since the pair n-1 is twice as likely, it is more likely that it is the high value of pair n-1.

But I don't really see how it can be that it is always good to switch, regardless of what is inside.

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 20th, 2004, 8:57pm

Quote:
I'm still not quite sure what the difference is with the other thread...

I assume the other thread referred to is ENVELOPE GAMBLE I (http://www.ocf.berkeley.edu/~wwu/cgi-bin/yabb/YaBB.cgi?board=riddles_hard;action=display;num=1027910317;start=0)?

I think the only real difference here is that you know, in this game, what the absolute lowest amount of money in the box can be, namely "3 dollars", so if you see that amount, always switch...

Otherwise, every other amount you see, you don't know if you are looking at the "x" in the one box or the "3x" in the other box. The exponents are just a fancy way of saying the second box has 3 times the amount in the first box. The puzzle is otherwise the same: expected gain from switching is 0.5 (3x-x) + 0.5 (x-3x) = 0.

Title: Re: Boxing Clever
Post by Leon on Jun 20th, 2004, 9:28pm

on 06/20/04 at 20:57:44, pedronunezmd wrote:
The puzzle is otherwise the same: expected gain from switching is 0.5 (3x-x) + 0.5 (x-3x) = 0.


Don't you mean 0.5 (3x-x) + 0.5(x/3 - x) = 2x/3? Using $27 as the box you opened, you will either lose $18 (to $9) or gain $54 (to $81).

0.5 ($81 - $27) + 0.5 ($9 - $27) = $18 ($27 x 2/3).

Why do you feel that the way T&B populated the boxes does not come into play after you know how much is in a given box?

Even then though, the expected outcome is still positive: 1/3 (3x-x) + 2/3(x/3 - x) = 2x/9

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 20th, 2004, 9:35pm

on 06/20/04 at 21:28:39, Leon wrote:
Don't you mean 0.5 (3x-x) + 0.5(x/3 - x) = 2x/3?

No, I definitely don't mean that. See other thread for long discussion regarding labelling the boxes this way.


on 06/20/04 at 21:28:39, Leon wrote:
Why do you feel that the way T&B populated the boxes does not come into play after you know how much is in a given box?

Once the boxes have been populated, if you open the box and see $3, you know you have the low box, so you switch. If you open the box and find anything other than $3, then you don't know if you have the $x from the ($x, $3x) possibility, or the $3x from the ($x, $3x) possibility.

Title: Re: Boxing Clever
Post by Leon on Jun 20th, 2004, 9:43pm

on 06/20/04 at 21:35:18, pedronunezmd wrote:
No, I definitely don't mean that. See other thread for long discussion regarding labelling the boxes this way.


Fair enough, sorry for putting words into your mouth.

But your equation doesn't work since the expected outcome isn't always $0. This is shown with an example.

If I open the $27 box and assume there is a 50% chance the other box is $9 and a 50% chance the other box is $81, then the expected outcome is 0.5 x $18 loss + 0.5 x $54 gain = $18.

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 20th, 2004, 9:48pm
:P

What you are trying to do is label the box that you have now counted as containing x dollars, and then saying that there is a 50% chance that the other box contains x/3 dollars and a 50% chance that the other box contains 3x dollars. This is not a correct way to analyze the problem.

The correct way is to realize that you are either in possession of the x or the 3x box from a set of boxes which are labeled x and 3x. Do the calculations from there.

Title: Re: Boxing Clever
Post by BNC on Jun 20th, 2004, 10:20pm

on 06/20/04 at 21:48:50, pedronunezmd wrote:
:P

What you are trying to do is label the box that you have now counted as containing x dollars, and then saying that there is a 50% chance that the other box contains x/3 dollars and a 50% chance that the other box contains 3x dollars. This is not a correct way to analyze the problem.


And why is that "not a correct way"? Do you have a reason?

I can present you with another game (you tell me if it's similar). I'll flip a fair coin. If it's heads, you give me $6. If tails, I'll give you $18. Want to play?

Title: Re: Boxing Clever
Post by towr on Jun 21st, 2004, 1:08am
Given a pair of boxes ($3^n, $3^(n+1)),
you pick one of them, and based on its value decide whether you switch.
If you picked $3, then you know which of the two you have; otherwise you don't. You might have picked $3^n, in which case you want to switch, or you might have picked $3^(n+1), in which case you wouldn't want to switch. But it's not like there's a chance $3^(n-1) or $3^(n+2) are involved, because there are only two given boxes.

Title: Re: Boxing Clever
Post by BNC on Jun 21st, 2004, 2:33am
It is becoming a duplicate thread, is it not?  ::)

You opened a box, finding $3^n. Therefore, you know there were at least n tails. Now the question is: was the next one heads or tails (50/50)? If heads, you "shouldn't switch", but the penalty for switching is $2·3^(n-1). If tails, you should switch, gaining $2·3^n. Your gain (50%) is larger than your possible loss (50%), hence you should switch.

Title: Re: Boxing Clever
Post by towr on Jun 21st, 2004, 3:11am
But then you should switch regardless of what is in the box, so you needn't look in the box before deciding to switch, which doesn't make any sense as both boxes are then equivalent.

Title: Re: Boxing Clever
Post by BNC on Jun 21st, 2004, 3:19am

on 06/21/04 at 03:11:56, towr wrote:
But then you should switch regardless of what is in the box, so you needn't look in the box before deciding to switch, which doesn't make any sense as both boxes are then equivalent.


Yes, indeed.

I'll repeat something I wrote in the other thread. I'm not sure if it's true, but some years ago, when I took a basic statistics course at the uni, the teaching assistant told us that this is a recognized paradox.


Title: Re: Boxing Clever
Post by pedronunezmd on Jun 21st, 2004, 6:33am

on 06/20/04 at 22:20:09, BNC wrote:
I can present you with another game (you tell me if it's similar). I'll flip a fair coin. If it's heads, you give me $6. If tails, I'll give you $18. Want to play?

Yes, I will play this game, and in fact, I will play this game with a $12 bet. That is the expected payoff of this game, so I would play for $12 and expect to break even in the long run.


on 06/21/04 at 03:19:25, BNC wrote:
I'll repeat something I wrote in the other thread. I'm not sure if it's true, but some years ago, when I took a basic statistics course at the uni, the teaching assistant told us that this is a recognized paradox.

I agree that this is becoming a duplicate thread. However, I disagree that this is a paradox. There is a flaw in labeling one box as containing x and saying the other box has a 50% chance of containing 3x or x/3. It is not a paradox if the reasoning is flawed.

Title: Re: Boxing Clever
Post by towr on Jun 21st, 2004, 8:14am
It's a paradox in that it's seemingly contradictory, not actually contradictory.
You can, for instance, get an answer from simulation, or by using an appropriate Markov model.
In more classical analysis the problem lies in asking the right question, because otherwise you won't get the right answer.

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 21st, 2004, 11:05am

on 06/21/04 at 08:14:26, towr wrote:
It's a paradox in that it's seemingly contradictory, not actually contradictory.

I have never heard of that definition of a paradox. "Seemingly contradictory" when you use incorrect reasoning is better called "incorrect" rather than "paradox". A paradox is better defined as a seemingly contradictory statement derived from otherwise acceptable premises.

For example, the riddle where it is "proved" that "1 = 0" is not really a paradox; it is just an invalid proof derived from unacceptable premises.

Title: Re: Boxing Clever
Post by Leon on Jun 21st, 2004, 11:14am

on 06/20/04 at 21:43:59, Leon wrote:
But your equation doesn't work since the expected outcome isn't always $0. This is shown with an example.

If I open the $27 box and assume there is a 50% chance the other box is $9 and a 50% chance the other box is $81, then the expected outcome is 0.5 x $18 loss + 0.5 x $54 gain = $18.


What about the above is incorrect?

Title: Re: Boxing Clever
Post by towr on Jun 21st, 2004, 11:38am

on 06/21/04 at 11:05:07, pedronunezmd wrote:
I have never heard of that definition of a paradox.
http://www.m-w.com/cgi-bin/dictionary?va=paradox (http://www.m-w.com/cgi-bin/dictionary?va=paradox)


Title: Re: Boxing Clever
Post by BNC on Jun 21st, 2004, 3:18pm

on 06/21/04 at 06:33:33, pedronunezmd wrote:
Yes, I will play this game, and in fact, I will play this game with a $12 bet. That is the expected payoff of this game, so I would play for $12 and expect to break even in the long run.


So how is it different from opening the box to find $9? You still have a 50% chance of the other containing $3, and 50% of $27.

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 21st, 2004, 4:42pm
Towr, there is a difference between a paradox that is seemingly contradictory (derived from valid assumptions) yet perhaps, though not provably, true, and an incorrect statement that is obviously wrong. If a 5-year-old told you that "4+4=9", that is not a paradox, that is just wrong. It is not "seemingly contradictory" as in a paradox because it is just plain "wrong". (Please don't ask me to prove that "4+4=9" is wrong.) Probably the better term for the description of this riddle is "counterintuitive" rather than "paradox"?

BNC, that reasoning (50% chance that the other box is x/3 or 3x) is wrong for the same reasons as in ENVELOPE GAMBLE I (http://www.ocf.berkeley.edu/~wwu/cgi-bin/yabb/YaBB.cgi?board=riddles_hard;action=display;num=1027910317;start=0). I didn't do a good job explaining it there, so I probably won't do a good job explaining it here, but it is wrong, that I am certain.

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 21st, 2004, 4:46pm
Uh-oh, my apologies to BNC. I just re-looked at the posts in Envelope Gamble 1 and see that you were quite involved in that thread too. Sorry for insinuating you should read that thread...  :-[

Title: Re: Boxing Clever
Post by Leonid Broukhis on Jun 21st, 2004, 6:49pm

on 06/21/04 at 02:33:36, BNC wrote:
It is becoming a duplicate thread, is it not?  ::)

You opened a box, finding $3^n. Therefore, you know there were at least n tails. Now the question is: was the next one heads or tails (50/50)? If heads, you "shouldn't switch", but the penalty for switching is $2·3^(n-1). If tails, you should switch, gaining $2·3^n. Your gain (50%) is larger than your possible loss (50%), hence you should switch.



No. Therefore you know there were exactly n or n+1 tails. Now the question is: was the next one heads, or the next two "tails, heads" (2/1).  

Title: Re: Boxing Clever
Post by BNC on Jun 21st, 2004, 9:43pm

on 06/21/04 at 18:49:33, Leonid Broukhis wrote:
No. Therefore you know there were exactly n or n+1 tails. Now the question is: was the next one heads, or the next two "tails, heads" (2/1).  


Aha, that makes a lot of sense! (I only wish it would apply to the original problem -- I'll have one less itching problem on my mind.)

Title: Re: Boxing Clever
Post by towr on Jun 22nd, 2004, 12:45am

on 06/21/04 at 16:42:27, pedronunezmd wrote:
Towr, there is a difference between a paradox that is seemingly contradictory (derived from valid assumptions) yet is perhaps but not proven to be true, and an incorrect statement that is obviously wrong.
What you find obvious might not be obvious to someone else. And so what is a paradox to someone else might not be one to you. Things can simply seem different to different people.

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 22nd, 2004, 4:01am

on 06/22/04 at 00:45:06, towr wrote:
What you find obvious might not be obvious to someone else. And so what is a paradox to someone else might not be one to you. Things can simply seem different to different people.

Towr, then I find your last post to be [s]wrong[/s] a paradox.  :P

Seriously, though, I think I know what you mean. By that definition, something is a paradox to anyone that can't explain the apparent contradiction.

Title: Re: Boxing Clever
Post by rmsgrey on Jun 22nd, 2004, 6:25am
The major difference between this thread and the Envelope Gamble is that the envelope gamble requires you to decide whether to switch before opening the envelope, while this thread makes you open the box first and then decide whether to switch. Opening the box provides information you didn't have before opening it.

Title: Re: Boxing Clever
Post by THUDandBLUNDER on Jun 22nd, 2004, 6:54am
Well, the question was, "Is there a strategy which you may employ in order to maximize your expected gain?"
As the expected payout is infinite using any strategy, I guess the answer is 'No'.

How about maximising our chance of choosing the box containing the higher amount? While it's true that the probability of picking the box with the larger amount is 1/2, the conditional probability that the other box contains more money, given that the first box contained x dollars is 1 if x = 3 and 1/3 otherwise. That is, ignoring the case x = 3, there will be a 1/3 chance the other box contains 3 times more and a 2/3 chance it contains 3 times less. This means that, if we switch when we see $3 and stick otherwise, we will have no cause for regret 75% of the time.


Title: Re: Boxing Clever
Post by towr on Jun 22nd, 2004, 7:26am
How do you get 75%? You get less 66.7% of the time, and probably regret that each time, and not the other 33.3%.

Title: Re: Boxing Clever
Post by Leon on Jun 22nd, 2004, 7:56am

on 06/22/04 at 07:26:14, towr wrote:
You get less 66.7% of the time, and probably regret that each time, and not the other 33.3%.


You are right. :o My prior equation was backwards. 1/3 of the time you will have the higher amount and therefore lose when you switch. 2/3 of the time you will have the lower amount and therefore gain when you switch.

You lose 2/3 of the amount (e.g. $27 -> $9) or gain 2 times the amount ($27 -> $81).

1/3 x -2/3 + 2/3 x 2 = 10/9 expected gain.

If you open the $27 box, you have a 1/3 chance of losing $18 and a 2/3 chance of gaining $54. Expected gain is $30 ($57 expected total in hand).

So, ignoring your personal utility, you should switch every time.

Title: Re: Boxing Clever
Post by THUDandBLUNDER on Jun 22nd, 2004, 8:01am

on 06/22/04 at 07:26:14, towr wrote:
How do you get 75% You get less 66.7% of the time and probably regret that each time, and not the other 33.3%

If we observe $3 in the 1st box, the conditional probability that the other box contains the larger amount is 1.
So we definitely switch.

If we observe something other than $3 in the 1st box, the conditional probability that the other box contains the larger amount is 1/3. So we don't switch, and we get the larger amount 2/3 of the time.

So 1/4 of the time we see $3 and switch to get a guaranteed larger amount.
And 3/4 of the time we see something else and don't switch.
2/3 of this 3/4 of the time, and so 1/2 of the time, we get it right and get the larger amount.

So in total, 1/4 + 1/2 = 3/4 of the time we get the larger amount.  
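This 3/4 figure is easy to confirm by simulation (an illustrative sketch, not from the thread):

```python
import random

def ends_with_larger_box():
    """One game with the 'switch only on $3' strategy; True if we keep the larger box."""
    n = 1
    while random.random() < 0.5:   # flip until heads
        n += 1
    boxes = [3 ** n, 3 ** (n + 1)]
    random.shuffle(boxes)
    opened, other = boxes
    kept = other if opened == 3 else opened
    return kept == max(boxes)

random.seed(1)
trials = 100_000
wins = sum(ends_with_larger_box() for _ in range(trials))
print(wins / trials)   # hovers around 0.75
```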


Title: Re: Boxing Clever
Post by rmsgrey on Jun 22nd, 2004, 8:37am
So if I open the box with 9 in it 3 times, and always switch, I end up with 27+3+3 = 33 > 27. If I never switch, I end up with 9+9+9 = 27 < 33.

Given that I know I've opened the box with 9 in it (for some value of 9 ≠ 3), it looks like simple conditional probability. Before I look in the box, I have one of two boxes, one of which contains x, taken from distribution X, and the other contains y, taken from identical distribution Y, non-independently, so that either x = 3y or y = 3x. At that point, the two boxes are completely equivalent. Once I open one, it has a definite value, n, and the other has a value z, taken from a distribution Z_n where Z_3 is constant {9} and otherwise Z_n is 2:1 over {n/3, 3n}.

Opening the box makes a big difference to the situation - which is where the apparent paradox comes from.

Title: Re: Boxing Clever
Post by THUDandBLUNDER on Jun 22nd, 2004, 9:08am

Quote:
Opening the box makes a big difference to the situation - which is where the apparent paradox comes from.

Yes, trying to calculate the expected gain from switching for a given observed X is actually a calculation of the conditional expectation of a random variable with infinite expectation, whereas conditional expectation is defined only for those random variables with a finite expectation.



Title: Re: Boxing Clever
Post by Leon on Jun 22nd, 2004, 9:13am

on 06/22/04 at 07:56:19, Leon wrote:
You are right. My prior equation was backwards. 1/3 of the time you will have the higher amount and therefore lose when you switch. 2/3 of the time you will have the lower amount and therefore gain when you switch.

You lose 2/3 of the amount (e.g. $27 -> $9) or gain 2 times the amount ($27 -> $81).

1/3 x -2/3 + 2/3 x 2 = 10/9 expected gain.

If you open the $27 box, you have a 1/3 chance of losing $18 and a 2/3 chance of gaining $54. Expected gain is $30 ($57 expected total in hand).

So, ignoring your personal utility, you should switch every time.


So is the above not valid? Is it the wrong methodology, i.e. should I use conditional probability instead?

.....looking to the sky to see if it will split open......

Title: Re: Boxing Clever
Post by Sameer on Jun 22nd, 2004, 9:23am
This looks a lot like the Monty Hall problem: http://mathworld.wolfram.com/MontyHallProblem.html

Title: Re: Boxing Clever
Post by Grimbal on Jun 22nd, 2004, 12:09pm
For what it is worth, I made a simulation.  I ran the game 10^9 times.  The boxes are selected as prescribed (flipping a coin), and mixed (switched 1/2 of the time).  The player looks at the box and maybe switches.  I compute the average prize if the player only switches when he sees $3, and the average gain if instead he switches always.

I got an average prize without switching of 1948201 and an average gain of switching of 845978.  But these numbers don't mean much since the sums don't seem to converge.  At 5*10^8, I was at 934812 average prize and -28236 average gain for switching.

Printing out the averages every 10^6 games, I can see that the values decrease slowly, but sometimes huge values have a higher impact than all previous values. And that keeps happening as the run goes on.

The conclusion is that it is certainly an interesting game, but the average prize cannot be computed. If you play a lot, whether on average it was right to switch is dominated by one huge value that can be positive or negative.
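A stripped-down version of such a simulation (my own sketch, not Grimbal's code) shows why the running average misbehaves: the gain from always switching is ±2·3^n, and the rare games with large n swamp everything played before them:

```python
import random

def switch_gain():
    """Gain (possibly negative) from always switching, for one random game."""
    n = 1
    while random.random() < 0.5:   # flip until heads
        n += 1
    boxes = [3 ** n, 3 ** (n + 1)]
    random.shuffle(boxes)
    opened, other = boxes
    return other - opened          # +2*3^n if you held the low box, -2*3^n if the high

random.seed(0)
games = 1_000_000
total = sum(switch_gain() for _ in range(games))
# The running average total/games never settles: every so often a long run
# of tails produces a 3^n term larger than the whole sum so far.
print(total / games)
```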

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 22nd, 2004, 4:35pm
After running a few simulations myself, I see now what you mean. Yes, the results seem to always be dominated by 1 very very large gain or loss, which sways the grand total over all the other trials combined. Yikes. Computer simulation may not be much help here.

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 22nd, 2004, 7:51pm

on 06/22/04 at 08:37:18, rmsgrey wrote:
So if I open the box with 9 in 3 times, and always switch, I end up with 27+3+3=33>27. If I never switch, I end up with 9+9+9=27<33.

Given that I know I've opened the box with 9 in (for some value of 9[ne]3), it looks like simple conditional probability. Before I look in the box, I have one of two boxes, one of which contains x, taken from distribution X, and the other contains y, taken from identical distribution Y, non-independently, so that either x=3y or y=3x. At that point, the two boxes are completely equivalent. Once I open one, it has a definite value, n, and the other has a value z, taken from a distribution Zn where Z3 is constant {9} and otherwise Zn is 2:1 over {n/3,3n}.

Opening the box makes a big difference to the situation - which is where the apparent paradox comes from.

So if I understand you correctly, rmsgrey, the way you would play the game is to always open the box chosen, count the money, but then switch at that point to the other box? And if I further understand you correctly, if you counted n dollars in the first box, you expect, on average, to see 11n/9 dollars in the other box?

Title: Re: Boxing Clever
Post by rmsgrey on Jun 23rd, 2004, 7:03am
That sounds about right

Title: Re: Boxing Clever
Post by towr on Jun 23rd, 2004, 7:24am
And you'd expect that to be the case both the 50% of the time when you choose box A (and switch to B) and the 50% of the time you choose box B (and switch to A)?

Title: Re: Boxing Clever
Post by towr on Jun 23rd, 2004, 7:45am
Would we still have the same problem if instead of having
3^n and 3^(n+1) (dollars) in the boxes for n flips we had 0.5^n and 0.5^(n+1) (kg gold) in the boxes?
You'd still have a supposed positive expected value for switching, 2/3·2n + 1/3·0.5n = (3/2)n, but at least the expected value of the game wouldn't be infinite any more (which makes analysing it much easier).
And the same arguments seem to apply.
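One thing that does change in the gold variant is that the overall expectation of the game becomes finite, which can be checked by summing over n (a quick numeric sketch, assuming the pair (0.5^n, 0.5^(n+1)) kg occurs with probability (1/2)^n):

```python
# E[amount in a randomly chosen box]
#   = sum over n >= 1 of (1/2)^n * (0.5^n + 0.5^(n+1)) / 2
#   = (3/4) * sum over n of (1/4)^n = (3/4) * (1/3) = 1/4 kg
ev = sum(0.5 ** n * (0.5 ** n + 0.5 ** (n + 1)) / 2 for n in range(1, 200))
print(ev)   # ~0.25: the game's total expectation now converges
```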

Title: Re: Boxing Clever
Post by rmsgrey on Jun 23rd, 2004, 8:02am

on 06/23/04 at 07:24:33, towr wrote:
And you'd expect that to be the case both the 50% of the time when you choose box A (and switch to B) and the 50% of the time you choose box B (and switch to A)?

But since any given value apart from 3 is more likely to come up in box B than box A, I expect to be switching from box B to box A 2/3 of the time that I don't open a box holding 3 (when I'm always switching from box A to box B)

The key point is that, having opened a box, I change the information I base my choice on. The fact is that before opening the box switching is neutral, but, in all cases, once the box is opened, switching is positive. Compare with Simpson's Paradox.

The fractional variant shouldn't change the end result - again switching is neutral before you open the box and positive afterwards.

Title: Re: Boxing Clever
Post by Leon on Jun 23rd, 2004, 8:12am

on 06/23/04 at 07:45:08, towr wrote:
Would we still have the same problem if instead of having
3^n and 3^(n+1) (dollars) in the boxes for n flips we had 0.5^n and 0.5^(n+1) (kg gold) in the boxes?
You'd still have a supposed positive expected value for switching, 2/3·2n + 1/3·0.5n = (3/2)n, but at least the expected value of the game wouldn't be infinite any more (which makes analysing it much easier).
And the same arguments seem to apply.


0.5^(n+1) is smaller than 0.5^n, whereas 3^(n+1) is larger than 3^n, so it's not the same expected value.

Title: Re: Boxing Clever
Post by towr on Jun 23rd, 2004, 8:38am

on 06/23/04 at 08:12:39, Leon wrote:
0.5^(n+1) is smaller than 0.5^n, whereas 3^(n+1) is larger than 3^n, so it's not the same expected value.
I never implied it was, only that switching would lead to a higher expected value for a given n.


on 06/23/04 at 08:02:09, rmsgrey wrote:
But since any given value apart from 3 is more likely to come up in box B than box A
How do you figure that?
Either box is equally likely to be chosen, so which has label A or B doesn't matter.

n  | boxes | only switch on 3 | always switch
1  | 3 , 9 |         9        |       6  
2  | 9 ,27 |        18        |      18  
3  |27 ,81 |        54        |      54
etc
So you should only switch on $3, and you'll have $3 higher expected outcome for the game (but infinity +3 is infinity)

Title: Re: Boxing Clever
Post by Leon on Jun 23rd, 2004, 8:59am

on 06/23/04 at 08:38:53, towr wrote:
I never implied it was, only that switching would lead to a higher expected value for a given n.



on 06/23/04 at 08:38:53, towr wrote:
supposed positive expected value for switching, 2/3·2n + 1/3·0.5n = (3/2)n


But you used the same probabilities, and they are reversed for 0.5 vs. 3.

With 3^n you have a 2/3 chance of gaining and a 1/3 chance of losing, since 3^n is 2:1 more likely than 3^(n+1), and thus switching from n to n+1 is a gain, and vice versa.

With 0.5^n you have a 1/3 chance of gaining and a 2/3 chance of losing, since 0.5^n is 2:1 more likely than 0.5^(n+1), and thus switching from n to n+1 is a loss, and vice versa.

Title: Re: Boxing Clever
Post by towr on Jun 23rd, 2004, 9:03am
2X = 0.5^(-1)·X (X is the amount you've found in the first box; I probably shouldn't have called it n before)

So you'd expect 2/3 · 0.5^(-1)·X + 1/3 · 0.5^(+1)·X = (3/2)X
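The 2/3 : 1/3 split behind that formula comes out of the flip probabilities, just as in the dollar version (an illustrative check, not from the thread; the 1/2 factor for which box of the pair you opened cancels):

```python
k = 3                              # suppose you see X = 0.5^3 = 0.125 kg
w_low_pair  = 0.5 ** (k - 1)       # pair n = k-1: X is its 0.5^(n+1) box, other holds 2X
w_high_pair = 0.5 ** k             # pair n = k:   X is its 0.5^n box, other holds X/2
p_double = w_low_pair / (w_low_pair + w_high_pair)
X = 0.5 ** k
e_other = p_double * (2 * X) + (1 - p_double) * (X / 2)
print(p_double, e_other / X)       # 2/3 and 3/2, matching the formula above
```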

Title: Re: Boxing Clever
Post by Leon on Jun 23rd, 2004, 9:43am

on 06/23/04 at 09:03:11, towr wrote:
2X = 0.5^(-1)·X (X is the amount you've found in the first box; I probably shouldn't have called it n before)

So you'd expect 2/3 · 0.5^(-1)·X + 1/3 · 0.5^(+1)·X = (3/2)X


I know I must be trying your patience....

In the 0.5 version, do you think there is a 2/3 chance you gain or lose when you switch?

If you open the box and it has $0.125 (1/8), there is a 2/3 chance you have the T T H box. The other box would be $0.0625 (the T T T H box). If you switch in this case you lose $0.0625 (2/3 × -$1/16).

There is also a 1/3 chance you are holding the n+1 box, which would make the other box $0.25, since flipping T H is twice as likely as flipping T T H. If you switch you gain $0.125 (1/3 × $1/8).

2/3 x -$1/16 + 1/3 x $1/8 =  -2/48 + 1/24 = 0


You have used 2/3 chance of gain and 1/3 chance of loss.

If any of my assumptions are wrong, let me know and I'll stop bugging you.

Thanks.


Title: Re: Boxing Clever
Post by pedronunezmd on Jun 23rd, 2004, 10:32am
I know that "how you get to the answer" is more important than "what is the answer" but...

In the 3^n and 3^(n+1) game, I have calculated the following as a fair way to play:

1. The coin is flipped n times until heads comes up and the boxes are filled.
2. You choose randomly between the two boxes.
3. You open your box and count d dollars.
4. Once you have counted d dollars, you must continue playing. You are not able to void this trial and refuse to pay.
5. You now get to decide whether to keep the d dollars or else switch boxes and keep whatever is in the other.
6. You pay the cost to play:
    6a. If d = 3, you pay $15 to play this round, but of course,
         you switched to the other box this round.
    6b. If d > 3, you pay whatever is in the box that
         you did not keep, to play this round.

Expected outcome of game, using this cost method: $0, which means this is a fair game. Any bet that requires you to pay less than this stated method will result in a game favorable to the player. Any bet that requires you to pay more than this stated method will result in a game favorable to the house.

Why do you have to pay $15 if you count 3 dollars? Because that exactly cancels out the benefit you get by knowing that "3 always means switch boxes". In all other instances, you gain no knowledge by counting the money in the box, contrary to what has been previously stated.

[edit]I did not state the cost to play for d >3 correctly in the original posting.[/edit]

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 23rd, 2004, 3:32pm
Well, in retrospect, now that I look at it, I don't think my previous post adequately creates a "fair game", but I'll leave it there. What would be a "fair" bet in this game, any ideas?

Using rmsgrey's view that he would expect to find 11n/9 in the other box (where n is the amount found in the opened box), would you be willing to always pay 11n/9 as a fair cost for the game?

Title: Re: Boxing Clever
Post by Grimbal on Jun 23rd, 2004, 5:56pm
I thought of something that might help explaining the paradox.

Logically, it doesn't matter whether you switch or not before you know what is inside one box.  But then, it seems that once you have looked, it is always good to switch, whatever you have seen.  How can it be?

After you have looked, the gain of switching is always positive.  But from there, to conclude that it is good to switch without knowledge of what is inside the box, you must average all these positive terms.  And since the sum does not converge, it is meaningless.  It is like computing 1-2+3-4+5-6+... by grouping terms by pairs.  You could say: sum = 1+(-2+3)+(-4+5)+... = 1+1+1+... = infinity, or you could say: sum = (1-2)+(3-4)+(5-6)+... = (-1)+(-1)+(-1)+... = -infinity.

In short, it is quite possible that with knowledge of the value inside one box it is always interesting to switch, while without that knowledge, the value of switching is undefined.

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 23rd, 2004, 8:31pm
Warning, long post ahead, but I think I have the best strategy for this riddle now:

Grimbal, I think that series you are talking about is the one I've been playing around with all day myself. It is quite an interesting series of fractions.

I went back to the math of the problem, which is really not my strength, since I've had little math training beyond a few courses in college.

I realized that for any pair put into the boxes, (x, 3x), the expected outcome of picking randomly between the 2 boxes is 1/2 (x + 3x) = 2x. It therefore makes sense that a fair price to play this game would simply be to pick your box, count the money inside if you'd like, then keep the box you picked or change to the other, but then have to pay 2 times whichever box had the lower amount once both boxes have been opened. You cannot express the fair amount to pay for the game in terms of the box that you open, but only in terms of the lowest amount in the 2 boxes that are being chosen from.

Now if you don't gain any information after opening the one box and counting the money inside, then the gain from either switching or keeping the box you counted is zero.

If you gain information, such as "I know for sure I am looking at the highest or lowest value possible", then that will affect your decision whether to switch or keep.

In this game, when you see "$3", you know you are looking at lowest value possible, and you switch. When you see anything other than "$3" you gain no information about whether you are looking at lower or higher value of the 2 boxes. It makes no difference if you switch or keep.

However, when running computer simulations, I found a definite advantage to "switch when you see 3, keep otherwise". So why?

The reason is simple. If you limit the number of trials to a finite number, you are in essence limiting the maximum value of n, which is the number of flips until you see heads. Try it. If you are running 10[sup]9[/sup] trials, you probably will never see any n>30; maybe once you will see an n=31. Whatever the upper limit was, in your finite number of trials, it turns out that if you know you are looking at the highest possible amount in the box, then you should keep it. How do you know when you are looking at the highest dollar amount? Simple: always keep anything other than the known lowest amount.

It also turns out that the higher the value of n, the higher the amount in the box, which increases exponentially. These high values of n tend to skew the total results completely.

Therefore, if you only have time to play the game a finite number of times, the optimal strategy is to always switch when you see the lowest possible amount, and always keep the box you have otherwise.

Of course, all this is just theoretical. Perhaps someone can run a simulation with 10^9 trials again, each time, the cost of the game is 2 times the lower of the 2 amounts placed in the boxes, and you get to either keep the amount in the box you chose or keep the amount in the other box. This should verify that "switch if 3, keep otherwise" is the best strategy for the finite game. For the infinite game, "switch if 3, doesn't matter otherwise" will be correct.
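Rather than 10^9 random trials (the payoff variance here is infinite, so empirical averages converge very slowly), the expectation can be computed exactly with fractions. A sketch, assuming the rules above (you pay twice the smaller amount; function and strategy names are mine):

```python
from fractions import Fraction

def expected_net(strategy, max_n=60):
    """Exact expected net payoff when you pay twice the smaller amount.
    Truncated at max_n flips; for these strategies every pair beyond
    (3, 9) contributes exactly zero, so the truncation is harmless."""
    total = Fraction(0)
    for n in range(1, max_n + 1):
        p = Fraction(1, 2**n)              # P(heads first appears on flip n)
        low, high = 3**n, 3**(n + 1)
        for opened, other in [(low, high), (high, low)]:
            keep = other if strategy(opened) else opened
            total += p * Fraction(1, 2) * (keep - 2 * low)
    return total

switch_on_3   = lambda d: d == 3
never_switch  = lambda d: False
always_switch = lambda d: True
```

This gives exactly 3/2 for "switch if 3, keep otherwise" and exactly 0 for both "never switch" and "always switch", matching the pair-by-pair reasoning above.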

Title: Re: Boxing Clever
Post by SWF on Jun 23rd, 2004, 8:33pm
The infinite expectation is due to infinite payout for an event with probability zero. Suppose you change the rules a little for picking n: instead of flipping forever, if you don't have heads by toss number M, set n to M and give up on getting heads. With that rule modification everything is finite and the expectation (prior to looking in any box) for the various strategies is:

For never (or always) switching after looking in the first box:
E = 8*(3/2)[sup]M[/sup] - 6

For switching only if the first box has $3:
E = 8*(3/2)[sup]M[/sup] - 4.5      (exception: E=9 when M=1)

For always switching except when the first box contains $3[sup]M+1[/sup]:
E = 10*(3/2)[sup]M[/sup] - 6

Always switching except when you have the max possible is the best strategy. When M goes to infinity (to approach the original game), all strategies have infinite expectation (even if you don't switch on $3). But especially if playing a finite number of times I'd use the switching strategy, and always switch unless the first box had an infinite prize. That last case of not switching when the prize is maximum is what seems to be causing the confusion as M goes to infinity.

If the first box opened contained an infinite prize, the other one should too, but why risk switching-- a bird in hand is worth three in the bush.
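These three formulas can be checked by exact enumeration over the capped flip count. A sketch (function and strategy names are mine):

```python
from fractions import Fraction

def expectation(M, strategy):
    """Exact expected prize when the coin is capped at M tosses:
    P(n) = 2^-n for n < M, and n = M absorbs the remaining 2^-(M-1)."""
    E = Fraction(0)
    for n in range(1, M + 1):
        p = Fraction(1, 2**n) if n < M else Fraction(1, 2**(M - 1))
        for opened, other in [(3**n, 3**(n + 1)), (3**(n + 1), 3**n)]:
            E += p * Fraction(1, 2) * (other if strategy(opened, M) else opened)
    return E

never   = lambda d, M: False                 # never switch (= always switch)
on_3    = lambda d, M: d == 3                # switch only on $3
not_max = lambda d, M: d != 3**(M + 1)       # switch unless holding the max

for M in range(2, 12):
    assert expectation(M, never)   == 8 * Fraction(3, 2)**M - 6
    assert expectation(M, on_3)    == 8 * Fraction(3, 2)**M - Fraction(9, 2)
    assert expectation(M, not_max) == 10 * Fraction(3, 2)**M - 6
```

The M=1 exception also checks out: with only the (3,9) pair possible, switching on $3 and keeping $9 yields exactly 9 either way.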

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 23rd, 2004, 9:23pm
I don't doubt that changing the rules of the game (limiting the highest value of n) will change the optimal strategy.

Going back to the original game, based on my previous post: for the infinite game, a fair price to play the game would be ($1.50 plus double the amount in the smaller box of the 2 you are presented.)

Examine the following. What is the expected outcome of the game if you have to pay double the amount in the smaller box and keep whichever box you choose? The optimal strategy is to switch if you see $3 and keep the box otherwise. Therefore the expected outcome would be:

P(box 3,9)E(3,9) + P(box 9,27)E(9,27) + P(box 27,81)E(27,81) ...

which is equal to:
1/2 (3 + 3)/2 + 1/4 (-9 + 9)/2 + 1/8 (-27 + 27)/2 ...

which is equal to
3/2 + 0 + 0 ... = 3/2

Therefore if you pay $1.50 to play the game, you break even over an infinite number of games. [edit]Forgot a factor of one half; put this back in the equation.[/edit]

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 24th, 2004, 5:22am
I am going to be out of town for a few days, so I won't get to see the replies on this thread until then. But I will make one final post before going:

I am sure that once the coin has been flipped n times and the boxes are filled with money, the only time the choice of switching or keeping your selected box becomes relevant is when you are given the (3,9) pair of boxes and you use the strategy "switch with 3, else keep your box". That is the only pair of boxes where the expected outcome (if the cost of the game is 2 times the lower box) is +3. All other pairs of boxes that you are given have zero gain from switching or keeping, so always keeping any box that you pick gives you no disadvantage for any box pair and gives you an advantage with the (3,9) box pair.

Try running the following computer simulation:

First, generate a box pair by flipping the coins and fixing the amounts in the box pair.

Second, run a large number of trials at this point where you: randomly choose one of the 2 boxes; then apply whatever algorithm (such as "always switch other than 3", or "always keep other than 3"), based on knowing what amount is in the box you chose, to decide which box you will keep; and last, you keep the amount in the box you chose and have to pay 2 times the amount in whichever box had the smaller amount, except that if the smaller amount in the 2 boxes is "3" then you have to pay $9. Using this method, you will see that the outcome for any given pair of boxes (once fixed), given this cost of playing the game, is zero.

Title: Re: Boxing Clever
Post by rmsgrey on Jun 24th, 2004, 9:51am
It's not in contention that the game just described has no benefit to switching except in the (3,9) case, when switching for the 9 loses the gain from knowing to switch with the 3. But the game just described isn't the original game in the thread.

Try the following procedure instead:

1) Randomly generate a pair of boxes.
2) Pick one and remember the value inside.
3) Randomly generate a large number of pairs of boxes and pick one of each pair.
4) Discard all pairs where the one you picked wasn't the same as the value you're remembering (those situations you know differ from your current attempt).
5) You now have a (large) number of pairs of boxes where you opened one to find the same value inside (representing the set of possible situations you could be in - with a large amount of redundancy to minimise the effect of the random generation). Test each strategy with this sample.

You should find that switching always gains.
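The procedure above can be sketched directly in Python (names are mine; here the remembered value is $9):

```python
import random

rng = random.Random(1)

def flip_pair(rng):
    # flip a fair coin until heads; the boxes then hold 3^n and 3^(n+1)
    n = 1
    while rng.random() < 0.5:
        n += 1
    return 3**n, 3**(n + 1)

target = 9                      # the value you remember seeing in your box
others = []
for _ in range(40_000):
    low, high = flip_pair(rng)
    opened, other = (low, high) if rng.random() < 0.5 else (high, low)
    if opened == target:        # step 4: keep only runs matching what you saw
        others.append(other)

avg_if_switch = sum(others) / len(others)   # keeping always yields exactly 9
```

On the conditioned sample the other box averages about 2/3 * 3 + 1/3 * 27 = 11 > 9, so switching gains, as claimed.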

Title: Re: Boxing Clever
Post by Three Hands on Jun 24th, 2004, 3:48pm
Just a thought from a non-mathematical kind of person, but shouldn't some consideration of the probability of the box you open containing the larger value be factored in? After all, it becomes increasingly unlikely, with higher amounts opened in your box, that the other box contains a larger amount. Just guessing, but I would think a lower value would be twice as likely as a higher value.

OK, so you get (3,9) half the time, (9,27) 1/4 of the time, and (27,81) 1/8 of the time. If you see 3, then you will always gain by switching. If you see 9, you lose six 2/3 of the time, and gain eighteen 1/3 of the time. So over three times, you get an expected gain of 6. If you see 27, then you lose eighteen 2/3 of the time, but gain fifty-four 1/3 of the time. So over three times of switching, the expected gain is 18. If you never switch, then the results are the opposite. However, if you are only playing this once, you might want to alter your decisions, given that, other than the case of opening a 3, you are more likely to lose than gain.

So that would be a case of, if you play this game several times, switching each time will generally let you gain. However, for one-off gambles, you seem to be better off sticking with anything other than a 3, in terms of how likely it is you will gain rather than lose - even though you avoid the chance of possibly winning much more than you would expect to lose. I guess it comes down to how much you want to get out of the game in the end, though.

Title: Re: Boxing Clever
Post by SWF on Jun 25th, 2004, 5:17pm
For anyone who thinks you should switch on $3 but not otherwise, why not switch on $9 too:

Compare the basic strategy (call it S) of never switching (same as always switching) to the strategy, S[sub]M[/sub], which is to switch if the first box contains $3[sup]M[/sup] or less, but not otherwise.

The probability of ending up with any given prize amount other than $3[sup]M[/sup] or $3[sup]M+1[/sup] is the same for S and S[sub]M[/sub]. The only difference between the two is in the case with probability 1/2[sup]M+1[/sup], where instead of S giving you $3[sup]M[/sup], S[sub]M[/sub] gets you $3[sup]M+1[/sup]. Therefore, on average, the S[sub]M[/sub] approach is better than S by (3[sup]M+1[/sup]-3[sup]M[/sup])/2[sup]M+1[/sup] = (3/2)[sup]M[/sup].
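This telescoping argument can be verified exactly: every pair strictly below the cut-off cancels, and only the boundary term survives. A sketch (names are my own choosing):

```python
from fractions import Fraction

def gain_of_SM_over_S(M, max_n=80):
    """Exact E[prize(S_M) - prize(S)], where S never switches and S_M
    switches iff the opened box holds at most 3^M."""
    diff = Fraction(0)
    for n in range(1, max_n + 1):
        p = Fraction(1, 2**n)
        low, high = 3**n, 3**(n + 1)
        if low <= 3**M:             # opened the low box and S_M switches
            diff += p * Fraction(1, 2) * (high - low)
        if high <= 3**M:            # opened the high box and S_M switches
            diff += p * Fraction(1, 2) * (low - high)
    return diff
```

For pairs with n < M both branches fire and cancel; for n > M neither fires; only n = M contributes, giving exactly (3/2)[sup]M[/sup].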

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 26th, 2004, 6:20pm

on 06/24/04 at 09:51:01, rmsgrey wrote:
It's not in contention that the game just described has no benefit to switching except in the (3,9) case, when switching for the 9 loses the gain from knowing to switch with the 3. But the game just described isn't the original game in the thread.

Try the following procedure instead:

1) Randomly generate a pair of boxes.
2) Pick one and remember the value inside.
3) Randomly generate a large number of pairs of boxes and pick one of each pair.
4) Discard all pairs where the one you picked wasn't the same as the value you're remembering (those situations you know differ from your current attempt).
5) You now have a (large) number of pairs of boxes where you opened one to find the same value inside (representing the set of possible situations you could be in - with a large amount of redundancy to minimise the effect of the random generation). Test each strategy with this sample.

You should find that switching always gains.

Very true. I likewise don't disagree with that way of playing the game and the benefit of always switching in that case.

Title: Re: Boxing Clever
Post by Leon on Jun 29th, 2004, 5:40pm
Regarding the computer simulations, I think they do have a place. If you ran it such that you disregard the actual $ amounts, and instead work with points like so:

switch and gain, you receive +2 points (since you went from x to 3x)
switch and lose, you receive -2/3 points (since you went from y to y/3)

This will mitigate the large swings due to the exponential and show the expected return for all items other than when you open $3.

You don't even need to run the not-switching strategy since it would be 0 total points (got z and kept z = no gain).

[edit] alternatively, depending on how it is coded already (if Grimbal and pedro still have the simulations), it may be easier to just divide the gain or loss by the amount in the box. This will give the same end result (+2 or -2/3). [/edit]

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 29th, 2004, 5:53pm
In that case, you don't even need to run the computer simulation for "always switching" either. It should be obvious that in an infinite number of runs, each starting from a fresh coin flip...

Calculate the gain from switching...

The combination (3,9) will come up exactly 1/2 of the time. Then pick a box and open it.
 1. Half of these will be 3, switch to 9, gain is +6.
 2. Half of these will be 9, switch to 3, gain is -6.
for a total of +0 for this combination whenever it occurs.

The combination (9,27) will come up exactly 1/4 of the time. Then pick a box and open it.
 1. Half of these will be 9, switch to 27, gain is +18.
 2. Half of these will be 27, switch to 9, gain is -18.
for a total of +0 for this combination whenever it occurs.

The combination (27,81) will come up exactly 1/8 of the time. Then pick a box and open it.
 1. Half of these will be 27, switch to 81, gain is +54.
 2. Half of these will be 81, switch to 27, gain is -54.
for a total of +0 for this combination whenever it occurs.

...etc, etc, etc, so that total gain from always switching for all subsets of combinations will combine to zero.

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 29th, 2004, 6:00pm
And if you want to use just a "point" system, I wouldn't use +2 or -2/3 points, I would instead label each box as either x or 3x, you get one, then...

If you switch from 3x to x, you lose 2x from switching.
If you switch from x to 3x, you gain 2x from switching.

And since you will get the x box half the time and the 3x box half the time, your total expected gain is zero from switching.

Title: Re: Boxing Clever
Post by Leon on Jun 29th, 2004, 6:47pm

on 06/29/04 at 17:53:21, pedronunezmd wrote:
In that case, you don't even need to run the computer simulation for "always switching" either. It should be obvious that in an infinite number of runs, each starting from a fresh coin flip...

Calculate the gain from switching...

The combination (3,9) will come up exactly 1/2 of the time. Then pick a box and open it.
 1. Half of these will be 3, switch to 9, gain is +6.
 2. Half of these will be 9, switch to 3, gain is -6.
for a total of +0 for this combination whenever it occurs.

The combination (9,27) will come up exactly 1/4 of the time. Then pick a box and open it.
 1. Half of these will be 9, switch to 27, gain is +18.
 2. Half of these will be 27, switch to 9, gain is -18.
for a total of +0 for this combination whenever it occurs.

The combination (27,81) will come up exactly 1/8 of the time. Then pick a box and open it.
 1. Half of these will be 27, switch to 81, gain is +54.
 2. Half of these will be 81, switch to 27, gain is -54.
for a total of +0 for this combination whenever it occurs.

...etc, etc, etc, so that total gain from always switching for all subsets of combinations will combine to zero.


Exactly. (9,27) will come up 25% of the time. (27,81) will come up 12.5% of the time.

So, is it true that when you open a box with $27 in it, having one of [9, 27] is twice as likely as having one of [27, 81]?

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 29th, 2004, 6:56pm

on 06/29/04 at 18:47:17, Leon wrote:
Exactly. 9, 27 will come up 25% of the time. 27, 81 will come up 12.5% of time.

So, is it true that when you open a box with $27 in it, having one of [9, 27] is twice as likely as having one of [27, 81]?

Yes, if you were allowed to play the game and ignore every instance other than when you opened the box and found $27, the strategy of always switching would be best.

For the record, I am still not convinced which is the right answer; I am stuck between "always switch with 3, keep otherwise" and "always switch with 3 and some other specific numbers as well, keep otherwise". However, I am (fairly) convinced that "always switch" is not the right answer and will leave you with a net gain from switching of zero.

Title: Re: Boxing Clever
Post by Leon on Jun 29th, 2004, 7:44pm

on 06/29/04 at 18:56:30, pedronunezmd wrote:
Yes, if you were allowed to play the game and ignore every instance other than when you opened the box and found $27, the strategy of always switching would be best.


Since each time (if T&B will even let you play more than once) is independent, why is playing it once and finding $27 any different from playing it again and finding $81? Or $9? Or $243? Or $59,049*?

If you switch on $27, then why not always switch?

* Personal utility of $59K (i.e. Boxster) aside of course.

Title: Re: Boxing Clever
Post by rmsgrey on Jun 30th, 2004, 6:51am

on 06/29/04 at 17:53:21, pedronunezmd wrote:
In that case, you don't even need to run the computer simulation for "always switching" either. It should be obvious that in an infinite number of runs, each starting from a fresh coin flip...

Calculate the gain from switching...

The combination (3,9) will come up exactly 1/2 of the time. Then pick a box and open it.
 1. Half of these will be 3, switch to 9, gain is +6.
 2. Half of these will be 9, switch to 3, gain is -6.
for a total of +0 for this combination whenever it occurs.

The combination (9,27) will come up exactly 1/4 of the time. Then pick a box and open it.
 1. Half of these will be 9, switch to 27, gain is +18.
 2. Half of these will be 27, switch to 9, gain is -18.
for a total of +0 for this combination whenever it occurs.

The combination (27,81) will come up exactly 1/8 of the time. Then pick a box and open it.
 1. Half of these will be 27, switch to 81, gain is +54.
 2. Half of these will be 81, switch to 27, gain is -54.
for a total of +0 for this combination whenever it occurs.

...etc, etc, etc, so that total gain from always switching for all subsets of combinations will combine to zero.


I calculate my returns based on the following:

1/4 I open a 3 (from 3,9) - return +6

3/8 I open a 9
from (3,9) 2/8 - return -6
from (9,27) 1/8 - return +18
net +6

3/16 I open a 27
from (9,27) 2/16 - return -18
from (27,81) 1/16 - return +54
net +18

3/32 I open an 81
from (27,81) 2/32 - return -54
from (81,243) 1/32 - return +162
net +54

etc

Since each number I open gives me a positive return from switching, my return from always switching must be positive.

As a quick comparison, (1-1)+(1-1)+... = 0 [ne] 1 = 1+(-1+1)+(-1+1)+...  What we appear to be doing is summing an infinite series which doesn't converge absolutely - while it's fun combining the terms in different clusters, the different arrangements aren't equivalent, so the real question is which is the correct evaluation of the actual situation...
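The rearrangement can be made concrete: the same probability-weighted terms, grouped by pair (every group zero) or by the value opened (every group positive). A sketch:

```python
from fractions import Fraction

def term(n, opened_low):
    """Probability-weighted gain from switching for one elementary outcome:
    the pair is (3^n, 3^(n+1)) and you opened the low (or high) box."""
    p = Fraction(1, 2**(n + 1))            # P(pair n) * P(picking that box)
    gain = 2 * 3**n if opened_low else -2 * 3**n
    return p * gain

# grouping by pair: each pair's two outcomes cancel exactly
pair_groups = [term(n, True) + term(n, False) for n in range(1, 30)]

# grouping by the value you open: each group comes out positive
def value_group(a):
    if a == 1:                             # a 3 can only come from pair (3,9)
        return term(1, True)
    return term(a - 1, False) + term(a, True)

value_groups = [value_group(a) for a in range(1, 30)]
```

The pair groups are all exactly 0, while the value groups are all positive and their partial sums grow without bound: the same conditionally convergent (in fact divergent) series rearranged two ways, exactly the 1-2+3-4 phenomenon above.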

Title: Re: Boxing Clever
Post by pedronunezmd on Jun 30th, 2004, 7:20pm
I have about 3 pages of notes where I have been trying to determine why there appear to be two approaches to calculating the expected gain from switching, including the two series' sums as recently pointed out. The problem with your method, rmsgrey, is that you are combining the expected gains of 2 different "9"s, and 2 different "27"s, etc. I still can't explain it, but there cannot be 2 different answers to this riddle. I can't prove for certain that either way is the right way to look at it, so it will take someone a lot better at math than me to prove which way is right.

For the computer simulations out there: you might want to check something out. I started looking at my "random" distributions somewhat. If n is the number of coin flips until you get heads, try generating say 10[sup]9[/sup] trials of n and look at the distribution... I get the correct expected numbers of n (such as 1/2 of all trials being n=1, 1/4 of all trials being n=2, etc) all the way to about n=16, then the distribution starts looking very funky; for example, the numbers of n=20, n=21, and n=22 are all equal, and the numbers of n=23, n=24, and n=25 are all equal, and no matter how many trials I run, I can never get beyond n=25, even though by probability I should have had a near 100% chance at that point of having at least one n=26...
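Symptoms like that (clean out to roughly n=16, garbage beyond) are what you would see if each flip is derived from a low-resolution generator, for instance a 15-bit rand(); that is a guess, not a diagnosis of the actual code. Drawing one random bit per flip sidesteps resolution limits entirely; a Python sketch:

```python
import random

def flips_until_heads(rng):
    # one unbiased bit per flip: no floating-point thresholds involved
    n = 1
    while rng.getrandbits(1) == 0:
        n += 1
    return n

rng = random.Random(123)
counts = {}
for _ in range(200_000):
    n = flips_until_heads(rng)
    counts[n] = counts.get(n, 0) + 1

# expect counts[1] near 100000, counts[2] near 50000, and a tail that
# keeps halving rather than flattening out
```

With 200,000 trials this gives the geometric distribution cleanly well past n=13, so if a simulation's tail dies at n=25 the suspect is the flip-generation step, not the sample size.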

Title: Re: Boxing Clever
Post by rmsgrey on Jul 1st, 2004, 5:15am

on 06/30/04 at 19:20:29, pedronunezmd wrote:
I have about 3 pages of notes where I have been trying to determine why there appears to be the two approaches to calculating the expected gain from switching, including the 2 series' sums as recently pointed out. The problem with your method, rmsgrey, is that you are combining the expected gains of 2 different "9"s, and 2 different "27"s, etc... I still can't explain it, but there cannot be 2 different answers to this riddle, I can't for certain prove that either way is the right way to look at it, so it will take someone a lot better at math than me to prove which way is right.

Having opened a box, I know that I'm looking at a 9 or a 27 or whatever, but can't tell the two types of 9 apart or the two types of 27. Your sum, on the other hand, combines terms from two situations I can tell apart, but splits situations I can't distinguish with available information.

[e]
The question seems to be whether you should base your calculations on the situation before you open the box (in which case you gain nothing by always switching) or the situation after you open the box (in which case you expect to gain by switching). If you follow SWF's analysis above, then always switching for values up to a given value will gain under either analysis, and gain more the larger your upper limit becomes...
[/e]

Title: Re: Boxing Clever
Post by THUDandBLUNDER on Jul 1st, 2004, 8:42am

Quote:
I get the correct expected numbers of n (such as 1/2 of all trials being n=1, 1/4 of all trials being n=2, etc) all the way to about n=16, then the distribution starts looking very funky, for example the number of n=20, n=21, and n=22 all are equal,

What is the correct expected number? What distribution are you using?
For large n (eg 10[sup]9[/sup]) and small p (eg 2[sup]-20[/sup]) I think you should use a Poisson distribution.


Title: Re: Boxing Clever
Post by pedronunezmd on Jul 1st, 2004, 10:17am

on 07/01/04 at 08:42:09, THUDandBLUNDER wrote:
What is the correct expected number? What distribution are you using? For large n (eg 10[sup]9[/sup]) and small p (eg 2[sup]-20[/sup]) you should use a Poisson distribution.

I would assume that, for example, the number of times n=22 comes up would be twice the number of times n=23 comes up. Is that not correct?

Title: Re: Boxing Clever
Post by pedronunezmd on Jul 1st, 2004, 10:31am

on 07/01/04 at 05:15:22, rmsgrey wrote:
The question seems to be whether you should base your calculations on the situation before you open the box (in which case you gain nothing by always switching) or the situation after you open the box (in which case you expect to gain by switching). If you follow SWF's analysis above, then always switching for values up to a given value will gain under either analysis, and gain more the larger your upper limit becomes...

If you are allowed to play the game an infinite number of times (or an extremely large number of times) then there is a gain to "switching up to a certain value and keeping after that value" although you might actually gain more, for example, by switching every other number up to a certain value.

However, I (still) don't see how "always" switching will lead to a positive gain if played an infinite number of times starting with the coin flips each time. This should be logical and obvious from an intuitive sense but still lacks a mathematical proof that this is true, at least as far as this thread has shown.

Title: Re: Boxing Clever
Post by rmsgrey on Jul 1st, 2004, 11:13am
I don't think there's any problems or arguments about the mathematics involved. Rather the two arguments arise from different ways of modelling the problem, so the question is not which argument is correct, but which model most accurately represents the situation described.

How about a finite version? Toss the coin as described, but if it comes up tails twice, discard the run and begin again. Ignoring the discarded results, over 6 runs, you'd expect to get:
(3,9)
(3,9)
(3,9)
(3,9)
(9,27)
(9,27)

The first two cases can be ignored - they are played the same way in both strategies. The "always switch" method is for a game with no maximum, so in this version the last case (open the maximum value) wouldn't be switched either, so can be ignored. The only cases where the two arguments produce different strategies for this game are those where a 9 is opened - where switching loses 6 twice, but gains 18 the third time for a net gain of 6 over the 6 games.

If you extend the game to discard 3 tails rather than 2, then, over 14 games, 4 open a 3, 6 open a 9 (net gain 12), 3 open a 27 (net gain 18) and one opens an 81 (known maximum, so no loss). Total net gain of 30 over 14 games.

Similarly, if you imagine a known cut-off at, say, one million tails, provided you decline to switch if you open 3[sup]1,000,001[/sup], you still expect to gain on all the other switches.

If you stop switching at any value less than the game's actual maximum, then you gain precisely as much over the "3-only" strategy as if you were playing the game with maximum at your cut-off point. So if you pick an arbitrary, large cut-off point for the infinite game, then you will have an expected gain over "3-only" from switching up to that point as though you were playing the finite game with that maximum.

Title: Re: Boxing Clever
Post by pedronunezmd on Jul 1st, 2004, 4:28pm
Well, you can sum the expectations as follows for the "always switch up to the point where you get to a set maximum"...

For the pair (3,9) always switching gives you gain of zero.
For the pair (9,27) always switching gives you gain of zero.
...
For the pair (3[sup]20[/sup],3[sup]21[/sup]) only switch with the 3[sup]20[/sup] but keep at the (arbitrarily chosen) 3[sup]21[/sup] for a gain.
...
And all other pairs back to zero gain.

Therefore, you only get the "gain" realized when you reach the arbitrarily large n, and all other pairs of (n,3n) you make no gain.

Title: Re: Boxing Clever
Post by pedronunezmd on Jul 1st, 2004, 5:27pm
And by the way, in your "finite game" example just posted, the reason you are finding a gain from switching is because you are allowing yourself to get to "kee

Title: Re: Boxing Clever
Post by pedronunezmd

Version 1. Correct version.
Expected gain always switching = [sub]a=1[/sub][sum][sup]infinity[/sup] P(pair = (3[sup]a[/sup], 3[sup]a+1[/sup])) * E (switching for that pair)
... where E (switching for that pair) = .5 (3[sup]a[/sup] becomes 3[sup]a+1[/sup]) + .5 (3[sup]a+1[/sup] becomes 3[sup]a[/sup])
... which = .5 (+2 * 3[sup]a[/sup]) + .5 (-2 * 3[sup]a[/sup]) = 0
Therefore expected gain always switching = sum of zeros = 0.

Version 2. Incorrect version.
Expected gain always switching =  [sub]a=1[/sub][sum][sup]infinity[/sup] P(box opened = 3[sup]a[/sup]) * E (switching when opened box = 3[sup]a[/sup])
... where E (switching when opened box = 3[sup]a[/sup]) = 2/3 (3[sup]a[/sup] becomes 3[sup]a-1[/sup]) + 1/3 (3[sup]a[/sup] becomes 3[sup]a+1[/sup])
... (except for special case a=1 when E(switching) = +6 always)
... which = 2/3 ( -2/3 * 3[sup]a[/sup]) + 1/3 ( +2 * 3[sup]a[/sup]) = 2/9 * 3[sup]a[/sup]
Therefore expected gain is sum of infinite number of positive numbers > 0.

[edit]Anyone know which version is flawed?  ???[/edit]


Title: Re: Boxing Clever
Post by rmsgrey on Jul 6th, 2004, 5:26am
The reason for summing over the value opened rather than the number of coins flipped is that that's the point at which you choose your strategy. You're not choosing to switch or not based on the number of coins flipped, but on the contents of the opened box. So that's the value you condition on in the infinite sum.



Powered by YaBB 1 Gold - SP 1.4!
Forum software copyright © 2000-2004 Yet another Bulletin Board