Card Betting

Continuing my plan of posting the puzzles I found in the paper "Games People Don't Play", I have a card flipping puzzle this time. Back when I started this blog, I posted a card flipping problem that Bart had told me, and the puzzle this time is something of a continuation of that one:
There is a deck of 52 cards, 26 red and 26 black. You will be playing a game where you bet on the color of the next card in the deck. You begin with $1 and can bet any real number between zero and the amount of money you have on whether the next card is red or black. The top card of the deck is then revealed; if you were correct you gain the amount you bet, and if you were incorrect you lose it. The game then continues with a bet on the next card, and ends when there are no cards remaining in the deck to bet on. What is the optimal betting strategy, assuming that the deck is being arranged by an adversary?
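If you want to experiment before committing to an answer, here is a minimal Python sketch of the game (the strategy interface, a function of your bankroll and the remaining counts, is just my own convention, and the naive strategy shown is only a baseline, not the answer):

from fractions import Fraction

def play(deck, strategy):
    """Run the betting game against a fixed deck.

    deck     -- a sequence of 'R'/'B' characters (26 of each in the real game)
    strategy -- a function (money, reds_left, blacks_left) -> (color, bet)
                with 0 <= bet <= money
    Returns the final bankroll.
    """
    money = Fraction(1)
    reds = sum(1 for c in deck if c == 'R')
    blacks = len(deck) - reds
    for card in deck:
        color, bet = strategy(money, reds, blacks)
        assert 0 <= bet <= money
        money += bet if card == color else -bet
        if card == 'R':
            reds -= 1
        else:
            blacks -= 1
    return money

# A naive baseline: bet nothing until only one color remains, then bet
# everything on each remaining card (at that point every card is known).
def wait_for_certainty(money, reds, blacks):
    if reds == 0:
        return ('B', money)
    if blacks == 0:
        return ('R', money)
    return ('R', Fraction(0))

print(play('RB' * 26, wait_for_certainty))   # here only the last card is ever certain, so this ends at 2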

I really like this sort of puzzle, the kind that seems flat out impossible to do anything with at first. Putting the player up against an adversary seems like a pretty insurmountable obstacle, but you still have some wiggle room.

Alice Wins?

I guess I should put up the solution to the second two numbers problem. The fact that I ask for an equilibrium strategy gives something away, I suppose, because in the first one there is no equilibrium strategy (Alice can make Bob's winning chance 1/2+ε for any positive ε, so there is no optimal strategy). To begin, what is the strategy space for Bob? Bob sees a number, x, and must decide if that number is the larger or the smaller based only on that number. All Bob can do is select a function f(x) giving the probability that he says x is the larger number. The solution to the last puzzle strongly suggests that f(x) should be a strictly increasing function, but it is worth noting that if f(x) is simply 1/2 everywhere, then Bob is certain to get 50% in this game, so there is no way Alice can push him lower than that.

Suppose Alice is given f(x) (Bob's strategy) and is given the two numbers y and z from (0,1) (we may as well assume z>y). Now, if Alice chooses to show Bob y, Bob has a 1-f(y) chance to win, and if Alice chooses to show z, Bob has an f(z) chance to win. Thus Bob's chance to win (when Alice knows f(x)) is given as:
min(1-f(y),f(z))

Now we must integrate this over the whole space of y and z. The natural integral would be ∫ ∫ dy dz with both going over (0,1), but this does not work because we have already assumed z is the larger of the two numbers. We should simply restrict the integral to z ∈ (0,1) and y ∈ (0,z) and multiply by 2 to take care of the other half of the region where y happens to be larger. Bob's chance to win is given as
2 ∫ ∫ min(1-f(y),f(z)) dy dz

with y from 0 to z and z from 0 to 1.

The next thing to know is that Bob is trying to maximize this integral. We must select an f(x) to make this as large as possible. Remember we already know that with f(x)=1/2, we trivially get 1/2 for the answer, so really we need only ask if we can make it larger than 1/2. You can try some functions out for a while, but in the end you will see that nothing will integrate to more than 1/2.
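If you would rather try functions out numerically than by hand, here is a quick Python sketch that estimates the integral above with a midpoint rule (the test functions are arbitrary picks of mine):

def bob_win_chance(f, n=400):
    # Midpoint-rule estimate of 2 * integral over 0 < y < z < 1 of min(1-f(y), f(z))
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        z = (i + 0.5) * h
        for j in range(i):            # midpoints with y < z
            y = (j + 0.5) * h
            total += min(1 - f(y), f(z))
    return 2 * total * h * h

# A few candidate strategies for Bob; none of them gets above 1/2.
for name, f in [("f(x) = 1/2", lambda x: 0.5),
                ("f(x) = x  ", lambda x: x),
                ("f(x) = x^2", lambda x: x * x)]:
    print(name, round(bob_win_chance(f), 4))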

We can prove that Bob cannot get better than 1/2 by using two facts about the min and max functions:
1. min(a,b) + max(a,b) = a + b
2. min(a,b) ≤ max(a,b)

Now we define
P=2 ∫ ∫ min(1-f(y),f(z)) dy dz
Q=2 ∫ ∫ max(1-f(y),f(z)) dy dz

integrated over y from 0 to z and z from 0 to 1 as before. We seek to prove that P ≤ 1/2 for all choices of f.

Fact 2 from before shows us right away that P ≤ Q, and fact 1 gives us
P+Q=2 ∫ ∫ (1-f(y)+f(z)) dy dz
=2 (∫ ∫ 1 dy dz - ∫ ∫ f(y) dy dz + ∫ ∫ f(z) dy dz)

The first integral is simply 1/2 (the area of integration) and in the third integral you can do the y integral separately to get ∫ zf(z) dz integrated from 0 to 1. For the second integral we simply rewrite the integration region from y ∈ (0,z) and z ∈ (0,1) to z ∈ (y,1) and y ∈ (0,1) so we can do the z integral first to get ∫ (1-y)f(y) dy integrated from 0 to 1. We currently have
P+Q=1 - ∫ (1-y)f(y)dy + ∫ zf(z)dz

But using the substitution z=1-y the two integrals are trivially identical for any f(x). Therefore,
P+Q=1

Now since P ≤ Q we have (by adding P to both sides) 2P ≤ P+Q so 2P ≤ 1 therefore P ≤ 1/2 as we sought.

OK great, so if Alice knows Bob's f(x), there is no way Bob can choose an f(x) that guarantees anything better than 1/2. So if there is an equilibrium strategy, it must be 50/50. But is there one? Can we construct a strategy for Alice that makes sure Bob cannot get better than 1/2 even if we have no idea what f(x) he chose? We can, and it's actually what one would instinctively do if one were playing this game.

Suppose you are Alice and you are handed the numbers 0.1 and 0.5; which one would you show Bob? The obvious choice is 0.5, since 0.1 looks too obviously like the low one. How about 0.8 and 0.4? Less obvious, but it would seem the rational choice is 0.4. How about 0.7 and 0.4? Again 0.4: 0.7 has less space above it than 0.4 has below it. How about 0.8 and 0.6? 0.6 is pretty clearly better. How about 0.7 and 0.3? Now it seems they are symmetric, with no good way to choose. How about 0.6 and 0.4? Again symmetric.

See what the strategy is? Just show Bob whichever number happens to be closest to 0.5. Now, if we follow this strategy, what is Bob's chance of winning? First let us shift the numbers down by 0.5 so that the two numbers, y and z, range from -1/2 to 1/2 and we show Bob whichever one is closer to zero.

Consider the y-z plane in the box y ∈ (-1/2,1/2) z ∈ (-1/2,1/2). The numbers we get select a point in this region. Which number is y and which is z doesn't matter here, select which is which at random.

Divide up the region into 4 pieces using the lines y=z (from bottom left to top right) and y=-z (from top left to bottom right). The 4 regions are "top" (where z>0 and y is between -z and z), "bottom" (where z<0 and y is between z and -z), "left" (where y<0 and z is between y and -y), and "right" (where y>0 and z is between -y and y). If our two numbers put the point in either "top" or "bottom" we show Bob y. If the point is in "left" or "right", we show Bob z.

Suppose we have a point in "top". We show Bob y and he has to guess that y is the smaller number. The chance of him doing so is 1-f(y). Integrating this over "top" gives the probability that the point lands in "top" and Bob wins; that is, the contribution of "top" to Bob's chance of winning is ∫ 1-f(y) integrated over "top". In "bottom" we show Bob y and he must guess it is the larger number. He does this with probability f(y), so the contribution of "bottom" is ∫ f(y) integrated over "bottom". Note that ∫ f(y) over "top" is the same as ∫ f(y) over "bottom": for each value of y, the slice of "top" and the slice of "bottom" have the same length in z (namely 1/2-|y|).

Using the same argument for "left" and "right", Bob's chance of winning is (∫ 1-f(y) over "top") + (∫ f(y) over "bottom") + (∫ 1-f(z) over "right") + (∫ f(z) over "left"). The integrals of the unknown function all cancel, leaving ∫ 1 over "top" + ∫ 1 over "right", giving 1/4 each (their areas) for a total of 1/2.

Simply put, if Alice shows Bob 0.3 (say), and Bob knows this number is the one that is closer to 1/2, then the other number is equally likely to be in (0,0.3) or (0.7,1) and Bob cannot tell those apart.

Thus if Alice follows the strategy "show Bob whichever number is closer to 1/2", then Bob cannot do better than 1/2 no matter what f(x) he chooses. If Bob uses the strategy f(x)=1/2 (so, "flip a coin") Bob cannot do worse than 1/2 no matter how Alice chooses the number to show. So this gives the equilibrium strategy (or, at least, an equilibrium strategy).
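If you want to see this in action, here is a small Monte Carlo sketch (the particular choices of f are arbitrary examples of mine): Alice always shows the number closer to 1/2, Bob says "larger" with probability f of what he sees, and the win rate sits at about 1/2 either way.

import random

def bob_rate(f, trials=200_000):
    # Alice shows whichever number is closer to 1/2; Bob says "larger"
    # with probability f(shown).  Returns Bob's empirical win rate.
    wins = 0
    for _ in range(trials):
        x, y = random.random(), random.random()
        if abs(x - 0.5) < abs(y - 0.5):
            shown, hidden = x, y
        else:
            shown, hidden = y, x
        says_larger = random.random() < f(shown)
        if says_larger == (shown > hidden):
            wins += 1
    return wins / trials

print(bob_rate(lambda x: x))        # a sensible increasing f: about 0.5
print(bob_rate(lambda x: x ** 3))   # a more lopsided choice: still about 0.5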

I find it funny that if you play this game with real people, the Alice player will typically hit on the correct strategy naively, even though they will then likely try to follow it up by "out-thinking" the other player, because that's just what humans do.

Two More Numbers

Alright, now that the two numbers puzzle is out of the way, time for the follow-up. I first learned this puzzle from a paper called "Games People Don't Play", which I found in the comments at the xkcd blag:
Two numbers are selected uniformly at random from the interval (0,1) and shown to Alice. Alice must select one of these two numbers and show it to Bob. Bob must then guess if Alice has shown him the larger of the two or the smaller of the two. What is the optimal (equilibrium) strategy for the players?

This time Alice does not get control of the two numbers, and Bob is aware of the distribution, but Alice gets to choose what number Bob sees, so who gains the advantage from those changes?

Only A Little

Time for the solution to the Two Numbers problem. I suppose I could just state the solution right away, but I like to do something of a derivation first.

First of all, consider the results of the following strategy for Bob:
If the number you see is greater than zero, guess that it is the larger one; if the number is less than zero, guess that it is the smaller one.

Easy enough to implement: if both numbers that Alice wrote down are positive, then you are still 50/50. If they are both negative, then you are again 50/50, but if one number was positive and one was negative, then you are guaranteed to win.

Alright, that's all well and good, but naturally if Alice knows you are doing this, she won't ever write down a positive and a negative number together, so this is not actually guaranteed to give you a better than 50% chance of winning the game. Now, instead of comparing to zero, why not compare to a random number K? Perform the following strategy:
Select a nonzero probability distribution f(z) over the reals. After Alice writes down the numbers, select a number K using f(z). If the number you see is greater than K, guess it is the larger one; if the number you see is smaller than K, guess it is the smaller one.

By a nonzero probability distribution I mean that f(z)>0 everywhere and ∫f(z)dz = 1 when integrated over all the reals. Basically, the strategy is to assume that the unseen number is near K.

Suppose Alice writes down x and y with y>x. If K<x or K>y you are simply 50/50, but if y>K>x you are guaranteed to win. Your chance of winning is therefore
∫f(z)dz+(1-∫f(z)dz)1/2
=1/2+1/2∫f(z)dz

Where the integral is taken from x to y (giving the chance that K lands between x and y). Since f(z)>0 everywhere, this is greater than 1/2 in all cases, so it is a valid solution. Naturally, if Alice knows f(z), she can make that integral as small as she likes, but she cannot make it nonpositive.
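Here is a quick simulation of this strategy, taking f to be a standard normal density (just one convenient everywhere-positive choice, nothing special about it):

import random

def win_rate(x, y, trials=200_000):
    # Bob draws K from a standard normal and guesses "larger" exactly
    # when the number he is shown exceeds K.
    assert x < y
    wins = 0
    for _ in range(trials):
        shown, hidden = (x, y) if random.random() < 0.5 else (y, x)
        K = random.gauss(0.0, 1.0)
        says_larger = shown > K
        if says_larger == (shown > hidden):
            wins += 1
    return wins / trials

print(win_rate(-0.5, 0.5))   # about 0.69: K lands between these two fairly often
print(win_rate(10.0, 10.1))  # still above 1/2 in principle, but the edge is far too small to see here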

One can make a slightly more general (but not actually any better) solution as well. Consider that Bob looks at the number before choosing K. Then the chance that K will be less than x (and thus that Bob guesses x is larger) is ∫f(z)dz integrated from -∞ to x. Let p(x) be that number (the probability that we say x is larger). Clearly p(x) is positive, increasing, 0 at -∞, and 1 at +∞.

Let Alice choose numbers x and y with y>x. Now the chance Bob wins, given some function p(x), is 1/2 (the chance he is given y) times p(y) (the chance he says y is larger), plus 1/2 (the chance he is given x) times 1-p(x) (the chance he says x is smaller), giving
1/2 p(y)+1/2(1-p(x))
=1/2+1/2(p(y)-p(x))

Not surprising given that p is the integral of f from before.

Anyway, the only thing that we need to demand of p(x) is that it be strictly increasing and bounded between 0 and 1. It does not matter whether it goes to 0 at -∞ or 1 at +∞ (though I guess it might as well). So the strategy of saying the number you see, x, is the larger with probability p(x) is more general than the strategy of picking K from f(x), though not any better. Again, given p(x), Alice has the power to make p(y)-p(x) as small as she wants, but she can never stop it from being positive.
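As a concrete example (the choice of p here is mine, nothing forces it), take the logistic function for p(x); the formula above then gives Bob's exact chance for any pair Alice picks:

import math

def p(x):
    # A strictly increasing function with values strictly between 0 and 1.
    return 1.0 / (1.0 + math.exp(-x))   # the logistic curve

def bob_chance(x, y):
    # Bob's winning chance 1/2 + 1/2*(p(y) - p(x)) for Alice's numbers x < y.
    assert x < y
    return 0.5 + 0.5 * (p(y) - p(x))

print(bob_chance(-1.0, 1.0))   # about 0.73
print(bob_chance(4.0, 5.0))    # about 0.506: Alice can squeeze the edge, but never to zero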

Two Numbers

So, about a year ago, I posted a puzzle based on the two envelope paradox. I had intended to finish that one off with another puzzle, but then I guess I forgot to. Now I have learned about a follow-up to that puzzle, but I can't exactly post it until I have posted the original. Anyway, here we go:
Alice writes down two distinct real numbers and puts them into two separate envelopes. Bob then selects one of the envelopes to open randomly (50-50 chance) and looks at the number. Bob must then guess whether the number he is looking at is the larger of the two or the smaller of the two. Find a strategy for Bob that is guaranteed to succeed more than 50% of the time, no matter what numbers Alice chooses.

It's very similar to the final problem I posted a year ago, so I'll give the solution out in the next few days so we can move on to the real problem. It's trivial for Bob to do 50% simply by flipping a coin to decide his answer, but doing better just seems crazy. If you need to, feel free to assume Alice is choosing her numbers from an unknown but well defined distribution over the reals.

Two Answers

The puzzle of the three princesses has been up far too long for how simple it is; I suppose it's time for the solution.

It's actually quite easy. First number the princesses 1, 2, and 3, then ask princess 1:
Is princess 2 older than princess 3?

Then, based on the answer you get
yes: marry princess 3
no: marry princess 2

Easy enough: if princess 1 was the truth teller, then you end up marrying the younger of the other two. If princess 1 was the liar, you end up marrying the older of the other two. If princess 1 was the middle sister, then it does not matter whom you marry, as long as it is not her.
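If you want to convince yourself that no case was missed, here is a short brute-force check; I am assuming the setup from the original puzzle, namely that the eldest always tells the truth, the youngest always lies, the middle sister answers however she likes, and the middle sister is the only one you must avoid marrying.

from itertools import permutations

# ages[i] is the age rank of princess i+1: 3 = eldest (always truthful),
# 2 = middle (answers arbitrarily), 1 = youngest (always lies).
for ages in permutations([1, 2, 3]):
    honest = ages[1] > ages[2]            # the honest answer to "is 2 older than 3?"
    if ages[0] == 3:                      # princess 1 is the eldest: she tells the truth
        possible_answers = [honest]
    elif ages[0] == 1:                    # princess 1 is the youngest: she lies
        possible_answers = [not honest]
    else:                                 # princess 1 is the middle sister: anything goes
        possible_answers = [True, False]
    for answer in possible_answers:
        marry = 3 if answer else 2        # yes -> marry princess 3, no -> marry princess 2
        assert ages[marry - 1] != 2, "married the middle sister!"

print("the strategy never marries the middle sister")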

I suppose that it was a bit of a lie when I said this is not a meta-question, since asking about the "truthiness" of the other sisters is something of a meta-question, and asking about their ages is equivalent to asking about how much they tell the truth in this problem. I don't know though; the answer seems elegant enough, and it's hardly going to count as a trick or "communicating badly" to have the answer involve their ages when the ages are stated so clearly in the problem.

Anyway, yeah, that's the answer as I know it; if you have any other possible answers, feel free to share them in the comments.