Here is an idealized draw poker game whose solution is extremely interesting mathematically. (I made this up myself but would not be surprised if someone has anticipated me.)

2 players each draw a real number uniformly from the unit interval [0,1]. They can’t see each other’s numbers. Player 1 decides to either keep his number or throw it away and draw another. Player 2, after hearing which decision player 1 made, decides to keep his number or throw it away and draw another. Players then reveal their numbers; the higher number wins.

What is the best strategy for each player, and how often does each player win? To put this another way, if the players are wagering $100 on the outcome of the game, what is the value of the extra information player 2 gets by hearing player 1’s decision?

Hint: if your answer is simple, then it’s wrong.

**(Added 11/6: solution now given in the Comments.)**


This can be solved using calculus, algebra, or game theory; a little strategic thinking about what sensible play ought to look like will go a long way in streamlining the work. An approximate numerical solution (with error estimates) is acceptable but an exact solution is possible.

Limited to just two draws, or an infinite number? Could Player 1 draw 100 times while Player 2 keeps his first draw?

Only one draw for each player. It’s an extremely simple game; the only asymmetry is that one of the players must make his draw decision first.

2 players each draw a real number uniformly from the unit interval [0,1]. They can’t see each other’s numbers. Player 1 decides to either keep his number or throw it away and draw another. Player 2, after hearing which decision player 1 made, decides to keep his number or throw it away and draw another. Players then reveal their numbers; the higher number wins.

What is the best strategy for each player, and how often does each player win?

If player 1 has redrawn, then player 2’s probability of winning by keeping his number is equal to that number; he should clearly redraw when he has less than 0.5, and keep his number if it is greater than 0.5 (it doesn’t matter what he does with exactly 0.5, since this single point contributes 0 probability).
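This claim is easy to check empirically. Here is a quick Monte Carlo sketch (the function name is mine, not from the original discussion): if player 1 redraws, player 2's chance of winning by keeping a number v should be v itself.

```python
import random

def p2_win_prob_after_p1_redraws(v, trials=200_000):
    """Estimate player 2's win probability when he keeps the number v
    and player 1 has thrown his number away and drawn a fresh one."""
    # Player 1's redraw is uniform on [0,1], so player 2 wins
    # exactly when the fresh draw lands below v.
    wins = sum(random.random() < v for _ in range(trials))
    return wins / trials
```

For example, `p2_win_prob_after_p1_redraws(0.7)` should come out near 0.7.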

Thinking strategically, it makes sense for player 1 to draw if his number is below some critical value x, and otherwise to keep it; similarly, for player 2, assuming player 1 has kept his number, to draw in turn if he has below some critical value y, and otherwise to keep his number.

Note also that y must be at least as large as x: if y < x, then player 2 would be taking a sure loss by keeping a number between y and x after player 1 keeps his (which is at least x), whereas redrawing would at least give him a chance.

Now we can calculate the probability of player 1 winning in terms of x and y, by breaking the initial deal down into cases (ignoring probability-0 boundary cases):

i) Player 1 has less than x, player 2 has less than 0.5: 1 draws, 2 draws, probability of 1 winning is 1/2, probability of this case occurring is x/2, total contribution to probability of player 1 winning is (1/2 * x/2) = x/4.

ii) Player 1 has less than x, player 2 has greater than 0.5: 1 draws, 2 doesn't, probability of 1 winning varies uniformly from 1/2 to 0 depending on 2's number so averages 1/4, probability of this case occurring is x/2, total contribution to probability of player 1 winning is (1/4 * x/2) = x/8.

iii) Player 1 has at least x, player 2 has less than y: 1 doesn't draw, 2 does, probability of 1 winning varies uniformly from x to 1 depending on 1's number so averages (1+x)/2, probability of this case occurring is (1-x)y, total contribution to the probability of player 1 winning is (1+x)(1-x)y/2 = (y – yx^2)/2.

iv) Player 1 has at least x and less than y, player 2 has at least y (recall y is at least x): 1 doesn't draw, 2 doesn't draw, player 2 wins, total contribution to the probability of player 1 winning is 0.

v) Player 1 has at least y, player 2 has at least y: 1 doesn't draw, 2 doesn't draw, by symmetry probability of player 1 winning is 1/2, probability of this case occurring is (1-y)(1-y), total contribution to the probability of player 1 winning is (1-y)(1-y)/2 = (1 – 2y + y^2)/2.

So now we have a function of the two variables x and y representing the probability of player 1 winning:

p = x/4 + x/8 + y/2 – yx^2/2 + 1/2 – y + y^2/2 = 1/2 + 3x/8 – y/2 – yx^2/2 + y^2/2
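As a sanity check on the case analysis, the closed form can be compared against a direct simulation of the game under the threshold strategies. This is a verification sketch of mine, not part of the original derivation; both function names are made up.

```python
import random

def p1_win_formula(x, y):
    # Closed form for player 1's win probability, from the case analysis above.
    return 0.5 + 3*x/8 - y/2 - y*x**2/2 + y**2/2

def p1_win_simulated(x, y, trials=200_000):
    # Play the game directly: player 1 redraws below x; player 2 redraws
    # below 0.5 after a redraw by player 1, and below y otherwise.
    wins = 0
    for _ in range(trials):
        a = random.random()
        if a < x:
            a = random.random()        # player 1 redraws
            b = random.random()
            if b < 0.5:
                b = random.random()    # player 2 redraws below 1/2
        else:
            b = random.random()
            if b < y:
                b = random.random()    # player 2 redraws below y
        wins += a > b
    return wins / trials
```

With, say, x = 0.5 and y = 0.6, the two agree to within Monte Carlo noise.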

Now we can use calculus. For a given value of x, y will have been chosen to minimize p, so we take the derivative of p with respect to y treating x as constant, and set it to 0. This gives the equation 0 = -1/2 – x^2/2 + y and we can solve for y in terms of x: y = (1 + x^2)/2.

Now we can substitute and simplify:

p = 1/2 + 3x/8 – (1 + x^2)/4 – (x^2 + x^4)/4 + (1 + 2x^2 + x^4)/8 = (-1/8)x^4 + (-1/4)x^2 + (3/8)x + (3/8)

Player 1 chooses x to maximize p so we again take a derivative and set it to 0 to solve for x:

0 = (-1/2)x^3 + (-1/2)x + 3/8 which simplifies nicely to the cubic equation

x^3 + x = 3/4.

Here we can solve numerically to get x = 0.567364226681 and p = 0.494333418. This is the probability of player 1 winning, so the probability of player 2 winning is 1-p = 0.505666582 and player 2's advantage is 0.011333164 of a unit wager. (Also, we get y = 0.66095108286)
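These numbers are easy to reproduce with a few lines of numerical code. The bisection routine below is my own sketch: x^3 + x – 3/4 is strictly increasing on [0, 1], so bisection converges to the unique root.

```python
def solve_threshold():
    # Bisection for the root of f(x) = x^3 + x - 3/4 on [0, 1];
    # f is strictly increasing there, so the root is unique.
    lo, hi = 0.0, 1.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if mid**3 + mid < 0.75:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

x = solve_threshold()
y = (1 + x**2) / 2                        # player 2's threshold
p = -x**4/8 - x**2/4 + 3*x/8 + 3/8        # player 1's win probability
```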

In other words, if the players bet $100 on the game, the value of the extra information player 2 obtains by going 2nd is $1.13 — surprisingly small.

Another interesting point is that, although the final equation for x is very simple, the exact solution for the equation

x^3 + x = 3/4

is still impressively complicated:

x = cbrt(3/8 + sqrt(307/1728)) + cbrt(3/8 – sqrt(307/1728))

where sqrt denotes the square root and cbrt the cube root. (Note that 3/8 – sqrt(307/1728) is negative, so the second term is the real cube root of a negative number.)
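The radical expression can be checked numerically. Since 3/8 – sqrt(307/1728) is negative, the sketch below uses a real-cube-root helper of my own rather than Python's `**` operator, which returns a complex value for a negative base and fractional exponent.

```python
import math

def real_cbrt(t):
    # Real cube root, valid for negative arguments as well.
    return math.copysign(abs(t) ** (1/3), t)

s = math.sqrt(307/1728)
x = real_cbrt(3/8 + s) + real_cbrt(3/8 - s)
# x satisfies x^3 + x = 3/4 up to floating-point error.
```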

We'll talk more about cubic equations when we solve the next math puzzle.