A Big Lottery
“Peter tosses a coin and continues to do so until it should land ‘heads’ when it comes to the ground. He agrees to give Paul one ducat if he gets ‘heads’ on the very first throw, two ducats if he gets it on the second, four if on the third, eight if on the fourth, and so on, so that with each additional throw the number of ducats he must pay is doubled. Suppose we seek to determine the value of Paul’s expectation.”
This is how Daniel Bernoulli rephrased a problem that had been nagging a tight circle of mathematicians almost 300 years ago. The issue is that, if you go by the mathematical expected value of the winnings, you end up with an infinite value for the lottery: you have a 1/2 probability of winning one ducat on the first throw, a 1/4 probability of winning two ducats on the second (which adds another 1/2 to the expected value), a 1/8 probability of winning four ducats on the third (adding another 1/2), and so on, adding half a ducat to the expected value an infinite number of times.
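In symbols (a standard modern way of writing the divergence, not Bernoulli’s own notation): heads on the $k$-th throw occurs with probability $1/2^k$ and pays $2^{k-1}$ ducats, so

$$
E \;=\; \sum_{k=1}^{\infty} \frac{1}{2^{k}} \cdot 2^{\,k-1} \;=\; \sum_{k=1}^{\infty} \frac{1}{2} \;=\; \infty.
$$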
The expected value suggests that a player should be willing to pay any amount asked for a chance to play this lottery. Yet in real life no one in their right mind would pay more than a few ducats to play it, perhaps somewhere around 5 ducats. Why doesn’t the mathematical expected value reflect the value real players place on this lottery?
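One way to feel the gap between the theoretical mean and actual play is to simulate the lottery. Below is a minimal Python sketch (the function names and sample sizes are illustrative choices, not anything from Bernoulli’s text): it plays the game many times and reports the mean and median payouts.

```python
import random

def play_once() -> int:
    """Play one round: flip until heads lands; the payout starts at
    one ducat and doubles with every tail before the first heads."""
    payout = 1
    while random.random() < 0.5:  # tails, with probability 1/2
        payout *= 2
    return payout

def simulate(n_games: int, seed: int = 0) -> None:
    """Play n_games rounds and report summary statistics."""
    random.seed(seed)
    results = sorted(play_once() for _ in range(n_games))
    mean = sum(results) / n_games
    median = results[n_games // 2]
    print(f"{n_games:>9} games: mean = {mean:7.2f} ducats, "
          f"median = {median} ducats, max = {results[-1]}")

if __name__ == "__main__":
    for n in (100, 10_000, 1_000_000):
        simulate(n)
```

In a typical run the median payout is one or two ducats, and the sample mean grows only slowly with the number of games (roughly like (log₂ n)/2), which makes the reluctance to pay a large entry fee look entirely sensible.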