Talk:St. Petersburg paradox
From Wikipedia, the free encyclopedia
Flawed discussion?
The discussion is flawed. Here is an example, using non-infinite reasoning. A mail with similar content was sent to Mr. Martin, the author of the Stanford page.
Consider: You get a bet to either
- Win 2 billion with probability 1/1000 or
- Nothing with prob. 999/1000.
The bet costs 1 million.
Would you take the risk? The answer is no, since in 99.9% of cases you end up (more than) broke. If you didn't inherit a lot, this seems unattractive.
OTOH, if you are a big insurer with the pockets to buy 10,000 of these contracts? Then the answer is yes, since your probability of getting less than your 10 billion back is 0.029196 (binomial distribution with fewer than 5 hits out of 10,000 tries with prob. 0.001). If 20 billion are invested, that probability falls to 0.004979 (bin(9, 20,000, 0.001)).
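The binomial figures above can be checked with a short script. This is just a sketch using direct summation of the binomial probability mass function from the Python standard library:

```python
from math import comb

def binom_cdf(k_max, n, p):
    """P(X <= k_max) for X ~ Binomial(n, p), by direct summation."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_max + 1))

# Probability that fewer than 5 of 10,000 contracts pay out (p = 0.001 each):
print(round(binom_cdf(4, 10000, 0.001), 6))   # ~0.0292
# With 20,000 contracts, probability of fewer than 10 payouts:
print(round(binom_cdf(9, 20000, 0.001), 6))   # ~0.0050
```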
In short, risk aversion flattens when averaged over a lot of people. The only difference in the original St. Petersburg paradox is that it flattens a lot more slowly than in this example. Here is what happens to the distribution if it is restricted to 2^1-2^60:
 n    payout ($)      probability
 1    2               0.5
 2    4               0.25
 3    8               0.125
 4    16              0.0625
 5    32              0.03125
 6    64              0.015625
 7    128             0.0078125
 8    256             0.00390625
 9    512             0.001953125
10    1024            0.000976563
11    2048            0.000488281
...
56    7.20576E+16     1.38778E-17
57    1.44115E+17     6.93889E-18
58    2.8823E+17      3.46945E-18
59    5.76461E+17     1.73472E-18
60    1.15292E+18     8.67362E-19
Here, the standard deviation is 1,518,500,250 (about 1.5 billion) with a mean of 60. In order to average that standard deviation out, a lot of contracts need to be sold.
(Assuming a 2-stdev distance from $25, a stdev of $12, and the minimum convergence speed in the Central Limit Theorem of stdev/n^0.5, a total of 3.32E+20 bets would be needed.)
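The mean and standard deviation quoted above for the 2^1-2^60 truncation can be verified directly from the table's distribution (a sketch; note the tiny residual probability mass beyond n = 60 is ignored):

```python
# Truncated St. Petersburg distribution: payoff 2^n with probability 2^-n, n = 1..60.
probs = [2.0 ** -n for n in range(1, 61)]
payoffs = [2.0 ** n for n in range(1, 61)]

mean = sum(p * x for p, x in zip(probs, payoffs))               # each term is exactly 1
second_moment = sum(p * x * x for p, x in zip(probs, payoffs))  # sum of 2^n, n = 1..60
stdev = (second_moment - mean ** 2) ** 0.5

print(mean)    # 60.0
print(stdev)   # ~1.5185e9, matching the 1,518,500,250 quoted above
```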
- Interesting. I don't know enough maths to comment further :). If the article misstates the paradox or any of the common proposed resolutions, it ought to be reformulated. However, Wikipedia is not an academic journal, so if you're making a previously unmade contribution to the discussion (in the real world, not Wikipedia) of the problem, then this is not the place to do it. Publish it somewhere reputable and then update this article accordingly, with a reference to the outside source. Joestynes 11:10, 20 Apr 2005 (UTC)
-
- "In short, risk aversion flattens when averaged over a lot of people." -- I don't think that this is actually new, nor does it contradict the article as written. Playing a game 10,000 times at once (as either 10,000 different subscribers or one player playing repeatedly) yields a substantially different and less risky "supergame". A lot of contracts being sold is simply repeated play of the game... Risk aversion doesn't flatten, what flattens is risk itself. Amortization/insurance/repeated-play can be used to reduce risk (reduce variation). --Brokenfixer 00:29, 16 January 2006 (UTC)
-
- In addition: notice that selling 10,000 shares to different subscribers not only reduces risk, it also improves the expected utility of the game as a separate effect. Even if you play the game only once, by splitting the cost of the game and its proceeds over 10,000 friends, you protect against the diminishing marginal utility of the money. 10,000 chances to become a millionaire is more attractive than 1 chance to become a 10-billionaire. --Brokenfixer 00:29, 16 January 2006 (UTC)
-
- However, since the expected mean value of the St. Petersburg game is infinite, it doesn't matter to the expected mean value whether you set the starting heads payoff at $0.01 or at $10,000. This is an apparent paradox. It can be cleared up by realizing that (1) the game cannot be realized in real-world terms (because no real-world agent can commit to a potential $10^googol payout), or that (2) the utility of money is non-linear (and the expected utility of the game is finite), or that (3) rational economic decision theory must take into account inherent costs associated with volatility (risk). A factory that produces 10,000 widgets per day is more profitable than a factory that produces 310,000 widgets on some single random day each month, since the variable factory must pay much higher contracts for storage, delivery, sales, and so on. (In my opinion, all 3 of these resolutions have empirical support.) --Brokenfixer 00:29, 16 January 2006 (UTC)
Why is it a "paradox"?
One aspect of the paradox that is simply common sense and that may deserve some mention, is a fundamental reason this situation seems like a paradox.
The average value of the game converges to infinity as the number of gameplays goes to infinity.
In simple, common-sense thinking infinity equals "really big."
But the value converges to infinity at a very, very slow pace. In practical terms, that means that the typical value of a game is going to be very, very low for many, many game plays.
Therefore much of the paradox revolves around the contrast between the calculated average value of a gameplay (infinite) and the typical average value of trying the game a few times (a few pennies).
In short, it's a paradox of infinity--what we expect infinity to be (big!) and what it actually is (a continual rising without any upper bound, which can be fast or slow but in this case is very, very slow).
If, for instance, we were to state the paradox in terms of millions of dollars rather than pennies it wouldn't seem so paradoxical (the paradox isn't always stated in terms of "pennies" but it always seems to be stated in terms of some very small unit of currency--seemingly to emphasize the difference between the amount actually won after a few plays, small, and the average value of a play, infinite).
And if the value of the game grew at a much faster pace then the result would not seem so paradoxical. For instance, if a game were designed where the player typically won on the order of $10 after ten plays, $1000 after 20 plays, $100,000 after 30 plays, and so on, growing exponentially, then there would be no sense of paradox. Getting really big really fast is common sense for "approaching infinity". But the St. Petersburg game gets really big only at a very, very slow pace.
Putting all this another way, common sense says that if the average value of a game is infinite, that must mean that a person who plays it a few times is going to win a whole lot of money, because infinity is "really big".
What the mathematics behind the statement "average value is infinite" really means in plain English, though, is simply this: with more and more game plays, the average value of each game play will rise without any upper bound. So it's perfectly reasonable for it to stay small for a long, long time (over many thousands of gameplays) and in fact that is exactly what it does.
A simple computer simulation of the St. Petersburg Paradox shows that over thousands, tens of thousands, and then millions of simulated plays, the average value of each play grows and grows--it grows slowly but inexorably. It grows suddenly by a large amount, then declines slowly, before growing again by a slightly larger amount. What happens, again in layman's terms, is that over many thousands and then millions of plays the "rare" events that have a huge payoff begin to happen with some regularity, and when they do the payoff is so huge that they greatly increase the overall average value.
In sum, much of the paradox lies between what we naively think "infinite average value of a gameplay" ought to mean and what it really does mean.
It might be helpful to show a graph of average value over several thousand or even million gameplays, which would demonstrate both that the average value does rise inexorably and also that it rises slowly. This seems to be a key to the paradox.
Bhugh 23:34, 29 September 2005 (UTC)
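A minimal version of the simulation described above might look like the following sketch (assuming the pot starts at $1 and doubles on each successive toss; the seed is arbitrary and chosen only for reproducibility):

```python
import random

def play_once(rng):
    """One St. Petersburg game: payout starts at $1 and doubles each time
    the coin keeps the game going (probability 1/2 per toss)."""
    payout = 1
    while rng.random() < 0.5:
        payout *= 2
    return payout

rng = random.Random(42)
n = 1_000_000
total = 0.0
for i in range(1, n + 1):
    total += play_once(rng)
    # The running average total/i climbs in sudden jumps (rare huge payoffs)
    # followed by slow declines, just as described above.

print(total / n)  # stays modest even after a million plays -- nowhere near "infinite"
```

With payoffs of this shape the running average after a million plays is typically on the order of $10, which illustrates just how slowly the game "approaches infinity".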
- "The average value of the game converges to infinity as the number of gameplays goes to infinity." This is wrong. Mathematically, the AVERAGE expected value of ONE PLAY of the Game of St. Petersburg is INFINITE. The average, as in the arithmetic mean, as in the expected value, as in your expected per-game return, never deviates. It doesn't go up, it doesn't go down. As you accurately describe, what changes is the typical outcome of the game as compared to the typical outcome from 500 or 20,000 repeated plays of the game averaged together. The median payout of "one game of St. Petersburg" is very small, while the median payout of "20,000 games of St. Petersburg averaged together" is much larger. The average (mean) payouts of those two are both exactly the same (infinite). The St. Petersburg graphic is a bit deceptive in this regard. Since it just shows one sample run, the early points on the graph can't claim to accurately represent any sort of true theoretical average. If you print out a million of those graphics, it is likely that some of them will start out with a huge spike (in the first 50 or so games), and spend the next 20,000 games (the entire rest of the graph) marching downward. The graphic is a typical or possible outcome of 1,2,3,...,20000 games -- it does not display the true theoretical mean. Rather, the graphic shows (and the article ought to describe) that the typical (e.g. likely or median) payoff of St. Petersburg is very small ($1.50), whereas the expected (e.g. average or mean) payoff is infinite. The median payout of a single game of St. Petersburg is $1.50 (half of your games pay $1, half pay $2 or more). The median payout of a 2-game-averaged-cluster of St. Petersburg is $1.75 (one-half pay $1.50 or less, one-half pay $2.00 or more average per subgame). The 3-game median payout is $2.00. The TYPICAL (median) value of the supergame (obtained by averaging a cluster of multiple games of St. 
Petersburg) climbs slowly and inexorably to infinity as the number of subgames in your supergame increases. --Brokenfixer 00:29, 16 January 2006 (UTC) (clarified) --Brokenfixer 00:51, 16 January 2006 (UTC)
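The single-game and two-game median claims above can be checked by enumerating the distribution, assuming the $1, $2, $4, ... payoff scheme used in that comment (a sketch; truncating at 40 tosses leaves negligible probability mass unaccounted for):

```python
# Payoff 2^(k-1) with probability 2^-k; truncating at k = 40 leaves
# only ~1e-12 of probability mass unaccounted for.
dist = {2 ** (k - 1): 2.0 ** -k for k in range(1, 41)}

# Single game: P(payout <= $1) is exactly 1/2 and P(payout >= $2) is ~1/2,
# so by the midpoint convention the median is $1.50.
p_low = sum(p for x, p in dist.items() if x <= 1)

# Two-game average: P(avg <= 1.5) = P(1,1) + P(1,2) + P(2,1)
#                                 = 1/4 + 1/8 + 1/8 = 1/2,
# so the median of the two-game average is $1.75.
p_two = sum(pa * pb for xa, pa in dist.items() for xb, pb in dist.items()
            if (xa + xb) / 2 <= 1.5)

print(p_low, p_two)  # both ~0.5
```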
-
- Original statement: "The average value of the game converges to infinity as the number of gameplays goes to infinity."
- Reply: "This is wrong. Mathematically, the AVERAGE expected value of ONE PLAY of the Game of St. Petersburg is INFINITE."
- Well, you say "This is wrong" but in fact those two statements are not contradictory. The fact that the "average expected value" has some certain value does not mean that you are going to hit that value every time you play the game.
-
- This is true for the Petersburg game but also other games as well. If the average expected value for the game is $1 it doesn't mean that you are going to win $1 every time. Rather you are going to win more than $1 sometimes and less than $1 sometimes. After, say, 3 plays it is very unlikely that your actual average value will be $1. It will almost certainly be more or less than $1. After 30 plays your average will (probably) be closer to $1, after 300 plays it will (probably) be closer yet, closer yet with 3,000, closer yet with 3 million, and so on. In short, "The average value of the game converges to $1 as the number of gameplays increases."
-
- This is just an informal way of explaining what we know of as the law of large numbers.
-
- Translating this into terms of the Petersburg Paradox: The fact that the average expected value of the game is "infinite" does not mean that we will win an infinitely large prize each time we play. Rather it means that the more we play, the higher our average winnings will become. There is no upper limit to the average winnings--they will continue to increase indefinitely and without any upper bound as the number of plays increases.
-
- This is, in fact, exactly what happens when the game is played.
-
- And in fact it is no paradox at all if you understand what is meant by the term "average expected value" in light of the law of large numbers.
-
- What does make the situation seem paradoxical is that informally we think of "average" as something you would sometimes hit below, sometimes above, sometimes right on. The idea that we can approach the average always from below and always "way" below (because no number, no matter how large, is close to infinity!) is not in accord with our everyday experience.
-
- The other aspect that seems paradoxical is the fact that, as pointed out in the main article, the value of the game plays grows so slowly. Naively we expect a game with "infinite" average value to pay out a lot of money fast and for average earnings to grow fast. But there are a lot of other ways to get to infinity. One of them is very slowly and unsteadily, and that is the way the St. Petersburg lottery gets there.
-
- If players typically won a million dollars after 10 plays, a billion dollars after 20 plays, and so on, we wouldn't see any paradox. But if typical winnings are $5 after 1,000 plays, $6 after 1 million, $7 after a trillion, and so on--well, the average expected value is still infinite but we're getting there a whole lot slower.
-
- And that makes it seem paradoxical.
- A couple paragraphs earlier, I argue that the key to the paradox is that (1) the game cannot be realized in real-world terms (because no real-world agent can commit to a potential $10^googol payout), (2) the expected utility of money is very non-linear, and (3) the naive treatment does not in any way account for the outlandish (indeed infinite) risk of this game -- yet according to modern portfolio theory, risk has demonstrable cost (a real-world high-risk high-expected-value investment can have equal value to a low-risk low-expected-value investment). St. Petersburg's naive mathematical treatment of $100,000,000,000,000 as simply 100,000 copies of $1,000,000,000 is inappropriate. --Brokenfixer 00:29, 16 January 2006 (UTC)
-
- The statement that the expected value of the St. Petersburg game is mathematically infinite is highly misleading. The accurate statement is that the expected value of the game grows logarithmically as the bankroll of the opponent. To understand the difference, consider a different problem. Let's say a certain casino introduces a new novelty game called "Leningrad." You analyze the game and find a highly non-obvious strategy that gives the player a 2% edge over the house. (I'm told casinos occasionally do make such mistakes.) Let's say further that the local gaming commission requires casinos that introduce new games to keep them operating and not limit bets or bar players. You propose to sell your secret. What's it worth? A naive analysis would say its worth is infinite. A more refined analysis says it is worth whatever the casino is worth, less one's expenses in playing the game (travel, room and board, bodyguards). Not much different from the St. Petersburg game, right? Wrong! The value of the Leningrad game grows pretty much linearly as the value of the bankroll (the casino company) grows. If the value of the casino doubles, the value of the Leningrad game secret doubles. If the bankroll of the St. Petersburg opponent doubles, however, the value of that game only goes up by 50 cents, or one half the per-toss bet. That is what logarithmic growth means. It's the antithesis of infinite growth.--agr 21:48, 16 January 2006 (UTC)
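The 50-cents-per-doubling claim above can be illustrated numerically. This is a sketch under one particular capping convention (every payout, including the last, is capped at the opponent's bankroll; other conventions shift the value slightly but not the logarithmic growth):

```python
def st_petersburg_value(bankroll):
    """Expected value of the game when every payout is capped at the
    opponent's bankroll. Payoff is 2^(k-1) on the k-th toss, prob. 2^-k."""
    ev, k = 0.0, 1
    while 2 ** (k - 1) < bankroll:
        ev += 2.0 ** -k * 2 ** (k - 1)   # each uncapped round adds exactly $0.50
        k += 1
    ev += 2.0 ** -(k - 1) * bankroll     # all longer games pay the full bankroll
    return ev

print(st_petersburg_value(2 ** 30))   # bankroll ~$1 billion: value is only $16
print(st_petersburg_value(2 ** 31)
      - st_petersburg_value(2 ** 30))  # doubling the bankroll adds $0.50
```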
-
-
- The expected value of the Saint Petersburg game is infinite. That's the whole point. The Saint Petersburg game is used in the context of game theory, analyzed as an abstract or theoretical game. The infinite expected value illustrates that in real-world applications, more sophisticated real-world hypotheses and methods are necessary -- such as taking into consideration the limited bankroll of your opponent. Saint Petersburg game, as written and as used, implicitly assumes an infinite bankroll, which demonstrates the need for bankroll analysis. --Brokenfixer 20:23, 17 January 2006 (UTC)
-
-
- If the St. Petersburg game were merely being used as a cautionary tale to beware of assumptions like an infinite bankroll, I would have no problem. The martingale roulette system is often used in this way as a teaching tool. But there is a vast literature that seems to take the notion of an infinite expected value as the correct interpretation of the game and then seek other reasons why no one seems willing to pay large sums to enter it. Scholars who I'm sure would dismiss the martingale system as rubbish seem to insist that the St. Petersburg paradox presents deep difficulties. --agr 02:55, 27 January 2006 (UTC)
-
-
- The "Leningrad secret" needs to be converted into a game before comparison can be made. If you convert Saint Petersburg game into a "secret" (the "Saint Petersburg secret" which is the ability to play the St. Petersburg game against a casino), then the Saint Petersburg secret is worth exactly the same amount as the Leningrad secret -- namely the entire bankroll of the casino that offers the game. It doesn't matter whether the Casino offers Leningrad or St. Petersburg -- they go bankrupt either way. Repeated play of any positive-expectation no-loss-risk game will return the entire opponent's bankroll. A single event of the Leningrad game appears to be "you get $1.02 for each $1.00 of your bet, with no house limit on your bet." As written, though, Leningrad is limited by your own bankroll - so if you only had $1 it would take months (working 8-hours/day 5-days/week) to bankrupt a casino. How about making a variant of the Leningrad game, the Serf Leningrad game, where you are allowed to bet as much as you want even if you aren't good for it. The Serf Leningrad game can be restated as: "Flip a coin. Ignore the result and win your opponent's entire bankroll." Meanwhile, consider the Petrograd game: "Flip a coin. Ignore the result and win the logarithm of your opponent's bankroll." Your remarks show that limited-bankroll Saint Petersburg is a randomized variant of the Petrograd game as opposed to the Leningrad game. --Brokenfixer 20:23, 17 January 2006 (UTC)
-
-
- Flip a coin and win the casino's bankroll is vastly different from flip a coin and win the log of the casino's bankroll. Both are infinite for an infinite bankroll, but the former grows large as the bankroll grows large, while the latter never does for any conceivable bankroll in the universe as we know it. --agr 02:55, 27 January 2006 (UTC)
Change to dollars?
I added a section with a more careful math analysis. I'd also like to change the description of the game to dollar bets instead of cent bets. I believe it is hard for readers to think about what they might actually pay to enter the game when it is presented in cents. People routinely pay 50 cents to play an arcade game just for the entertainment value. Any objections? --agr 11:47, 19 December 2005 (UTC)
More about utility
Since the St Petersburg paradox was central to the development of the idea of utility functions, we should elaborate on that a little more, rather than on the not-so-interesting practical restrictions (okay, nobody has infinite money, but if the theorists had stopped there, economic decision theory would have stopped as well, wouldn't it?) If I have time, I'll write a little more about this. Marc 129.132.146.72 13:32, 23 January 2006 (UTC)
I just skimmed through the Stanford page we are linking, and I must say that some of the statements there are at best controversial.
"If someone prefers $1 worth of birds in hand to any value of birds in the bush, then that person needs psychiatric help; this is not a rational decision strategy."
This is typical of their style of argument for dismissing any potential resolution of the paradox. To me it looks like they are really keen on keeping it an unsolvable paradox rather than a paradox for a naive approach to decision theory that has led to important developments later.
In short, I think we can and should easily do better than them...
Marc 129.132.146.72 13:45, 23 January 2006 (UTC)
- The St Petersburg paradox is certainly of historical importance in the development of utility theory, but it is completely fallacious. There is nothing paradoxical about the unwillingness of people to pay large amounts of money to enter a game that their instincts tell them isn't worth much when those instincts are correct. If the "theorists" (including Bernoulli, who should have known better) had stopped there, better examples which actually illustrate the effect would no doubt have been developed. What is most interesting to me about the St Petersburg paradox is that so many economics and philosophy professors still teach their gullible students that the game's value is "theoretically infinite." It is a better illustration of a meme than of utility theory.--agr 14:19, 23 January 2006 (UTC)
Slow down, save a life! ;-) We agree, I think, that the St. Petersburg paradox is the starting point of utility theory. (If there are better examples for utility theory, then they should be under utility theory, naturally, so this does not matter here.) A paradox is (in this context) simply a clash between a well-established model and real life. Of course real life describes real life correctly, but so what? Here it is about the (at that time) well-established idea that expected returns describe the value of a game and a (hypothetical!) situation that clashes with it. ("Theoretically infinite" hence means "by applying the model that decisions are made and should be made according to the expected return only", which of course is wrong, which we know - e.g. - by the St. Petersburg paradox!) The St. Petersburg paradox is hence historically important. If we want to have an article which is helpful for understanding its point, it neither helps to mumble over it in a mystifying and authoritative manner as in the Stanford link, nor does it help to question whether the lottery is feasible for practical reasons. - By the way, our discussion here reflects in some parts the state of the art until the 1950s, before people started to understand utility theory correctly. By now it is so well-established in every area of economics, and the St. Petersburg paradox is so widely known, that our article has to reflect this! Marc 84.72.31.59 19:00, 23 January 2006 (UTC)
- I don't think you can rescue the claim that the expected value is infinite. it simply isn't. It's not a question of practicality. Under the most outlandishly favorable assumptions, the expected value never gets at all big. The most one can say accurately is that many scholars thought it was infinite and developed theories of utility based on the apparent paradox and that those theories are still considered important, based on other evidence. --agr 00:42, 25 January 2006 (UTC)
-
- "What is most interesting to me about the St Petersburg paradox is that so many economics and philosophy professors still teach their gullible students that the game's value is 'theoretically infinite.'" The reason that modern economics and philosophy professors teach that the expected value of the Saint Petersburg Game is infinite is because the expected value of the Saint Petersburg Game is infinite. That was its original design purpose. Your arguments are in support of a contention that the original Saint Petersburg Game only has philosophical and historical relevance rather than current real-world economic implications. For example, this might imply that the underlying theoretical assumptions of the Saint Petersburg Game are so outlandish that they cannot be realized in any real-world terms. --Brokenfixer 17:37, 25 January 2006 (UTC)
-
- The original Saint Petersburg Game itself predates modern utility theory, and uses the word money to denote payouts. Saint Petersburg justifies and motivates modern utility theory, which provides a framework to convert money into utility in a non-linear fashion. Lo-and-behold, assuming non-linear utility of money, Saint Petersburg with "infinite" monetary payoffs yields a finite expected _utility_. But while the economists are busy popping champagne corks, philosophers point out that additional definitions utterly failed to address the underlying paradox. In modern economic terms, the Saint Petersburg Game is easily rephrased to pay out in utility units (linear by construction) instead of dollars. Thus if $1 is worth 1 hedon to you, but $1,024 is only worth 200 hedons to you, then the Saint Petersburg banker would pay you 1,024 hedons (which might be $500,000 or whatever). In Bernoulli's formulation of the Saint Petersburg Game, utility=money; in modern analysis, a Saint Petersburg Game should be declared to pay out in utility units instead of money. A second issue, the discussion of limited/unlimited bankroll, is swept aside by making the banker a visiting space alien doling out hedons, or having God act as banker. Next, philosophers scramble to argue for or against a universal absolute upper bound (maximum limit) on utility. Anyway, these issues are currently addressed in modern textbooks - Implying that the modern scientific community misinforms a gullible populace should be published as Original Research. --Brokenfixer 17:37, 25 January 2006 (UTC)
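Brokenfixer's point that non-linear utility yields a finite expected utility is easy to check with Bernoulli's own choice of logarithmic utility (a sketch, assuming payoffs of $2, $4, $8, ... with probabilities 1/2, 1/4, 1/8, ... as in the table near the top of this page):

```python
# Expected utility of the St. Petersburg game under u(x) = log2(x),
# with payoff 2^k at probability 2^-k. The series sum k/2^k converges to 2,
# so expected utility is finite even though expected money is not.
from math import log2

expected_utility = sum(2.0 ** -k * log2(2 ** k) for k in range(1, 200))
expected_money_partial = sum(2.0 ** -k * 2 ** k for k in range(1, 200))

print(expected_utility)        # ~2.0 (convergent)
print(expected_money_partial)  # 199 partial-sum terms of $1 each: diverges
```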
- The Saint Petersburg game has to stop when the opponent's assets can no longer cover the bet. If that isn't obvious to economists and philosophers, it is certainly obvious to the ordinary folk whose unwillingness to pay large sums to enter the game is the nub of the supposed paradox. Because the jackpot grows exponentially, it quickly becomes bigger than the total assets of any conceivable opponent—in just a few dozen plays. The value of the game is half the initial bet times the number of plays. Since the number of possible plays is finite, the value of the game is finite. In fact it only grows logarithmically with the assets of the opponent.
- I'd love to think this was original research, but it is pretty old and standard math. The Saint Petersburg game is just a variation on the famous Martingale (roulette system) betting strategy: Bet a dollar on red, if you lose double your bet. Keep doing this until you win. The analysis is essentially the same: your expected winning for each play is half the initial bet and the game can't go on forever because you will exhaust your bankroll if there is a long enough run of black.
- I've edited the mathematical analysis section to make clear that the notion that the game has infinite expected value is still widely accepted in economics and philosophy.--agr 16:10, 26 January 2006 (UTC)
-
- Wow, interesting that one can still discuss so much about this old stuff. My colleagues and I are all very amazed... :) Obviously the expected value is infinite by a mathematical proof which does *not* involve any check of the feasibility of the game whatsoever. So, if somebody says "Hey, that's unrealistic, because - limited bankroll, limited time to play - etc.", then I could just answer: "Yes, so what. It's math. It's theory. It ain't reality!" :)
-
- Another thing is whether or not it is an unresolved paradox. A paradox is either logical ("This sentence is wrong.") or - as the term is often used as well - it is a fact derived from reasonable assumptions that obviously conflicts with reality. In the latter case (e.g. here), one might then ask whether the initial assumptions were maybe not so reasonable. That led, in the case of the St. Petersburg paradox, to the development of utility theory.
-
- Now, a more technical remark: "In modern economic terms, the Saint Petersburg Game is easily rephrased to pay out in utility units" This is only true for unbounded utility functions. There is a lot of evidence nowadays that this is not a particularly good assumption. (By the way, bounded utility functions do not have to be constant above a certain value, but can be monotone increasing everywhere. - That is a point that the Stanford encyclopedia got wrong...) There is also another way out: allowing only for lotteries with finite expected return, since that would be a reasonable restriction to real-life lotteries. Then one can prove that sublinear utility functions exclude any occurrence of St. Petersburg-type paradoxes. (That has been proven by Arrow.) So, basically, adapting one of the assumptions of the game to real life makes the lottery disappear without thinking about whether it is feasible. Hence the paradox is gone, since what is left is the pure mathematical fact that there is a lottery (i.e. probability measure) with infinite expectation value. - And that is not so paradoxical, I think...
-
- Marc 129.132.146.72 12:10, 27 January 2006 (UTC)
Subsection on expected number of tosses
I have reluctantly removed the well-intentioned subsection, written by Taisuke Maekawa of Japan, because knowing the expected number of tosses does not provide a way to evaluate the game. The expected number of tosses is indeed 2 as he says, but since the value of the pot doubles with each toss, the value of the game is not the expected number of tosses times the initial bet. Here is the deleted section:
(2) In fact, from the problem statement:
Because 2^(k−1) dollars are won when the number of tosses is k, one only has to calculate 2^(k−1) dollars after figuring out the expected value of k for one game.
Let that expected value be μ. The expected value μ of the number of coin tosses when playing one game is the sum of each possible number of tosses multiplied by the probability of that number occurring, from one toss to infinitely many.
From the expected number of tosses μ = 2, the expected amount of money is 2 dollars,
because 2^(μ−1) = 2.
Thus, the amount of money that can be expected is obtained.
--agr 14:19, 23 January 2006 (UTC)
- Could somebody with a little knowledge of Japanese correct this on the Japanese version as well? I saw the wrong formulas there, but that's as much as I could get ;-) and there was no discussion page...
Marc84.72.31.59 19:19, 23 January 2006 (UTC)
--T.Maekawa 26 Jan 2006
Your criticisms have no merit.
You misunderstand.
Do you understand?
- Please look at the paper you submitted as a reference http://www.date.hu/efita2003/centre/pdf/170.pdf. On page 868, in the middle of the page, it shows the same calculation that you do, but it says "The average number of runs of a play ... amounts to 2 (SCHEID 1992, p. 1141). This does not imply that the average prize is a2 = 2(2-1) = 2." This appears to contradict what you have written. Perhaps this is the misunderstanding.---agr 03:13, 27 January 2006 (UTC)
T.Maekawa 27 Jan 2006
Whatever you write, people who know mathematics will see that it is mathematically correct. These are empty arguments. I don't care anymore. Or else, please show me a proof that the analysis is mathematically incorrect.
Calm down, nothing personal. The proof that it is incorrect has already been given above. Marc 129.132.146.72 11:54, 27 January 2006 (UTC)
Think again. It's simple. T.Maekawa 27 Jan 2006
I thought again. Sorry! Marc129.132.146.72 15:53, 27 January 2006 (UTC)
The "expected frequency of toss" is called the "expected number of tosses" (technical term). It is an elementary fact of probability that the expected number of tosses of a game does not enable one to find the expected value, except under very special conditions ("linearity" of payoff) that do not apply here since the payoff is exponential, not linear. A linear payoff has the form ak+b where k is the number of tosses in a game and a, b are constants. Zaslav 00:52, 30 January 2006 (UTC)
Yes, but unfortunately Mr. Maekawa seems to be resistant to this (or any other) argument... :( Rieger 07:15, 30 January 2006 (UTC)
- I removed this section again. Perhaps Maekawa-san can find a colleague with a better command of English who can explain our concerns. --agr 04:03, 8 February 2006 (UTC)
Update of page
I did a major update. In particular I changed the structure: I found it a little unnatural to divide between "economic" and "mathematical" analysis, since both use math and just study different solution strategies ("utility" vs. "finite game" vs. "iterated game"). I hope nobody is angry with my changes, and I am very much looking forward to ideas on how to improve the page further! Marc 129.132.146.72 15:50, 27 January 2006 (UTC)
Didn't you see my other answer, though? I am not angry, but I am sorry that the other mathematical analysis was deleted. It is really simple and the most proper answer. You deleted it, but this Wikipedia is free to rewrite. It is a pity that there are people such as you. I think I or other people will repost the answer someday. I'm very tired now. T.Maekawa 28 Jan 2006
- Dear Taisuke, there can be several valid opinions about almost anything, but not about mathematical computations. There was a flaw in your computation, as has been pointed out to you before. Please do not post this again. Thank you.
Marc 83.77.229.46 20:37, 29 January 2006 (UTC)
There is no flaw; this can be proved with probability theory. I just want to show the truth. T.Maekawa 8 Feb 2006
I'm not so happy with the two latest changes: (1) In my opinion, the listing of a googolnaire is quite unnecessary and just adds a lengthy paragraph (since the word has to be explained; I didn't know it either). Hence I would strongly recommend deleting it again. (2) The discrepancy between the prediction of the naive expected-value decision model and real-life decisions in the finite St. Petersburg game is being watered down with every change. Yes, it's not infinite here and $3 there, but it's still a factor of order 10.
- The reason for the googolnaire example is to show that the value of the St. Petersburg game is bounded by the laws of physics, as well as the laws of macroeconomics, and that therefore the assumption of an infinite casino is unjustifiable. I think that is an important point to make.
- As for your second concern, if I understand it right, I think the problem is the sentence "In practice, no reasonable person would pay more than a few dollars to enter." Certainly in the context of the infinite casino version, that is a safe thing to say. Any sum offered pales in comparison to an infinite payoff. But in the finite game, it seems to me one must be much more careful. The first obvious question is: how much is "a few"? $3? $10? $20? The second question is: how do we know? Have there been proper surveys? What information were the participants given, if there were any?
- Suppose someone created an actual St. Petersburg Lottery with a million dollar max payout per ticket and auctioned tickets off in large numbers, say on an eBay style web site. Each ticket would have a serial number. After the auction closed, the tickets would be issued to the winners and then a drawing would be held that would determine the value of each ticket based on the rules of the St. Petersburg game. (There is an easy and safe way to do this using a cryptographic hash function: hash the drawing results with the serial number and use the bits in the hash as the coin toss results.) A proponent of the naive expected value decision model would argue that the selling price of these tickets would be close to their expected value of $10. I'd mortgage my house to buy as many tickets as I could get if they were selling at $3.
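The drawing scheme described above might be sketched as follows (a rough illustration only: SHA-256 from `hashlib` stands in for "a cryptographic hash function", and the serial-number format and the doubling payoff convention are my assumptions, not part of the proposal):

```python
import hashlib

def ticket_value(drawing_result: str, serial: str, cap: int = 1_000_000) -> int:
    """Hash the public drawing result together with the ticket's serial
    number and read the digest bits as coin tosses: the pot doubles on
    each 'heads' bit, the first 'tails' bit ends the game, and the
    payout is capped at cap dollars."""
    digest = hashlib.sha256((drawing_result + ":" + serial).encode()).digest()
    bits = "".join(f"{byte:08b}" for byte in digest)
    payoff = 1
    for bit in bits:
        if bit == "0":                    # tails: game over
            return min(payoff, cap)
        payoff = min(payoff * 2, cap)     # heads: pot doubles (capped)
    return cap                            # 256 heads in a row: probability 2^-256
```

Because the drawing result is published once for everyone before hashing, no individual ticket's sequence can be rigged, yet each serial number yields an independent-looking run of tosses.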
- What about tickets with a billion dollar cap? They should certainly sell for more than the million dollar tickets. Maybe utility theory would come into play and they would sell below their $15 par, but one would have to do very careful studies to prove there was a discrepancy, much less attribute it to declining utility vs. other explanations (e.g. a risk premium).
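The dollar figures quoted in this exchange can be checked directly, assuming the common convention that the payoff is $2^(n-1) when the first tail comes on toss n (a sketch of mine, not part of the discussion):

```python
def capped_expected_value(cap_dollars, max_n=200):
    """Expected value of a ticket paying min(2^(n-1), cap_dollars) when
    the first tail appears on toss n; terms beyond n = 200 are negligible."""
    return sum(min(2.0 ** (n - 1), cap_dollars) * 0.5 ** n
               for n in range(1, max_n + 1))

print(capped_expected_value(1_000_000))      # ~10.95, close to the "$10" quoted
print(capped_expected_value(1_000_000_000))  # ~15.93, close to the "$15 par"
```

Each cap contributes $0.50 per attainable doubling plus a small tail term, which is why a thousandfold increase in the cap raises the expected value by only about $5.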
- In short, I think it is far from obvious how one can use the finite casino St. Petersburg game to disprove the naive expected value decision model. --agr 22:28, 1 February 2006 (UTC)
-
- First, please do not mix up St. Petersburg with its iterated variant. ("Buying as many tickets as possible...")
- Second, there are surveys on how much people would be willing to pay, and it's much less than 10 dollars. (I don't remember the number, but it was more like 3 dollars.) The point is that there is no reason why the expected return should be THE rational criterion. And the St. Petersburg paradox illustrates this drastically.
- Third, this thing is a game in the mathematical sense. Practical limitations do not apply. (Disclaimer: I'm an applied mathematician, so I think I'm allowed to say this...! ;-) Hence the distinction between the "real" paradox and the "finite" one.
- Finally, the googolnaire is not necessary to prove the limitations of a practical implementation of the game, since that's pretty obvious from the second-to-last line (and all the theory before it). Don't you think so? It does not clarify; it just adds a fancy word that needs a lengthy extra explanation.
- Okay, and now I'll go redo the changes of our Japanese hobby-mathematician again... :)
- Let me deal with the googolnaire first. It's just an example, of course. The paragraph isn't there to explain the word but to point out that the laws of physics place an upper bound on the value of the game. I think that helps make clear that the assumption of an infinite casino differs from the usual infinite-quantity assumptions one makes in other areas, e.g. electrical engineers assuming insulators have infinite resistance, air-conditioning designers assuming the atmosphere has infinite heat capacity, or ichthyologists assuming the ocean is an infinite sink for solutes. These are approximations that produce reasonable results, in that doing the same calculations with realistic finite numbers does not materially affect the outcome. Not so with St. Petersburg. Perhaps the next-to-last line should suffice; however, the claim that the infinite value of the infinite casino version has practical implications is so widespread that I think the added example is needed.
- Yes, you can view the St. Petersburg game as a mathematical exercise, but it is almost always used to prove principles in economics, where practicality is of the essence. The putative paradox is that people do not assign an infinite value to the game. Well, if the value of the game isn't infinite on any practical basis, why is that a paradox?
- That brings us to the question of valuing the finite casino version. There are two issues here: do people undervalue the game, and if so, why. I found one paper online with references to some surveys [1]. I'm curious how they were conducted. I accept your point that the iterated game is conceptually different from a single play, though it is far from clear that the value of the former is not a good indicator of the value of the latter, particularly in the case of a $1 million cap version. The $1 million version is not that different in its payout characteristics from many state and national lotteries. --agr 19:33, 5 February 2006 (UTC)
-
- Thanks for the reference; I actually know the paper quite well... :) I didn't look into the quoted surveys, but it's a given that expected return does not describe people's behavior well, even for finite games. But let me ask again: Why should it be rational to play according to expected return? There is no reason (theoretical or descriptive) supporting this. The St. Petersburg paradox just brings it nicely to the point that the expected return is in fact not a good choice as a decision model, that's all.
-
- After all these discussions I would like to work on a new page now. So let's just leave it as it is. The googolnaire doesn't hurt me if you like it, and I'm quite happy with the rest now. It's informative, shows many sides of the problem, even the more obscure philosophical questions, and hence it's a nice article, I guess. Time to work on something else! Rieger 00:54, 6 February 2006 (UTC)
- Thanks for your help on this article. Let me just try to briefly answer your question. People's economic behavior is a complex question that deserves rigorous thinking. The St. Petersburg paradox is a poor example to use because the infinite casino version is spurious, and it is far from clear that the finite version proves anything at all, since it is so similar to real-world lotteries that sell quite well at a premium. --agr 15:32, 7 February 2006 (UTC)
[edit] Analysis of number of tosses
I found a flaw in my proof: the case where the coin toss continues giving heads infinitely many times. It gives an infinitesimal error. T.Maekawa 2006/04/24
- I'm sorry, Taisuke San, but your infinitesimal argument is not correct. See the discussion in the subsection on the expected number of tosses above.--agr 12:29, 24 April 2006 (UTC)
That is clearly correct. But I have another way of thinking about infinitesimals, which is that an infinitesimal surely has a quantity. T.Maekawa 2006/04/24
- If you have a new way to think about infinitesimals, you must publish it somewhere else. Wikipedia is not for original research.--agr 11:35, 25 April 2006 (UTC)
Why do you write such things? Why do you remove the correct answer? You must reconsider what you are doing about this problem. T.Maekawa 2006/04/26
- Please see Wikipedia:No original research --agr 10:30, 26 April 2006 (UTC)
Don't write graffiti! Why don't you understand the meaning of this problem and Wikipedia's policy? --T.Maekawa 2006/04/28
[edit] Accuracy dispute
T.Maekawa insists on including a section in this article titled "Analysis of number of tosses" which claims to show that the expected value of the infinite St. Petersburg game is 2. This result contradicts every written authority on the subject and is mathematically incorrect. Even if it were true, it would constitute original research that should be published elsewhere.--agr 15:43, 28 April 2006 (UTC)
It is foolish that an encyclopedia doesn't have the correct answer. The answer doesn't break the rule of no original research.
There can be various answers to this problem. I just gave the most proper one. T.Maekawa 2006/04/29
- Then please cite a source.--agr 18:58, 28 April 2006 (UTC)
I heard from a professional mathematician that the answer is correct. T.Maekawa 2006/05/01
- Sorry, that does not count as a published source for Wikipedia. But if you can put this person in contact with us, perhaps we can explain what our concern is here and this person can explain it to you. --agr 03:40, 1 May 2006 (UTC)
That is not discussion; it is just contradiction. I will not tell you. T.Maekawa 1 May 2006
[edit] You never lose though...
I'm confused as to why this is a paradox. I would rationally agree to pay ANY amount to play this game, because as far as I can tell, it's impossible to lose any money at all playing it. Is this true? Let's say I put down a duodecillion dollars as my entry fee, more money than has ever actually existed anywhere, and the sequence ends on the first toss.
I have lost nothing. I get back my entry fee, I just didn't gain anything.
So if there's a chance of winning, and even of winning big, and effectively ZERO cost to entry, why would I ever balk at putting down all the money I own? I mean, I could get lots back! And my initial stake is completely safe. It's as if I didn't put anything up at all, and this casino is giving away free money.
If somehow it's possible to lose money playing this game, could you explain it? I don't see it in the article... Fieari 18:26, 28 August 2006 (UTC)
It's not possible to lose money in this game, unless you pay an entrance fee to play. Then you stand to lose that entrance fee. The question is: how much would you pay to play?--agr 18:46, 28 August 2006 (UTC)
- Ah! My confusion was that I assumed that the entrance fee WAS the starting pot. That's what tripped me up. Fieari 19:09, 28 August 2006 (UTC)
[edit] How is this an Average?
I'm rather confused about how this is supposed to represent the average outcome. It looks to me like this is just the sum of all possible outcomes. Shouldn't the average be the sum of all the outcomes divided by the number of trials? i.e. shouldn't it look like this:
or if I've confused how sigma notation works, it might also be this:
Ziiv 09:01, 23 November 2006 (UTC)
-
- Hi Ziiv. You should look at the first line of the calculation first; it goes
- To find the average of something that is 1, 2, 3, 4, 5 or 6 with equal probabilities, you say (1 + 2 + 3 + 4 + 5 + 6)/6, or (1/6)·1 + (1/6)·2 + … + (1/6)·6 = 7/2. Here, the 1/6's can be interpreted as the probability of each outcome. The principle in the calculation of E is the same.
-
- Some math teachers might object to the result "∞", insisting that it is undefined, but for the purposes of a general encyclopedia, I think it is fine as it stands.--Niels Ø 12:47, 23 November 2006 (UTC)
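The die analogy can be checked mechanically, and the same probability-times-outcome recipe applied to the St. Petersburg payoffs produces partial sums that simply keep growing. A sketch of mine using exact fractions (the function name is hypothetical):

```python
from fractions import Fraction

# Fair die: average = sum of probability * outcome, the same answer
# as (1 + 2 + 3 + 4 + 5 + 6)/6 = 7/2.
die_average = sum(Fraction(1, 6) * k for k in range(1, 7))
print(die_average)  # -> 7/2

# St. Petersburg, same recipe: probability 1/2^n, payoff 2^n dollars.
# Every term equals 1, so the partial sums grow without bound -- hence
# the expected value is written as infinity.
def partial_ev(terms):
    return sum(Fraction(1, 2 ** n) * 2 ** n for n in range(1, terms + 1))

print(partial_ev(50))  # -> 50
```

The division by the number of outcomes that Ziiv expected is already built in: each outcome is weighted by its probability, which for equally likely outcomes is exactly 1/(number of outcomes).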