Talk:Probability
I've moved the existing talk page to Talk:Probability/Archive1, so the edit history is now with the archive page. I've copied back the most recent thread. Hope this helps, Wile E. Heresiarch 04:44, 10 Aug 2004 (UTC)
[edit] Game theory and probability
While I am by no means an expert, I have read the book by Von Neumann and Morgenstern, and I have a hard time understanding how game theory is "strictly based on probability." I'd even go so far as to say that that statement is complete rubbish - game theory may apply probability in solution concepts, but it isn't "based on probability." This is even obvious from the article on game theory itself. Zalle 14:02, 2 January 2007 (UTC)
[edit] Laws of Probability
In the section "Formalizing probability" a list of three statements is given, and it is implied that the listed statements are the "laws of probability." While the statements are true, they are not the same as the Kolmogorov axioms that are linked to in the paragraph. I'd say it's fairly misleading to imply they are. Zalle 13:50, 2 January 2007 (UTC)
[edit] Law of Large Numbers
First, good job on the entry to all....
Now, the Law of Large Numbers: though many texts butcher this deliberately to avoid tedious explanations to the average freshman, the law of large numbers is not the limit stated in this entry. Upon reflection, one can even see that the limit stated isn't well-defined. Fortunately, you present Stoppard's scene that so poignantly illustrates the issues involved with using the law of large numbers to interpret probabilities, namely that it is not a guarantee of convergence. I have an edit which I present for discussion:
... As N gets larger and larger, we expect that in our example the ratio N_H/N will get closer and closer to the probability of a single coin flip being heads. Most casual observers are even willing to define the probability Pr(H) of flipping heads as the mathematical limit, as N approaches infinity, of this sequence of ratios:
Pr(H) = lim_{N → ∞} N_H / N
In actual practice, of course, we cannot flip a coin an infinite number of times; so in general, this formula most accurately applies to situations in which we have already assigned an a priori probability to a particular outcome (in this case, our assumption that the coin was a "fair" coin). Furthermore, mathematically speaking, the above limit is not well-defined; the law of large numbers is a little more convoluted and dependent upon already having some definition of probability. The theorem states that, given Pr(H) and any arbitrarily small probability ε and difference δ, there exists some number n such that for all N > n,
Pr( |N_H/N − Pr(H)| < δ ) > 1 − ε
In other words, by saying that "the probability of heads is 1/2", this law asserts that if we flip our coin often enough, it becomes more and more likely that the number of heads over the number of total flips will become arbitrarily close to 1/2. Unfortunately, this theorem says that the ratio will probably get close to the stated probability, and provides no guarantees of convergence.
This aspect of the law of large numbers is sometimes troubling when applied to real world situations. ...
--Tlee 03:44, 13 Apr 2004 (UTC)
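A minimal numerical sketch of the behaviour described above, assuming a fair coin with Pr(H) = 1/2 and using Python's random module (variable names are illustrative only): the ratio N_H/N drifts toward 1/2 as N grows, but no finite run is guaranteed to be close.

import random

random.seed(0)  # fixed seed so the run is reproducible
n_heads = 0
checkpoints = {10, 100, 1_000, 10_000, 100_000}
for n in range(1, 100_001):
    n_heads += random.random() < 0.5  # one simulated flip of a fair coin
    if n in checkpoints:
        print(f"N = {n:6d}   N_H/N = {n_heads / n:.4f}")
# The ratio tends toward 0.5, illustrating the law of large numbers,
# but nothing forbids an individual run from straying for a long time.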
- Since N_H/N is just the sample mean of a Bernoulli random variable, the strong law of large numbers should guarantee the convergence of N_H/N to the mean, Pr(H). That is, convergence will occur almost surely, or equivalently
Pr( lim_{N → ∞} N_H/N = Pr(H) ) = 1
- Nonetheless, I agree that the way probability is "defined" in the current version of the article needs some refinement, and other than the above comment, I like what you've got, Tlee. I'd say just go ahead and make the edit!
- --Ben Cairns 23:47, 15 Apr 2004 (UTC)
[edit] Probability in mathematics
For the reasons stated above, the "definition" given in this section is wrong and misleading for any serious readers. I would try to rewrite this section soon. Are there any objections? The Infidel 18:20, 21 February 2006 (UTC)
- Actually, I dislike all my tries as soon as I write them down. But there will be a new version real soon now. The Infidel 22:31, 25 February 2006 (UTC)
[edit] Poker
This article talks about how probability applies to poker.
[edit] Accuracy of Probability
This section is very unclear, and probably (sic) wrong.
The statement that, "One in a thousand probability played a thousand times has only a one in a million chance of failure," is patently false. An event with a .001 probability, repeated one thousand times, independently, has about a 36.8% chance (0.999^1000 ≈ e^−1) of not occurring in those thousand trials. That's a far cry from one in a million. I'm not sure what the author was trying to describe by this section, but it didn't work.
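For the record, a two-line check of that figure, assuming independent trials each with success probability 0.001 (a sketch in Python, not anything taken from the article):

p_never = 0.999 ** 1000  # probability the event fails to occur in all 1000 independent trials
print(p_never)           # ~0.3677, i.e. roughly e**-1 -- nowhere near one in a million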
[edit] Good Job
This is a very nice article. I was very happy to see the distinction between 0 probability events and impossible events. :) --pippo2001 21:47, 5 Jun 2005 (UTC)
[edit] Entropy?
Would it be worthwhile to extend the physical interpretation of probability to the statistical-mechanics definition of entropy, which suggests that the highest-probability configuration of a system is the one generally observed, and that this tends to maximize the disorder in the system? (E.g. air molecules in a room: the highest-probability outcome is the one in which the number of molecules per unit volume is about equal throughout the room; another way of saying this is that it is entirely possible for all the air molecules to shift at once into one corner of the room, but the time one would expect to wait to observe this low-probability outcome is absurdly long.) --24.80.119.229 05:25, 19 August 2005 (UTC)
- Yes, I think it would be a good idea to explain the difference between how the physicist and the mathematician view this. In essence, a physicist is someone who would consider an event with a very very small probability (but larger than zero) as totally impossible, while a mathematician is someone who would consider some events with exactly zero probability as possible anyway. iNic 17:24, 1 March 2007 (UTC)
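To put a rough number on the room example, here is a sketch assuming N independent molecules, each equally likely to be in either half of the room at a given instant:

# Probability that every one of N molecules happens to sit in one chosen half of the room.
for n_molecules in (10, 100, 1_000):
    p_all_in_one_half = 0.5 ** n_molecules
    print(f"N = {n_molecules:5d}   P = {p_all_in_one_half:.3e}")
# Already ~9.3e-302 for N = 1000; a real room holds on the order of 10**25 molecules,
# for which 0.5 ** N simply underflows to 0.0 in floating point.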
[edit] Request for help
Could someone here help arbitrate on a discussion regarding probability on the Answers in Genesis talk page. I would like someone who feels they can be independent and fair to both sides. This page includes religious discussions and is somewhat controversial, but I don't think this should affect the specific discussion regarding the probability of an event. You can also leave a message on my talk page if you have any questions. Thanks Christianjb 03:26, 6 December 2005 (UTC)
[edit] Odds
The person who wrote up odds as currently displayed on the page got it wrong. Odds of 2:1 (against) describe an event with probability 1/3, not 2/3. The formula a/(a+b) needs to be replaced with b/(a+b), and this is probably best illustrated with a reference to gambling.
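A small sketch of the proposed fix, reading "a:b" as odds against the event (the usual gambling convention; the function name is just for illustration):

def odds_against_to_probability(a: float, b: float) -> float:
    """Odds of a:b against an event correspond to probability b / (a + b)."""
    return b / (a + b)

print(odds_against_to_probability(2, 1))  # 2:1 against -> 0.333... (not 2/3)
print(odds_against_to_probability(1, 1))  # even odds   -> 0.5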
[edit] certainty
Certainty redirects here, but I'm gonna steal it for an epistemology article. Will do a disambig. Spencerk 08:01, 26 March 2006 (UTC)
[edit] Theory of errors
I submit that the memoir De erroribus (1823) by C F Gauss should be mentioned. Gauss derives the error function connected with his name, from only three assumptions.
(C Lomnitz, Mexico City)
[edit] A is certain implies P(A) = 1
One of the first sentences on the article is:
"If probability is equal to 1 then that event is certain to happen and if the probability is 0 then that event will never occur."
But, later on:
"An impossible event has a probability of exactly 0, and a certain event has a probability of 1, but the converses are not always true"
The first sentence is not correct (the second one is). Perhaps some clarification is needed.
They are both true, are they not? Both say:
If P(A) = 0, that event is impossible.
If P(A) = 1, that event is certain.
If I am getting this wrong, can you please explain why?
--212.58.233.129
- In a uniform distribution on the interval [0,1], every point has the same probability, which must be 0. Do you think this means that every point is impossible? --Zundark 17:13, 16 January 2007 (UTC)
- It's been a while, but I think the uniform density on [0,1] is f(x) = 1, so the probability of each x in a uniform distribution is 1. I think the answer to 212.58.233.129's question is that an event of probability 1 does not imply that it is inevitable, and having a probability of 0 doesn't imply that it will never happen. Wake 03:09, 24 February 2007 (UTC)
"on the interval [0,1], every point has the same probability, which must be 0" Why must it be 0? Sorry I am not an expert in maths. Also I am failing to understand why this is wrong. "If probability is equal to 1 then that event is certain to happen and if the probability is 0 then that event will never occur."
And what exactly does the last part of this mean: "An impossible event has a probability of exactly 0, and a certain event has a probability of 1, but the converses are not always true"? What are the converses here? A possible event doesn't have a probability of 0, and an uncertain event isn't 1. Surely this is true. Please can someone explain this. I have basic probability knowledge and would like to understand the theories and ideas of probability in more depth.
- Loosely speaking, the probability of an event is the number of favourable outcomes divided by the number of all possible outcomes. Throw a die: the probability of getting a 3 is 1/6. But now consider something where you have an infinite number of possible outcomes: pick a random number between 3 and 4, and the probability of getting exactly pi is "1/infinity" = 0. However, it is not *impossible* to get pi, just very very unlikely. Of course this is very much a theoretical concept, since in pretty much all real-world applications you have only a finite number of possible outcomes. 193.109.51.150 23:37, 23 February 2007 (UTC)
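A quick simulation of the contrast described above, assuming Python's random module as the source of (pseudo)randomness: a fair die assigns each face probability 1/6, while a draw from the continuous interval [3, 4] never lands exactly on pi in practice, even though pi is a possible value.

import math
import random

random.seed(1)
trials = 1_000_000

threes = sum(random.randint(1, 6) == 3 for _ in range(trials))
exact_pi = sum(random.uniform(3, 4) == math.pi for _ in range(trials))

print(threes / trials)    # close to 1/6 ~= 0.1667
print(exact_pi / trials)  # 0.0 for all practical purposes: a single point carries probability 0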
[edit] Luck
There should be a mention of probability in the luck article, or is there no connection between luck and probability? Isn't luck, after all, just probability resolving in our favour?
Is there enough to say on luck? I have tried to look for a specific definition of luck, but cannot find one in the context of probability. All I can think of is that luck is an unlikely favourable outcome (although bad luck is an unlikely unfavourable outcome). I would also mention something about good luck only being described in the short term, as probabilities usually even themselves out in the long run. I mean, if you hit the 35-1 odds on the roulette table, but it was your 38th time playing and you had never hit before, that would not be lucky. However, if you hit the same number 38 times in a row on a 35-1 shot, that would be extremely lucky. 16 Jan 2007
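To attach numbers to that roulette example, here is a sketch assuming an American wheel with 38 pockets, so a single-number bet wins with probability 1/38 despite the 35-1 payout:

p_win = 1 / 38                                 # single-number bet on a 38-pocket wheel
p_at_least_once_in_38 = 1 - (1 - p_win) ** 38  # hitting your number somewhere in 38 spins
p_38_in_a_row = p_win ** 38                    # hitting it on every one of 38 consecutive spins

print(p_at_least_once_in_38)  # ~0.637 -- not especially lucky
print(p_38_in_a_row)          # ~9e-61 -- astronomically lucky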
[edit] Three types of probability
Please can an expert add some content to this page on the frequentist, propensity and subjectivist (Bayesian) interpretations of probability as this seems pretty crucial but is missing from the article. There is a starting point here. Thanks Andeggs 15:32, 23 December 2006 (UTC)
Is it quite true that subjective probability is the same as Bayesianism? I was under the impression that Bayesianism is rather a mathematical theory which is an (or perhaps the only current) interpretation of subjective probability. Ben Finn 18:43, 12 January 2007 (UTC)
[edit] New Section on Physical Chance?
I published a book on physical probability (chance) recently, and thought I would write a section for this article on chance. At present there is just a very brief section on the propensity interpretation. Or should this be a separate article?
By the way, Bayesianism is an approach to scientific reasoning based on the idea that a well-confirmed theory is one that presently has high subjective probability. The name comes from the fact that Bayes's theorem is a central tool in the calculation of the probability of a theory given the evidence.
Richardajohns 23:12, 6 March 2007 (UTC)richardajohns
-
- I think there ought to be a separate article on chance (including propensity). There's plenty to say about it. Currently I think the whole treatment of the different varieties of probability in connected articles is fairly poor, and could do with improvement & a clean up. Ben Finn 10:09, 20 March 2007 (UTC)
[edit] Moved
I moved this information from probability theory, because it seems like it would fit a lot better into this article. However, I don't understand this text well enough to merge it. Can anyone help me merge it in? MisterSheik 17:55, 28 February 2007 (UTC)
There are different ways to interpret probability. Frequentists will assign probabilities only to events that are random, i.e., random variables, that are outcomes of actual or theoretical experiments. On the other hand, Bayesians assign probabilities to propositions that are uncertain according either to subjective degrees of belief in their truth, or to logically justifiable degrees of belief in their truth. Among statisticians and philosophers, many more distinctions are drawn beyond this subjective/objective divide. See the article on interpretations of probability at the Stanford Encyclopedia of Philosophy: [1].
A Bayesian may assign a probability to the proposition that 'there was life on Mars a billion years ago', since that is uncertain, whereas a frequentist would not assign probabilities to statements at all. A frequentist is technically unable to interpret such uses of the probability concept, even though 'probability' is often used in this way in colloquial speech. Frequentists only assign probabilities to outcomes of well-defined random experiments, that is, where there is a sample space as defined above in the theory section. For another illustration of the differences see the two envelopes problem.
Situations do arise where probability theory is somewhat lacking. One method of attempting to circumvent this indeterminacy is the theory of super-probability[citation needed], in which situations are given integer values greater than 1. This is an extension of the multi-dimensional space intrinsic to M-theory and modern theoretical physics.
- What is lacking here, I think, is a good common effort to turn the Probability interpretations page into a good article. The paragraphs above should fit nicely into an improved probability interpretations page. That is in fact also true for a large part of the current article. It should be moved to the interpretations page and removed from this page. I'm not actually sure what would be appropriate to have in this article that wouldn't fit better under some other heading. Maybe the best thing to do with Probability plain and simple is to turn it into a disambiguation page? iNic 16:42, 7 March 2007 (UTC)
[edit] Aliens?
I've removed this seemingly nonsensical statement from the article:
For example, the probability that aliens will come and destroy us is x to 1 because there are so many different possibilities (nice aliens, mean but weaker aliens, non existing aliens)/
Brian Jason Drake 04:32, 11 March 2007 (UTC)
[edit] Spammed external link
This link: http://www.giacomo.lorenzoni.name/arganprobstat/
has been spammed by multiple IP addresses across articles on several European language Wikipedias. Could regular editors of this article take a look at it to see if it adds high quality, unique information to the external links section in keeping with our guidelines. Thanks -- Siobhan Hansa 14:54, 14 March 2007 (UTC)
[edit] Merge Probability theory to here
Since Probability theory has been reduced to a stub, it may be best to redirect it here, after merging any valuable information not already present here (possibly none – didn't check). --LambiamTalk 10:20, 21 March 2007 (UTC)
- I think we should do nearly the opposite thing: move most of the contents of this article to the Probability theory page and parts of it to the Probability interpretations page. This page, Probability, is I think best suited as a disambiguation page, as "probability" can mean so many more things than just the mathematical theory. What do you think? iNic 16:26, 21 March 2007 (UTC)
-
- That sounds really good to me... MisterSheik 16:40, 21 March 2007 (UTC)