Talk:Probability


WikiProject Mathematics: This article is within the scope of WikiProject Mathematics.
Mathematics grading: B-Class, Top Importance. Field: Probability and statistics.
Needs a less technical introduction to "Probability in mathematics". Links and brief explanations of the major areas would be good. Tompw

I've moved the existing talk page to Talk:Probability/Archive1, so the edit history is now with the archive page. I've copied back the most recent thread. Hope this helps, Wile E. Heresiarch 04:44, 10 Aug 2004 (UTC)


Law of Large Numbers

First, good job on the entry to all....

Now, the Law of Large Numbers: though many texts deliberately butcher this to avoid tedious explanations for the average freshman, the law of large numbers is not the limit stated in this entry. Upon reflection, one can even see that the limit as stated isn't well-defined. Fortunately, you present Stoppard's scene, which so poignantly illustrates the issue with using the law of large numbers to interpret probabilities: namely, that it is not a guarantee of convergence. I have an edit which I present for discussion:

... As N gets larger and larger, we expect that in our example the ratio N_H/N will get closer and closer to the probability of a single coin flip being heads. Most casual observers are even willing to define the probability Pr(H) of flipping heads as the mathematical limit, as N approaches infinity, of this sequence of ratios:

\Pr(H) = \lim_{N \to \infty}{N_H \over N}

In actual practice, of course, we cannot flip a coin an infinite number of times; so in general, this formula most accurately applies to situations in which we have already assigned an a priori probability to a particular outcome (in this case, our assumption that the coin was a "fair" coin). Furthermore, mathematically speaking, the above limit is not well-defined; the law of large numbers is a little more convoluted and dependent upon already having some definition of probability. The theorem states that, given Pr(H) and any arbitrarily small probability ε and difference δ, there exists some number n such that for all N > n,

\Pr\left( \left|\Pr(H) - {N_H \over N}\right| > \delta \right) < \epsilon

In other words, by saying that "the probability of heads is 1/2", this law asserts that if we flip our coin often enough, it becomes more and more likely that the number of heads over the number of total flips will become arbitrarily close to 1/2. Unfortunately, this theorem only says that the ratio will probably get close to the stated probability; it provides no guarantee of convergence.

This aspect of the law of large numbers is sometimes troubling when applied to real world situations. ...

--Tlee 03:44, 13 Apr 2004 (UTC)


Since N_H/N is just the sample mean of a Bernoulli random variable, the strong law of large numbers should guarantee the convergence of N_H/N to the mean, Pr(H). That is, convergence will occur almost surely, or equivalently
\Pr\left( \lim_{N\rightarrow\infty} {N_H \over N} = \Pr(H) \right) = 1.
Nonetheless, I agree that the way probability is "defined" in the current version of the article needs some refinement, and other than the above comment, I like what you've got, Tlee. I'd say just go ahead and make the edit!
--Ben Cairns 23:47, 15 Apr 2004 (UTC)
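
A quick numerical check of the convergence being discussed might help whoever reworks that passage. This is only a sketch in Python (assuming NumPy is available; the seed and the sample sizes are arbitrary), not text proposed for the article:

    import numpy as np

    rng = np.random.default_rng(0)  # fixed seed so the run is reproducible

    # Simulate N flips of a fair coin and print the proportion of heads,
    # i.e. N_H / N, which the law of large numbers says should settle near 0.5.
    for N in (10, 1_000, 100_000, 1_000_000):
        flips = rng.integers(0, 2, size=N)  # 1 = heads, 0 = tails
        print(f"N = {N:>9}: N_H/N = {flips.mean():.5f}")

One would expect the printed ratios to settle near 0.5 for the larger N, in line with the almost-sure convergence mentioned above, though of course no finite run demonstrates the limit itself.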

Probability in mathematics

For the reasons stated above, the "definition" given in this section is wrong and misleading for any serious readers. I would try to rewrite this section soon. Are there any objections? The Infidel 18:20, 21 February 2006 (UTC)

Actually, I dislike all my tries as soon as I write them down. But there will be a new version real soon now. The Infidel 22:31, 25 February 2006 (UTC)

Poker

This article talks about how probability applies to poker.

Accuracy of Probability

This section is very unclear, and probably (sic) wrong.

The statement that "One in a thousand probability played a thousand times has only a one in a million chance of failure" is patently false. An event with a 0.001 probability, repeated independently one thousand times, has about a 36.7% chance of not occurring at all in those thousand trials. That's a far cry from one in a million. I'm not sure what the author was trying to describe in this section, but it didn't work.
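
For reference, here is the computation behind that 36.7% figure; the only assumption is that the thousand trials are independent:

\Pr(\text{no occurrence in } 1000 \text{ trials}) = (1 - 0.001)^{1000} \approx e^{-1} \approx 0.3677

So there is roughly a one-in-three chance of the event never happening, not one in a million.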

Good Job

This is a very nice article. I was very happy to see the distinction between 0 probability events and impossible events. :) --pippo2001 21:47, 5 Jun 2005 (UTC)

Entropy?

Would it be worthwhile to extend the physical interpretation of probability to the statistical-mechanics definition of entropy, which suggests that, in general, the highest-probability configuration of (insert whatever) is the one generally achieved, and that this tends to maximize the disorder in the system? (E.g., air molecules in a room: the highest-probability outcome is the one in which the number of molecules per unit volume is about equal throughout. Another way of saying this is that it is entirely possible for all the air molecules to shift at once into one corner of the room, but the time one would have to wait to observe such a low-probability outcome is longer than some absurdly high number.) --24.80.119.229 05:25, 19 August 2005 (UTC)

Request for help

Could someone here help arbitrate a discussion regarding probability on the Answers in Genesis talk page? I would like someone who feels they can be independent and fair to both sides. That page includes religious discussions and is somewhat controversial, but I don't think this should affect the specific discussion regarding the probability of an event. You can also leave a message on my talk page if you have any questions. Thanks, Christianjb 03:26, 6 December 2005 (UTC)

Odds

The person who wrote up odds as currently displayed on the page got it wrong. A 2:1 event is not an event with probability 2/3; it is a 1/3 probability event. The formula a/(a+b) needs to be replaced with b/(a+b) and is probably best illustrated with a reference to gambling.
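
For concreteness, assuming the usual gambling convention that odds of a:b are quoted against the event (which appears to be the convention the comment above has in mind):

\text{odds of } a:b \text{ against} \quad\Rightarrow\quad \Pr = \frac{b}{a+b}, \qquad \text{e.g. } 2:1 \;\Rightarrow\; \Pr = \frac{1}{2+1} = \frac{1}{3}

Odds quoted on (in favour of) the event would give a/(a+b) instead, which is probably the source of the confusion.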

certainty

Certainty redirects here, but I'm gonna steal it for an epistemology article. Will do a disambig. Spencerk 08:01, 26 March 2006 (UTC)

Theory of errors

I submit that the memoir De erroribus (1823) by C. F. Gauss should be mentioned. Gauss derives the error function connected with his name from only three assumptions.

(C Lomnitz, Mexico City)

A is certain implies P(A) = 1

One of the first sentences on the article is:

"If probability is equal to 1 then that event is certain to happen and if the probability is 0 then that event will never occur."

But, later on:

"An impossible event has a probability of exactly 0, and a certain event has a probability of 1, but the converses are not always true"

The first sentence is not correct (the second one is). Perhaps some clarification is needed.
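
In case it helps whoever clarifies this, a standard example of why the converse fails (any continuous distribution works; the uniform one is just convenient):

X \sim \mathrm{Uniform}[0,1] \quad\Rightarrow\quad \Pr\left(X = \tfrac{1}{2}\right) = 0, \text{ although the event } \left\{ X = \tfrac{1}{2} \right\} \text{ is not impossible}

So probability 0 does not imply impossibility, and, by taking complements, probability 1 does not imply certainty.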

Luck

There should be a mention of probability in the luck article, or is there no connection between luck and probability? Isn't luck, after all, probability resulting in our favour?