Talk:Chebyshev's inequality


This article is within the scope of WikiProject Statistics, which collaborates to improve Wikipedia's coverage of statistics. If you would like to participate, please visit the project page.

WikiProject Mathematics
This article is within the scope of WikiProject Mathematics, which collaborates on articles related to mathematics.
Mathematics rating: Start-Class, High-priority. Field: Probability and statistics


Name

I wonder if we should rename this to "Chebyshev's inequality". First, Chebyshev seems to be the more common spelling of his name, and "inequality" seems to be the more common label for this result, his "theorem" being the one in the theory of approximations. AxelBoldt 00:29 Dec 11, 2002 (UTC)

Certainly fine with me. I'm hardly an expert in statistics, or in transliteration. --Ryguasu 00:51 Dec 11, 2002 (UTC)

I'm for that: Axel's right about the spelling and the name. --User:LenBudney 01:58 Dec 11, 2002 (UTC)

Tchebysheff is more like a German spelling. The correct and the most common spelling in English, as Axel said, for this great Russian mathematician is Pafnuty Lvovich Chebyshev. But theorem is quite okay for me. Best regards. --XJamRastafire 01:23 Dec 17, 2002 (UTC)

I really do not want to be a quibbler, but Chebyshev's name in his article is also incorrect. If we write Pafnuti instead of Pafnuty, it is the same mistake in English as writing Yzaak Nevtonn or Jull Bryner :-). You can check the most common English spelling of Chebyshev's name at: Pafnuty Lvovich Chebyshev. The original name is "Pafnutij", but I guess "ij" in transcription becomes "y" and not "i". I should also comment on the very common practice here of Russian names lacking the "otchestvo", which is also incorrect. As I know, Russians use full names, especially for their famous people. But I am repeating myself over and over again. And finally, as I have already said somewhere around here, Donald Knuth maintains a list of all the Russian names he has used as links and references in his books over the years. I believe we should follow his practice wher'e'er possible. --XJamRastafire 13:25 Dec 17, 2002 (UTC)

Other inequalities

Let Χ be the mathematician in question until you sort out his name =p Then there are at least two other inequality-type theorems of Χ's.

One is also known as Bertrand's postulate.

Another was a step toward the prime number theorem: if π(n) is the number of primes not exceeding n, then 0.92·n/ln(n) < π(n) < 1.11·n/ln(n) [Yaglom and Yaglom, problem 170; vaguely mentioned at MathWorld]. 142.177.19.171 19:12, 13 Sep 2004 (UTC)
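These constants hold only for sufficiently large n (the ratio π(n)·ln(n)/n exceeds 1.11 for some smaller n), so a minimal sketch checking the bound at n = 10^6 with a simple sieve (an illustrative assumption of mine, not part of the discussion above):

```python
import math

def prime_count(n):
    """Count primes <= n with a basic Sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"  # 0 and 1 are not prime
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            # Mark all multiples of p starting at p*p as composite
            sieve[p * p :: p] = bytearray(len(sieve[p * p :: p]))
    return sum(sieve)

n = 10 ** 6
pi_n = prime_count(n)      # pi(10**6) = 78498
x = n / math.log(n)
print(0.92 * x < pi_n < 1.11 * x)  # → True
```

At this n the ratio π(n)/(n/ln n) is about 1.084, comfortably inside the Chebyshev-type window quoted above.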

Introductory Paragraph

I felt that the introductory paragraph was excessively dense and technical:

Chebyshev's inequality (also known as Tchebysheff's inequality, Chebyshev's theorem, or the Bienaymé-Chebyshev inequality) is a theorem of probability theory named in honor of Pafnuty Chebyshev. Chebyshev's inequality gives a lower bound for the probability that a value of a random variable of any distribution with finite variance lies within a certain distance from the variable's mean; equivalently, the theorem provides an upper bound for the probability that values lie outside the same distance from the mean.

It's quite correct, but I was afraid it would be difficult for a general audience to understand. This seemed a shame, since the importance of the theorem can be readily understood by a general audience. So I inserted a nontechnical paraphrase:

Chebyshev's inequality (also known as Tchebysheff's inequality, Chebyshev's theorem, or the Bienaymé-Chebyshev inequality) is a theorem of probability theory. Simply put, it states that in any data sample, nearly all the values are close to the mean value, and provides a quantitative description of "nearly all" and "close to".
The theorem gives a lower bound for the probability that a value of a random variable of any distribution with finite variance lies within a certain distance from the variable's mean; equivalently, the theorem provides an upper bound for the probability that values lie outside the same distance from the mean.

I thought that people would be more likely to appreciate the meaning of the second paragraph if the idea was set up in less technical language first. -- Dominus 14:41, 11 August 2005 (UTC)
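To make the bound being discussed concrete: the inequality states P(|X − μ| ≥ kσ) ≤ 1/k². A minimal Python sketch (my own illustration, using a skewed exponential sample; the distribution choice is an assumption) comparing the empirical tail frequency with the 1/k² bound:

```python
import random
import statistics

random.seed(1)
# A deliberately skewed, finite-variance sample (exponential distribution)
sample = [random.expovariate(1.0) for _ in range(100_000)]
mu = statistics.fmean(sample)
sigma = statistics.pstdev(sample)

for k in (2, 3, 4):
    # Fraction of values at least k standard deviations from the mean
    tail = sum(abs(x - mu) >= k * sigma for x in sample) / len(sample)
    # Chebyshev: P(|X - mu| >= k*sigma) <= 1/k**2 for any finite-variance X
    print(f"k={k}: empirical tail {tail:.4f}, bound {1 / k**2:.4f}")
```

The bound is loose for most concrete distributions (the exponential tails fall far below 1/k²), which is exactly why it works for *any* distribution with finite variance.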

I found the current introductory paragraph very confusing and had to use a book to find out what this inequality really states. Clearly, there are cases where nearly all of the probability is concentrated far away from the mean, but those distributions have a high variance. I suggest the previous introduction (see above) be restored.

Measure theoretic statement

From the point of view introduced in the measure theoretic section, the most natural Chebyshev inequality is the one obtained for g(t) = t. This simply tells you that a (say, positive) random variable with a finite expected value cannot be large with high probability: quite understandable by non-math people (at least, more so than the second moment). So, it would be nice to mention this basic Chebyshev inequality. Wouldn't it? gala.martin December 9.

Except that that is called the Markov inequality. The Chebyshev inequality is about the second moment about the mean. There is a Gauss inequality for the second moment about the mode. —The preceding unsigned comment was added by 81.151.197.86 (talk) 11:37, 14 April 2007 (UTC).
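The g(t) = t case mentioned above is Markov's inequality: P(X ≥ a) ≤ E[X]/a for a nonnegative X. A quick sketch of my own (the exponential sample is an illustrative assumption) checking it empirically:

```python
import random

random.seed(0)
# Nonnegative random variable: exponential with rate 0.5, so E[X] ~ 2
xs = [random.expovariate(0.5) for _ in range(100_000)]
mean = sum(xs) / len(xs)

a = 10.0
# Empirical frequency of the event X >= a
tail = sum(x >= a for x in xs) / len(xs)
# Markov's inequality: P(X >= a) <= E[X] / a, needing only a first moment
print(f"empirical tail {tail:.4f} <= Markov bound {mean / a:.4f}")
```

Note that only the first moment is needed here, whereas Chebyshev's inequality (Markov applied to (X − μ)²) requires a finite second moment.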

What's the difference between Chebyshev's inequality and the Chebyshev sum inequality?

eh? —Preceding unsigned comment added by 68.161.204.136 (talk) 18:04, 17 December 2007 (UTC)

Who first proved it?

I removed the statement that Chebyshev was the first person to prove the inequality. Bienaymé proved it (as I recall) some 20 years earlier. For those who care, here is a little more history about the inequality, and about who the inequality "should" be named after. Markov, who was Chebyshev's student, wrote a letter saying that Chebyshev deserved the credit because he understood its purpose, which was to lay the groundwork for stronger probabilistic inequalities (which culminated in the central limit theorem).

Markov also wrote that Bienaymé had just proved the bound to refute something or other that Cauchy had claimed. --AS314 (talk) 15:46, 22 February 2008 (UTC)

Merger proposal

The page An inequality on location and scale parameters refers to a simple corollary of the one-sided version of Chebyshev's inequality and has a misleading (or at least non-standard) name. Searching around I can't find a "real" name for the theorem proven on that page; rather, everyone simply proves it as a natural result of Chebyshev's inequality. Romanempire (talk) 12:09, 11 June 2008 (UTC)

Keep as separate article. The inequality is about different quantities than those involved in Chebyshev's inequality: the median, as opposed to probabilities of typical values. By "everyone simply proves it as a natural result of Chebyshev's inequality", I guess you mean everyone except whoever put the article up. It is good to have a direct proof. Are there any elaborations of it that would allow tighter bounds to be formulated? Melcombe (talk) 13:24, 11 June 2008 (UTC)
...and there isn't a proof given of the "One-sided Chebyshev inequality". Melcombe (talk) 13:37, 11 June 2008 (UTC)
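Assuming the corollary under discussion is the standard one (that the mean and median of any finite-variance distribution differ by at most one standard deviation, which follows from the one-sided inequality), here is a quick illustrative check of mine on a skewed sample:

```python
import random
import statistics

random.seed(2)
# Skewed sample: exponential(1), where mean = 1 and median = ln 2
sample = [random.expovariate(1.0) for _ in range(100_000)]
mu = statistics.fmean(sample)
med = statistics.median(sample)
sigma = statistics.pstdev(sample)

# Corollary of the one-sided Chebyshev inequality: |mean - median| <= sigma
print(abs(mu - med) <= sigma)  # → True
```

For the exponential distribution the gap is 1 − ln 2 ≈ 0.31 standard deviations, well inside the bound, and no asymmetry can push it past one.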