Kolmogorov's inequality

From Wikipedia, the free encyclopedia

In probability theory, Kolmogorov's inequality is a so-called "maximal inequality" that gives a bound on the probability that the partial sums of a finite collection of independent random variables exceed some specified bound. The inequality is named after the Russian mathematician Andrey Kolmogorov.

Statement of the inequality

Let X_1,\dots, X_n be independent random variables defined on a common probability space, such that \operatorname{E} X_k = 0 and \operatorname{Var}\, X_k < \infty for k=1,\dots, n. Then, for each \lambda > 0,

P\left(\max_{1\leq k\leq n} | S_k|\geq\lambda\right)\leq \frac{1}{\lambda^2} \operatorname{Var}\,S_n = \frac{1}{\lambda^2}\sum_{k=1}^n \operatorname{Var}\,X_k,

where S_k = X_1 +\cdots + X_k.
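The bound can be checked empirically by simulation. The sketch below, which is illustrative and not from the article, uses X_k uniform on {−1, +1} (so E X_k = 0 and Var X_k = 1), with the walk length n, threshold lambda, and trial count chosen arbitrarily; it estimates P(max_k |S_k| ≥ λ) by Monte Carlo and compares it against Var(S_n)/λ² = n/λ².

```python
import random

# Monte Carlo check of Kolmogorov's inequality.
# X_k uniform on {-1, +1}: E X_k = 0, Var X_k = 1, so Var S_n = n.
# n, lam, and trials are illustrative choices, not from the article.
random.seed(0)
n, lam, trials = 20, 6.0, 20000

exceed = 0
for _ in range(trials):
    s, max_abs = 0.0, 0.0
    for _ in range(n):
        s += random.choice((-1.0, 1.0))   # one step X_k
        max_abs = max(max_abs, abs(s))    # track max_k |S_k|
    if max_abs >= lam:
        exceed += 1

empirical = exceed / trials
bound = n / lam**2  # (1/lambda^2) * sum_k Var X_k
print(f"P(max |S_k| >= {lam}) ~ {empirical:.4f}, bound = {bound:.4f}")
```

The empirical frequency comes out well below the bound, as expected: Kolmogorov's inequality is generally not tight, since it controls the maximum of all partial sums using only the variance of the final one.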

References

  • Billingsley, Patrick (1995). Probability and Measure. New York: John Wiley & Sons. ISBN 0-471-00710-2. (Theorem 22.4)
  • Feller, William (1968) [1950]. An Introduction to Probability Theory and Its Applications, Vol. 1 (3rd ed.). New York: John Wiley & Sons. ISBN 0-471-25708-7.

This article incorporates material from Kolmogorov's inequality on PlanetMath, which is licensed under the GFDL.
