Lévy's convergence theorem


In probability theory, Lévy's convergence theorem (sometimes also called Lévy's dominated convergence theorem) states that for a sequence of random variables (X_n)^\infty_{n=1} such that

  • X_n\xrightarrow{a.s.} X, and
  • |X_n| < Y, where Y is some random variable with \mathrm{E}Y < \infty,

it follows that

  • \mathrm{E}|X| < \infty,
  • \mathrm{E}X_n \to \mathrm{E}X, and
  • \mathrm{E}|X - X_n| \to 0.
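A standard illustration (not part of the original statement, but showing why an integrable dominating variable is needed): on the probability space ([0,1], \mathcal{B}([0,1]), \lambda) with Lebesgue measure \lambda, take

  X_n = n\,\mathbf{1}_{(0,\,1/n)}.

Then X_n \xrightarrow{a.s.} 0, yet \mathrm{E}X_n = 1 for every n, so \mathrm{E}X_n \not\to \mathrm{E}X = 0. Consistently with the theorem, \mathrm{E}[\sup_n X_n] = \infty, so no dominating Y with \mathrm{E}Y < \infty exists.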

In essence, the theorem gives a sufficient condition for almost sure convergence to imply L^1-convergence. The domination condition |X_n| < Y with \mathrm{E}Y < \infty can be relaxed: it suffices that the sequence (X_n)^\infty_{n=1} be uniformly integrable.
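For completeness (the definition is standard but not spelled out above), uniform integrability of (X_n)^\infty_{n=1} means

  \lim_{K\to\infty} \sup_{n} \mathrm{E}\left[\,|X_n|\,\mathbf{1}_{\{|X_n|>K\}}\right] = 0,

which holds in particular whenever |X_n| < Y for a single random variable Y with \mathrm{E}Y < \infty.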

The theorem is simply a special case of Lebesgue's dominated convergence theorem in measure theory.
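To see this (a routine observation added here for clarity), regard the random variables as measurable functions on the underlying probability space (\Omega, \mathcal{F}, P) and write \mathrm{E}X = \int_\Omega X\,\mathrm{d}P; the statement above is then Lebesgue's dominated convergence theorem applied to the finite measure P.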

References

  • Shiryaev, A. N. (1995). Probability (2nd ed.). Springer-Verlag, New York. pp. 187–188. ISBN 978-0387945491.