Lyapunov's central limit theorem

From Wikipedia, the free encyclopedia

In probability theory, Lyapunov's central limit theorem is one of the variants of the central limit theorem. Unlike the classical central limit theorem, it does not require that the random variables in question be identically distributed; they must, however, still be independent. It is named after the Russian mathematician Aleksandr Lyapunov.

Statement of the theorem

Let X_{n}, n \in \mathbb{N}, be a sequence of independent random variables. Suppose that each X_{n} has finite expected value \mathbb{E} [X_{n}] = \mu_{n} and finite variance \mathrm{Var} [X_{n}] = \sigma_{n}^{2}. Suppose also that the third absolute central moments

r_{n}^{3} := \mathbb{E} \left[ \left| X_{n} - \mu_{n} \right|^{3} \right]

are finite and satisfy the Lyapunov condition

\lim_{N \to \infty} \frac{\left( \sum_{n = 1}^{N} r_{n}^{3} \right)^{1/3}}{\left( \sum_{n = 1}^{N} \sigma_{n}^{2} \right)^{1/2}} = 0.
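The condition can be checked numerically for a concrete sequence. The sketch below uses X_{n} ~ Uniform(0, 1) (an illustrative choice; any sequence with the required moments works the same way), for which r_{n}^{3} = \mathbb{E}|X_{n} - 1/2|^{3} = 1/32 and \sigma_{n}^{2} = 1/12, so the ratio behaves like N^{-1/6} and tends to zero:

```python
# Illustrate the Lyapunov condition for X_n ~ Uniform(0, 1).
# Here r_n^3 = E|X_n - 1/2|^3 = 1/32 and sigma_n^2 = 1/12 for every n,
# so the ratio (sum r_n^3)^(1/3) / (sum sigma_n^2)^(1/2) ~ N^(-1/6) -> 0.

def lyapunov_ratio(N, r3=1 / 32, var=1 / 12):
    """Value of the ratio in the Lyapunov condition after N terms."""
    return (N * r3) ** (1 / 3) / (N * var) ** 0.5

# The ratio shrinks toward 0 as N grows, so the condition is satisfied.
ratios = [lyapunov_ratio(N) for N in (10, 1_000, 100_000)]
print(ratios)
```
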

Let the random variable S_{N} := X_{1} + \dots + X_{N} denote the Nth partial sum of the random variables X_{n}. Then the normalised partial sum

Z_{N} := \frac{S_{N} - \sum_{n = 1}^{N} \mu_{n}}{\left( \sum_{n = 1}^{N} \sigma_{n}^{2} \right)^{1/2}}

converges in distribution to a standard normal random variable as N \to \infty.

Less formally, for "large" N, SN is approximately normally distributed with expected value

\mathbb{E} [S_{N}] \approx \sum_{n = 1}^{N} \mathbb{E} [X_{n}]

and variance

\mathrm{Var} [S_{N}] \approx \sum_{n = 1}^{N} \mathrm{Var} [X_{n}].
