Law of large numbers/Proof
Given $X_1, X_2, \ldots$ an infinite sequence of i.i.d. random variables with finite expected value $E(X_1) = E(X_2) = \cdots = \mu < \infty$, we are interested in the convergence of the sample average

$\overline{X}_n = \tfrac{1}{n}\left(X_1 + \cdots + X_n\right).$
The weak law
Theorem:

$\overline{X}_n\ \xrightarrow{P}\ \mu \qquad\text{when } n \to \infty.$
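As an informal illustration (not part of the proof), the statement can be seen numerically. The following Python sketch uses an arbitrarily chosen Exponential(1) distribution, so $\mu = 1$; the seed and sample sizes are likewise arbitrary.

import numpy as np

# Illustrative only: sample averages of i.i.d. Exponential(1) draws
# (true mean mu = 1) concentrate around mu as n grows.
rng = np.random.default_rng(0)
mu = 1.0
for n in (10, 100, 10_000, 1_000_000):
    x_bar = rng.exponential(scale=mu, size=n).mean()
    print(f"n = {n:>9,}: sample average = {x_bar:.4f}  (mu = {mu})")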
Proof using Chebyshev's inequality
This proof uses the assumption of finite variance $\operatorname{Var}(X_i) = \sigma^2$ (for all $i$). The independence of the random variables implies no correlation between them, and we have that

$\operatorname{Var}(\overline{X}_n) = \frac{1}{n^2}\operatorname{Var}(X_1 + \cdots + X_n) = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n}.$
The common mean $\mu$ of the sequence is the mean of the sample average:

$E(\overline{X}_n) = \mu.$
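Both moment identities can be spot-checked by simulation. The sketch below is illustrative only; the Exponential(1) distribution (for which $\mu = \sigma^2 = 1$), the sample size and the trial count are arbitrary choices.

import numpy as np

# Spot-check E(X_bar_n) = mu and Var(X_bar_n) = sigma^2 / n by simulating
# many independent sample averages; Exponential(1) has mu = sigma^2 = 1.
rng = np.random.default_rng(1)
n, trials = 50, 200_000
x_bars = rng.exponential(scale=1.0, size=(trials, n)).mean(axis=1)
print(f"mean of the sample averages: {x_bars.mean():.4f}  (theory: 1)")
print(f"variance of the sample averages: {x_bars.var():.5f}  (theory: {1.0 / n})")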
Using Chebyshev's inequality on $\overline{X}_n$ results in

$\operatorname{P}\left(\left|\overline{X}_n - \mu\right| \geq \varepsilon\right) \leq \frac{\sigma^2}{n\varepsilon^2}.$
This may be used to obtain the following:

$\operatorname{P}\left(\left|\overline{X}_n - \mu\right| < \varepsilon\right) = 1 - \operatorname{P}\left(\left|\overline{X}_n - \mu\right| \geq \varepsilon\right) \geq 1 - \frac{\sigma^2}{n\varepsilon^2}.$
As $n$ approaches infinity, the right-hand side approaches 1, and by the definition of convergence in probability (see Convergence of random variables), we have obtained

$\overline{X}_n\ \xrightarrow{P}\ \mu \qquad\text{when } n \to \infty.$
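As a hedged numerical sketch, the empirical tail probability can be compared against the Chebyshev bound $\sigma^2/(n\varepsilon^2)$; the Exponential(1) distribution (so $\sigma^2 = 1$), $\varepsilon$ and the trial count below are arbitrary choices. Note that for small $n$ the bound exceeds 1 and is trivially satisfied.

import numpy as np

# Compare the empirical tail P(|X_bar_n - mu| >= eps) against the
# Chebyshev bound sigma^2 / (n * eps^2); Exponential(1) gives mu = sigma^2 = 1.
# For small n the bound exceeds 1 and says nothing.
rng = np.random.default_rng(2)
mu = sigma2 = 1.0
eps, trials = 0.2, 10_000
for n in (10, 100, 1000):
    x_bars = rng.exponential(scale=1.0, size=(trials, n)).mean(axis=1)
    empirical = np.mean(np.abs(x_bars - mu) >= eps)
    bound = sigma2 / (n * eps ** 2)
    print(f"n = {n:>4}: empirical tail = {empirical:.4f}, Chebyshev bound = {bound:.4f}")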
Proof using convergence of characteristic functions
By Taylor's theorem for complex functions, the characteristic function of any random variable $X$ with finite mean $\mu$ can be written as

$\varphi_X(t) = 1 + it\mu + o(t), \qquad t \to 0.$
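This expansion can be checked numerically for a concrete example (a sketch, not part of the original article). It assumes $X \sim \text{Exponential}(1)$, whose characteristic function $\varphi_X(t) = 1/(1 - it)$ and mean $\mu = 1$ are standard facts; the $o(t)$ remainder divided by $t$ should vanish as $t \to 0$.

# Check phi_X(t) = 1 + i*t*mu + o(t) numerically for X ~ Exponential(1),
# where phi_X(t) = 1/(1 - i t) and mu = 1; the remainder divided by t
# should shrink to 0 as t -> 0.
mu = 1.0
for t in (0.1, 0.01, 0.001):
    phi = 1.0 / (1.0 - 1j * t)
    remainder = phi - (1.0 + 1j * t * mu)
    print(f"t = {t}: |remainder| / t = {abs(remainder) / t:.6f}")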
All $X_1, X_2, \ldots$ have the same characteristic function, so we will simply denote it $\varphi_X$.
Among the basic properties of characteristic functions are

$\varphi_{\frac{1}{n}X}(t) = \varphi_X\!\left(\tfrac{t}{n}\right) \qquad\text{and}\qquad \varphi_{X+Y}(t) = \varphi_X(t)\,\varphi_Y(t) \quad\text{if } X \text{ and } Y \text{ are independent}.$
These rules can be used to calculate the characteristic function of $\overline{X}_n$ in terms of $\varphi_X$:

$\varphi_{\overline{X}_n}(t) = \left[\varphi_X\!\left(\frac{t}{n}\right)\right]^n = \left[1 + \frac{i\mu t}{n} + o\!\left(\frac{t}{n}\right)\right]^n\ \longrightarrow\ e^{it\mu} \qquad\text{as } n \to \infty.$
The limit $e^{it\mu}$ is the characteristic function of the constant random variable $\mu$, and hence by the Lévy continuity theorem, $\overline{X}_n$ converges in distribution to $\mu$:

$\overline{X}_n\ \xrightarrow{\mathcal{D}}\ \mu \qquad\text{for } n \to \infty.$
Since $\mu$ is a constant, convergence in distribution to $\mu$ and convergence in probability to $\mu$ are equivalent (see Convergence of random variables). This implies that

$\overline{X}_n\ \xrightarrow{P}\ \mu \qquad\text{when } n \to \infty.$
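As a closing sanity check (again a sketch, not part of the original article), the key limit $\left[\varphi_X(t/n)\right]^n \to e^{it\mu}$ from this proof can be evaluated numerically for the same assumed Exponential(1) example, at a fixed but arbitrary $t$.

import cmath

# Evaluate [phi_X(t/n)]^n for X ~ Exponential(1) (phi_X(t) = 1/(1 - i t),
# mu = 1) at a fixed t and watch it approach e^{i t mu}.
t, mu = 2.0, 1.0
limit = cmath.exp(1j * t * mu)
for n in (10, 100, 10_000):
    value = (1.0 / (1.0 - 1j * t / n)) ** n
    print(f"n = {n:>6}: [phi(t/n)]^n = {value:.6f}, |error| = {abs(value - limit):.2e}")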