Method of moments (probability theory)
See method of moments (statistics) for an account of a method of parameter estimation.
In probability theory, the method of moments is a way of proving convergence in distribution by proving convergence of a sequence of moment sequences. Suppose X is a random variable and that all of the moments

\operatorname{E}(X^k), \quad k = 1, 2, \ldots,

exist. Further suppose that the probability distribution of X is completely determined by its moments, i.e., there is no other probability distribution with the same sequence of moments (cf. the problem of moments). If X_1, X_2, \ldots is a sequence of random variables satisfying

\lim_{n \to \infty} \operatorname{E}(X_n^k) = \operatorname{E}(X^k)

for all values of k, then the sequence {X_n} converges to X in distribution.
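As a numerical sketch of the statement above (not part of the original article; the choice of Binomial/Poisson distributions and of SciPy's moment method is purely illustrative), the following snippet compares the first few raw moments of Binomial(n, λ/n), which converges in distribution to Poisson(λ), with the moments of the Poisson limit:

```python
import numpy as np
from scipy import stats

# Illustration (assumed example, not from the article): Binomial(n, lam/n)
# converges in distribution to Poisson(lam), and its raw moments E[X_n^k]
# converge to the Poisson moments E[X^k], consistent with the method of moments.
lam = 2.0
poisson = stats.poisson(lam)

for n in (10, 100, 1000):
    binom = stats.binom(n, lam / n)
    binom_moments = [binom.moment(k) for k in range(1, 5)]      # E[X_n^k], k = 1..4
    poisson_moments = [poisson.moment(k) for k in range(1, 5)]  # E[X^k],   k = 1..4
    print(n, np.round(binom_moments, 3), np.round(poisson_moments, 3))
```

As n grows, the binomial moments approach the Poisson moments (2, 6, 22, 94 for λ = 2), mirroring the convergence in distribution.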
The method of moments is especially useful for proving limit theorems for random matrices with independent entries, such as Wigner's semicircle law.
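A hedged Monte Carlo sketch of that application follows (the Gaussian entries, the 1/√n normalization, and the matrix sizes are illustrative assumptions, not prescribed by the article). The k-th moment of the empirical spectral distribution of a normalized Wigner matrix, (1/n) tr((W/√n)^k), approaches the k-th moment of the semicircle law on [−2, 2], which is the Catalan number C_{k/2} for even k and 0 for odd k:

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)

def wigner_matrix(n):
    """Symmetric matrix with i.i.d. standard Gaussian entries (an assumed,
    convenient choice; the semicircle law only requires independent entries
    with mean 0 and variance 1 off the diagonal)."""
    a = rng.standard_normal((n, n))
    upper = np.triu(a, 1)
    return upper + upper.T + np.diag(rng.standard_normal(n))

def empirical_moment(n, k, trials=20):
    """Monte Carlo estimate of E[(1/n) tr((W / sqrt(n))^k)]."""
    total = 0.0
    for _ in range(trials):
        w = wigner_matrix(n) / np.sqrt(n)
        eigenvalues = np.linalg.eigvalsh(w)
        total += np.mean(eigenvalues ** k)
    return total / trials

def semicircle_moment(k):
    """k-th moment of the semicircle law on [-2, 2]:
    the Catalan number C_{k/2} for even k, and 0 for odd k."""
    if k % 2:
        return 0.0
    m = k // 2
    return comb(2 * m, m) / (m + 1)

for k in range(1, 7):
    print(k, round(empirical_moment(300, k), 3), semicircle_moment(k))
```

For even k the empirical values settle near 1, 2, 5, ... (the Catalan numbers), while the odd moments stay near 0, which is the moment comparison underlying the method-of-moments proof of the semicircle law.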