Consistent estimator
In statistics, a consistent sequence of estimators is one that converges in probability to the true value of the parameter; in many cases this is referred to simply as a consistent estimator.
A sequence of estimators is said to be strongly consistent if it converges almost surely to the true value.
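This definition can be illustrated numerically. Below is a minimal Monte Carlo sketch (not part of the original article) that takes $U_n$ to be the sample mean of normally distributed data and tracks how the deviation probability $\Pr(|U_n - \mu| \geq \varepsilon)$ shrinks as $n$ grows; the values of mu, eps, the noise scale, and the trial count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, eps = 5.0, 0.1      # true mean and tolerance (illustrative choices)
trials = 10_000         # Monte Carlo replications per sample size

for n in [10, 100, 1_000, 10_000]:
    # Draw `trials` independent samples of size n; U_n is each sample's mean.
    means = rng.normal(loc=mu, scale=2.0, size=(trials, n)).mean(axis=1)
    # Empirical estimate of P(|U_n - mu| >= eps); consistency says this -> 0.
    prob = np.mean(np.abs(means - mu) >= eps)
    print(f"n={n:>6}  P(|U_n - mu| >= {eps}) ~= {prob:.4f}")
```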
Formal definition
Consider a sequence of samples $X_1, X_2, \ldots$ from a given probability distribution $f$ with an unknown parameter $\theta \in \Theta$, where $\Theta$ is the parameter space, a subset of $\mathbb{R}^m$. Let $U_n = U_n(X_1, \ldots, X_n)$ be an estimator of $\theta$ based on the first $n$ samples.
We say that the sequence of estimators $\{U_n\}$ is consistent (or that $U_n$ is a consistent estimator of $\theta$) if $U_n$ converges in probability to $\theta$ for every $\theta \in \Theta$. That is, for every $\varepsilon > 0$,
$$\lim_{n \to \infty} \Pr\bigl( |U_n - \theta| \geq \varepsilon \bigr) = 0$$
for all $\theta \in \Theta$.
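As a concrete instance of this definition (a standard textbook example, not spelled out in this article), the sample mean of i.i.d. observations with finite variance is a consistent estimator of the mean; Chebyshev's inequality gives a short argument:

```latex
% Let X_1, X_2, \ldots be i.i.d. with E[X_i] = \theta and Var(X_i) = \sigma^2 < \infty,
% and take U_n to be the sample mean:
\[
  U_n = \frac{1}{n}\sum_{i=1}^{n} X_i,
  \qquad \operatorname{E}[U_n] = \theta,
  \qquad \operatorname{Var}(U_n) = \frac{\sigma^2}{n}.
\]
% Chebyshev's inequality bounds the deviation probability for any \varepsilon > 0:
\[
  \Pr\bigl( |U_n - \theta| \geq \varepsilon \bigr)
  \leq \frac{\operatorname{Var}(U_n)}{\varepsilon^2}
  = \frac{\sigma^2}{n \varepsilon^2}
  \longrightarrow 0 \quad (n \to \infty),
\]
% so U_n converges in probability to \theta, i.e. U_n is consistent for \theta.
```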
Properties
Suppose $\{U_n\}$ is a consistent sequence of estimators of $\theta$. If $\{\alpha_n\}$ and $\{\beta_n\}$ are two convergent sequences of constants with $\alpha_n \to \alpha$ and $\beta_n \to \beta$, then the sequence $\{V_n\}$, defined by $V_n = \alpha_n U_n + \beta_n$, is a consistent sequence of estimators of $\alpha\theta + \beta$.
Proof
First, observe that by the triangle inequality
$$|V_n - (\alpha\theta + \beta)| = |\alpha_n U_n + \beta_n - \alpha\theta - \beta| \leq |\alpha_n| \, |U_n - \theta| + |\alpha_n - \alpha| \, |\theta| + |\beta_n - \beta|.$$
This implies
$$\Pr\bigl( |V_n - (\alpha\theta + \beta)| \geq \varepsilon \bigr) \leq \Pr\bigl( |\alpha_n| \, |U_n - \theta| + |\alpha_n - \alpha| \, |\theta| + |\beta_n - \beta| \geq \varepsilon \bigr).$$
As $n \to \infty$, we have $\alpha_n \to \alpha$, $\beta_n \to \beta$, and $U_n \xrightarrow{p} \theta$: the deterministic terms $|\alpha_n - \alpha| \, |\theta|$ and $|\beta_n - \beta|$ eventually fall below $\varepsilon/2$, and $|\alpha_n|$ remains bounded, so the last expression goes to 0 as $n \to \infty$ by the consistency of $\{U_n\}$. Therefore
$$\lim_{n \to \infty} \Pr\bigl( |V_n - (\alpha\theta + \beta)| \geq \varepsilon \bigr) = 0$$
for every $\varepsilon > 0$, and thus $\{V_n\}$ is a consistent sequence of estimators of $\alpha\theta + \beta$. Q.E.D.
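The property can also be checked empirically. The following sketch (illustrative, not from the article) takes $U_n$ to be the sample mean of exponential data, which is consistent for the scale parameter $\theta$, and uses the hypothetical constant sequences $\alpha_n = \alpha + 1/n$ and $\beta_n = \beta - 1/n$; the distribution and parameter values are assumptions made for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 4.0                  # true parameter (assumed value for this demo)
alpha, beta = 2.0, -1.0      # limits of the constant sequences

for n in [10, 1_000, 100_000]:
    u_n = rng.exponential(scale=theta, size=n).mean()  # U_n: consistent for theta
    a_n = alpha + 1.0 / n                              # alpha_n -> alpha
    b_n = beta - 1.0 / n                               # beta_n  -> beta
    v_n = a_n * u_n + b_n                              # V_n = alpha_n U_n + beta_n
    print(f"n={n:>7}  V_n = {v_n:.4f}  (alpha*theta + beta = {alpha * theta + beta})")
```

As $n$ grows, the printed $V_n$ should settle near $\alpha\theta + \beta = 7$.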