Minimum-variance unbiased estimator
From Wikipedia, the free encyclopedia
In statistics a uniformly minimum-variance unbiased estimator or minimum-variance unbiased estimator (often abbreviated as UMVU or MVUE) is an unbiased estimator that has lower variance than any other unbiased estimator for all possible values of the parameter. Consider estimation of g(θ) based on data X_1, X_2, …, X_n i.i.d. from some family of densities p_θ, θ ∈ Ω, where Ω is the parameter space. An unbiased estimator δ(X_1, X_2, …, X_n) of g(θ) is UMVU if, for all θ ∈ Ω,

\operatorname{var}(\delta(X_1, \ldots, X_n)) \le \operatorname{var}(\tilde{\delta}(X_1, \ldots, X_n))

for any other unbiased estimator \tilde{\delta}(X_1, \ldots, X_n).
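For instance, suppose X_1, …, X_n are i.i.d. N(θ, 1). Both the sample mean and the first observation alone are unbiased for θ, yet their variances are 1/n and 1 respectively. A minimal simulation sketch of this comparison (the normal model and variable names are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 100_000          # true mean, sample size, replications

x = rng.normal(theta, 1.0, size=(reps, n))
delta_mean = x.mean(axis=1)                # sample mean: unbiased, variance 1/n
delta_first = x[:, 0]                      # first observation: unbiased, variance 1

print(delta_mean.mean(), delta_first.mean())   # both close to theta = 2.0
print(delta_mean.var(), delta_first.var())     # roughly 0.1 versus 1.0
```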
If an unbiased estimator of g(θ) exists, then one can prove there is an essentially unique MVUE. Using the Rao–Blackwell theorem one can also prove that determining the MVUE is simply a matter of finding a complete sufficient statistic for the family and conditioning any unbiased estimator on it. Put formally, suppose δ(X_1, X_2, …, X_n) is unbiased for g(θ), and that T is a complete sufficient statistic for the family of densities. Then

\eta(X_1, X_2, \ldots, X_n) = \operatorname{E}(\delta(X_1, X_2, \ldots, X_n) \mid T)

is the MVUE for g(θ).
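As a sketch of this recipe in an assumed Bernoulli model (illustrative, not the article's example): with X_1, …, X_n i.i.d. Bernoulli(p), δ = X_1 is unbiased for p, T = X_1 + … + X_n is complete sufficient, and by symmetry E(X_1 | T) = T/n, the sample mean. A simulation shows that conditioning preserves unbiasedness while shrinking the variance:

```python
import numpy as np

rng = np.random.default_rng(1)
p, n, reps = 0.3, 20, 100_000

x = rng.binomial(1, p, size=(reps, n))
delta = x[:, 0].astype(float)    # crude unbiased estimator: the first observation
t = x.sum(axis=1)                # complete sufficient statistic T = sum of the X_i
eta = t / n                      # E(delta | T) = T/n, the Rao-Blackwellized estimator

print(delta.mean(), eta.mean())  # both close to p = 0.3 (still unbiased)
print(delta.var(), eta.var())    # p(1-p) = 0.21 versus p(1-p)/n = 0.0105
```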
Estimator selection
An efficient estimator need not exist, but if it does and is unbiased, it is the MVUE. Since the mean squared error (MSE) of an estimator δ is

\operatorname{MSE}(\delta) = \operatorname{var}(\delta) + [\operatorname{bias}(\delta)]^2,

the MVUE minimizes MSE among unbiased estimators. In some cases biased estimators have lower MSE because they can have a smaller variance than any unbiased estimator; see estimator bias.
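A standard illustration, sketched here under an assumed normal model: the sample variance with divisor n is biased low, yet for normal data it has a smaller MSE than the unbiased divisor-(n − 1) version:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma2, n, reps = 1.0, 10, 200_000

x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
s2_unbiased = x.var(axis=1, ddof=1)   # divisor n-1: unbiased for sigma^2
s2_biased = x.var(axis=1, ddof=0)     # divisor n: biased low, but lower variance

def mse(est):
    return np.mean((est - sigma2) ** 2)   # MSE = variance + bias^2

print(mse(s2_unbiased))   # about 2*sigma2**2/(n-1) = 0.222
print(mse(s2_biased))     # about (2*n-1)*sigma2**2/n**2 = 0.190
```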
Example
Consider the data to be a single observation X from an absolutely continuous distribution on the real line with density

p_\theta(x) = \frac{\theta e^{-x}}{(1 + e^{-x})^{\theta + 1}}, \qquad \theta > 0,

and suppose we wish to find the UMVU estimator of

g(\theta) = \frac{1}{\theta^2}.

First we recognize that the density can be written as

\frac{e^{-x}}{1 + e^{-x}} \exp\left(-\theta \log(1 + e^{-x}) + \log\theta\right),

which is an exponential family with sufficient statistic T = log(1 + e^{−X}). In fact this is a full-rank exponential family, and therefore T is complete sufficient. See exponential family for a derivation which shows

\operatorname{E}(T) = \frac{1}{\theta}, \qquad \operatorname{var}(T) = \frac{1}{\theta^2}.

Therefore

\operatorname{E}(T^2) = \operatorname{var}(T) + [\operatorname{E}(T)]^2 = \frac{2}{\theta^2}.

Clearly δ(X) = T^2/2 is unbiased, thus the UMVU estimator is

\eta(X) = \operatorname{E}(\delta(X) \mid T) = \operatorname{E}\!\left(\frac{T^2}{2} \;\middle|\; T\right) = \frac{T^2}{2} = \frac{\left(\log(1 + e^{-X})\right)^2}{2}.
This example illustrates that an unbiased function of the complete sufficient statistic will be UMVU.
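The result can be checked numerically. The sketch below relies on the observation that T = log(1 + e^{−X}) is Exponential with rate θ (which follows from the exponential-family form above), so X can be drawn by sampling T and inverting; this sampling route is an assumption of the sketch, not part of the derivation:

```python
import numpy as np

rng = np.random.default_rng(3)
theta, reps = 2.0, 500_000

# Sample T ~ Exponential(rate=theta), then invert T = log(1 + e^{-X}),
# i.e. X = -log(e^T - 1), to obtain draws from p_theta.
t = rng.exponential(scale=1.0 / theta, size=reps)
x = -np.log(np.expm1(t))

eta = 0.5 * np.log1p(np.exp(-x)) ** 2   # UMVU estimator T^2 / 2 recomputed from X
print(eta.mean(), 1.0 / theta**2)       # simulated mean versus g(theta) = 0.25
```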
References
- Keener, Robert W. (2006). Statistical Theory: Notes for a Course in Theoretical Statistics. Springer. pp. 47–48, 57–58.