Minimum-variance unbiased estimator

In statistics, a uniformly minimum-variance unbiased estimator or minimum-variance unbiased estimator (often abbreviated as UMVU or MVUE) is an unbiased estimator whose variance is no larger than that of any other unbiased estimator, for every value of the parameter. Consider estimation of g(θ) based on data X_1, X_2, \ldots, X_n i.i.d. from some family of densities p_\theta, \theta \in \Omega, where Ω is the parameter space. An unbiased estimator \delta(X_1, X_2, \ldots, X_n) of g(θ) is UMVU if, for all \theta \in \Omega,

 \mathrm{var}(\delta(X_1, X_2, \ldots, X_n)) \leq \mathrm{var}(\tilde{\delta}(X_1, X_2, \ldots, X_n))

for any other unbiased estimator  \tilde{\delta}.

If an unbiased estimator of g(θ) exists, then one can prove there is an essentially unique MVU estimator. Using the Rao–Blackwell theorem one can also prove that determining the MVU estimator is simply a matter of finding a complete sufficient statistic for the family p_\theta, \theta \in \Omega and conditioning any unbiased estimator on it (this is the content of the Lehmann–Scheffé theorem). Put formally, suppose \delta(X_1, X_2, \ldots, X_n) is unbiased for g(θ), and that T is a complete sufficient statistic for the family of densities. Then

 \eta(X_1, X_2, \ldots, X_n) = \mathrm{E}(\delta(X_1, X_2, \ldots, X_n) \mid T)

is the MVU estimator of g(θ).
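
To make the conditioning step concrete, here is a minimal simulation sketch of a standard textbook case (not this article's example; the parameter values are illustrative): for X_1, \ldots, X_n i.i.d. Poisson(θ) and target g(θ) = e^{-θ}, the naive unbiased estimator 1{X_1 = 0}, conditioned on the complete sufficient statistic T = \sum X_i, becomes ((n-1)/n)^T, since X_1 given T = t is Binomial(t, 1/n).

 import numpy as np

 rng = np.random.default_rng(0)
 theta, n, reps = 2.0, 10, 100_000

 X = rng.poisson(theta, size=(reps, n))
 T = X.sum(axis=1)                      # complete sufficient statistic sum(X_i)

 # Naive unbiased estimator of g(theta) = exp(-theta) = P(X_1 = 0)
 delta = (X[:, 0] == 0).astype(float)
 # Its conditional expectation given T (the Rao-Blackwellized / UMVU estimator)
 eta = ((n - 1) / n) ** T

 print("target exp(-theta):", np.exp(-theta))
 print("means :", delta.mean(), eta.mean())   # both close to exp(-theta)
 print("vars  :", delta.var(), eta.var())     # eta's variance is much smaller

Both sample means agree with e^{-θ}, while the conditioned estimator's variance is markedly smaller, as the Rao–Blackwell theorem guarantees.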

Estimator selection

An efficient estimator need not exist, but if it does, it is the MVUE. Since the mean squared error (MSE) of an estimator δ is

 \mathrm{MSE}(\delta) = \mathrm{var}(\delta) + \mathrm{bias}(\delta)^{2}

the MVUE minimizes MSE among unbiased estimators. In some cases a biased estimator can achieve lower MSE; see estimator bias.
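
As a quick numerical check of this decomposition (a sketch with illustrative values): for i.i.d. normal data, the sample variance with divisor n is biased for σ² yet attains a lower MSE than the unbiased divisor-(n-1) version, and in both cases the simulated MSE matches var + bias².

 import numpy as np

 rng = np.random.default_rng(1)
 sigma2, n, reps = 1.0, 10, 200_000

 X = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
 for ddof in (1, 0):                    # divisor n-1 (unbiased) vs n (biased)
     est = X.var(axis=1, ddof=ddof)
     bias = est.mean() - sigma2
     mse = np.mean((est - sigma2) ** 2)
     # MSE decomposes (up to simulation noise) as var + bias^2
     print(f"ddof={ddof}: var+bias^2={est.var() + bias**2:.5f}  mse={mse:.5f}")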

Example

Consider the data to be a single observation from an absolutely continuous distribution on \mathbb{R} with density

 p_\theta(x) = \frac{ \theta e^{-x} }{(1 + e^{-x})^{\theta + 1} }

and we wish to find the UMVU estimator of

 g(\theta) = \frac{1}{\theta^{2}}.

First, we recognize that the density can be written as

 \frac{ e^{-x} } { 1 + e^{-x} } \exp( -\theta \log(1 + e^{-x}) + \log(\theta))

which is an exponential family with sufficient statistic T = \log(1 + e^{-X}). In fact this is a full-rank exponential family, and therefore T is complete sufficient. See exponential family for a derivation which shows

 \mathrm{E}(T) = \frac{1}{\theta}, \qquad \mathrm{var}(T) = \frac{1}{\theta^{2}}
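
These values can also be checked directly in this example. The distribution function of X is F_\theta(x) = (1 + e^{-x})^{-\theta} (differentiating it recovers p_\theta), so for t > 0

 P(T \le t) = P\left( \log(1 + e^{-X}) \le t \right) = P\left( X \ge -\log(e^{t} - 1) \right) = 1 - F_\theta\left( -\log(e^{t} - 1) \right) = 1 - e^{-\theta t},

i.e. T is exponentially distributed with rate θ, which indeed has mean 1/θ and variance 1/θ².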

Therefore, using \mathrm{E}(T^{2}) = \mathrm{var}(T) + (\mathrm{E}(T))^{2},

 \mathrm{E}(T^{2}) = \frac{2}{\theta^{2}}

Clearly \delta(X) = \frac{T^2}{2} is unbiased for g(\theta) = \frac{1}{\theta^{2}}, thus the UMVU estimator is

 \eta(X) = \mathrm{E}(\delta(X) \mid T) = \mathrm{E}\left( \frac{T^{2}}{2} \,\Big|\, T \right) = \frac{T^{2}}{2} = \frac{\left( \log(1 + e^{-X}) \right)^{2}}{2}

This example illustrates that an unbiased function of the complete sufficient statistic will be UMVU.
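
As a sanity check, here is a minimal simulation sketch of this example (the sampler and parameter values are illustrative assumptions): since T = \log(1 + e^{-X}) is Exponential(θ) as shown above, one can draw T and invert the relation to get X = -\log(e^{T} - 1).

 import numpy as np

 rng = np.random.default_rng(2)
 theta, reps = 3.0, 500_000

 # T = log(1 + e^{-X}) ~ Exponential(rate theta): draw T, invert to X
 T_sim = rng.exponential(scale=1.0 / theta, size=reps)
 X = -np.log(np.expm1(T_sim))      # expm1 for numerical stability

 T = np.log1p(np.exp(-X))          # recover T from X, as in the article
 eta = T ** 2 / 2                  # UMVU estimator of g(theta) = 1/theta^2

 print("target 1/theta^2:", 1 / theta**2)
 print("mean of eta     :", eta.mean())   # approximately 1/theta^2

Each simulated observation X yields one estimate η(X); averaging over many draws shows the estimator is centered at 1/θ², confirming unbiasedness.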

See also

U-statistic
