Minimum-variance unbiased estimator

In statistics, and more specifically in estimation theory, a minimum-variance unbiased estimator (MVUE or MVU estimator) is an unbiased estimator that has lower variance than any other unbiased estimator, for all possible values of the parameter. If an estimator is unbiased, then its mean squared error is equal to its variance, i.e.,

\mathrm{mse} \left( \widehat{\theta} \right) = \mathrm{var} \left( \widehat{\theta} \right).

This follows immediately from the fact that the mean squared error is the sum of the variance and the square of the bias:

\mathrm{mse} \left( \widehat{\theta} \right) = \mathrm{E} \left[ \left( \widehat{\theta} - \theta \right)^2 \right] = \mathrm{var} \left( \widehat{\theta} \right) + \left[ \mathrm{bias} \left( \widehat{\theta} \right) \right]^2.
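
This decomposition can be verified by adding and subtracting \mathrm{E} [ \widehat{\theta} ] inside the square; the cross term vanishes because \widehat{\theta} - \mathrm{E} [ \widehat{\theta} ] has expectation zero:

\mathrm{E} \left[ \left( \widehat{\theta} - \mathrm{E}[\widehat{\theta}] + \mathrm{E}[\widehat{\theta}] - \theta \right)^2 \right] = \mathrm{var} \left( \widehat{\theta} \right) + \left( \mathrm{E}[\widehat{\theta}] - \theta \right)^2,

where \mathrm{E}[\widehat{\theta}] - \theta = \mathrm{bias} \left( \widehat{\theta} \right).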

Consequently, if an estimator is unbiased, then minimizing its mean squared error is the same as minimizing its variance.
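
To illustrate the definition, consider a minimal Monte Carlo sketch (an illustrative addition, not part of the article's exposition): for independent normal observations, both the sample mean and the sample median are unbiased estimators of the population mean, but the sample mean is the MVUE for this model and so should exhibit the smaller variance.

import numpy as np

# Illustrative sketch: compare the simulated variance of two unbiased
# estimators of the mean of a normal distribution. The sample mean is
# the MVUE for this model, so its variance should be the smaller one.
rng = np.random.default_rng(0)
n, trials = 25, 100_000
samples = rng.normal(loc=0.0, scale=1.0, size=(trials, n))

sample_means = samples.mean(axis=1)          # unbiased; variance sigma^2 / n
sample_medians = np.median(samples, axis=1)  # unbiased by symmetry; larger variance

print("variance of sample mean:  ", sample_means.var())    # ~ 1/25 = 0.040
print("variance of sample median:", sample_medians.var())  # ~ pi/(2*25) = 0.063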

In many cases, however, a biased estimator can have uniformly smaller mean squared error than any unbiased estimator of the same parameter. See bias of an estimator for further information.
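
A classical example: if X_1, \ldots, X_n are independent normal observations with unknown variance \sigma^2, the biased estimator

\widehat{\sigma}^2 = \frac{1}{n+1} \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2

has uniformly smaller mean squared error than the unbiased estimator that divides by n - 1.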

If an unbiased estimator of a parameter is a function of a complete sufficient statistic, then it is the unique MVUE; this is the content of the Lehmann-Scheffé theorem. Such an estimator is often constructed via the Rao-Blackwell theorem, which improves an arbitrary unbiased estimator by conditioning it on a sufficient statistic, yielding an unbiased estimator with variance no greater than that of the original.
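
For example, if X_1, \ldots, X_n are independent Poisson(\lambda) observations, then T = \sum_{i=1}^{n} X_i is a complete sufficient statistic for \lambda, and the sample mean \bar{X} = T/n is an unbiased estimator that is a function of T; by the Lehmann-Scheffé theorem, \bar{X} is therefore the unique MVUE of \lambda.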