Minimum mean-square error

From Wikipedia, the free encyclopedia

Minimum mean-square error (MMSE) refers to an estimator whose estimates attain the minimum possible mean squared error. Because of this property, MMSE estimators are commonly described as optimal.

Let θ̂ be a point estimator of the parameter θ. Its mean squared error is defined as

MSE(θ̂) = E[(θ̂ − θ)²]

The mean squared error can be decomposed as

MSE(θ̂) = E{[θ̂ − E(θ̂)]²} + [E(θ̂) − θ]² = V(θ̂) + (bias)²

Thus the mean squared error of θ̂ equals the estimator's variance plus its squared bias.
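The decomposition above can be checked numerically. The sketch below simulates a deliberately biased estimator of a normal mean, θ̂ = Σxᵢ/(n + 1) (a hypothetical shrinkage estimator chosen only for illustration), and verifies that the empirical MSE equals the empirical variance plus the squared bias:

```python
import random

# Monte Carlo check of MSE = variance + bias^2 for a simple biased
# estimator of the mean: theta_hat = sum(x) / (n + 1).
# (Illustrative choice; not an estimator discussed in the article.)
random.seed(0)
theta = 2.0            # true mean of the sampled population
n, trials = 10, 100_000

estimates = []
for _ in range(trials):
    sample = [random.gauss(theta, 1.0) for _ in range(n)]
    estimates.append(sum(sample) / (n + 1))

mean_est = sum(estimates) / trials
mse = sum((e - theta) ** 2 for e in estimates) / trials
var = sum((e - mean_est) ** 2 for e in estimates) / trials
bias_sq = (mean_est - theta) ** 2

# On the empirical distribution the identity is exact (up to
# floating-point rounding), so the two sides agree closely.
print(mse, var + bias_sq)
```

The identity holds exactly on the simulated sample, since the same algebra that proves the decomposition applies term by term to the empirical averages.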

Suppose θ̂₁ and θ̂₂ are two estimators of the parameter θ, with mean squared errors MSE(θ̂₁) and MSE(θ̂₂).

The relative efficiency of θ̂₁ with respect to θ̂₂ can then be defined as

MSE(θ̂₁)/MSE(θ̂₂)
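As a sketch of how this ratio is used in practice, the simulation below estimates the relative efficiency of the sample mean (θ̂₁) versus the sample median (θ̂₂) as estimators of the mean of a normal population. These two estimators are an illustrative choice, not taken from the article:

```python
import random
import statistics

# Estimate relative efficiency MSE(theta_hat_1) / MSE(theta_hat_2) by
# simulation: theta_hat_1 = sample mean, theta_hat_2 = sample median,
# both estimating the mean of a normal population.
random.seed(1)
theta, n, trials = 0.0, 25, 50_000

mse_mean = 0.0
mse_median = 0.0
for _ in range(trials):
    sample = [random.gauss(theta, 1.0) for _ in range(n)]
    mse_mean += (statistics.fmean(sample) - theta) ** 2
    mse_median += (statistics.median(sample) - theta) ** 2
mse_mean /= trials
mse_median /= trials

rel_eff = mse_mean / mse_median
# For normal data the sample mean has the smaller MSE, so the ratio
# is below 1 (asymptotically it approaches 2/pi, about 0.64).
print(rel_eff)
```

A ratio below 1 indicates that θ̂₁ achieves a smaller mean squared error than θ̂₂ on this problem, which is the sense in which one estimator is said to be more efficient than another.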

External links

  • Prediction and Improved Estimation in Linear Models by J. Bibby and H. Toutenburg (Wiley, 1977). This book looks almost exclusively at minimum mean-square error estimation and inference.