Talk:Rao–Blackwell theorem

This article is within the scope of WikiProject Statistics, which collaborates to improve Wikipedia's coverage of statistics. If you would like to participate, please visit the project page.

Rao–Blackwell theorem is a former featured article candidate. Please view the links under Article milestones below to see why the nomination failed. For older candidates, please check the archive.
March 1, 2004: Featured article candidate, not promoted

Lehmann–Scheffé minimum variance

The article states that if the NEW estimator is complete and sufficient then it is the minimum-variance one. But doesn't the Lehmann–Scheffé theorem deal specifically with using a complete and sufficient statistic to find a new estimator, given an unbiased estimator? ZioX 22:51, 21 March 2007 (UTC)

Looks as if it ought to say that if the statistic on which you condition is complete and sufficient, and the estimator you start with is unbiased, then the Rao–Blackwell estimator is the best unbiased estimator. Michael Hardy 22:37, 21 March 2007 (UTC)
Yes, that's what I figured. I didn't want to change it without saying anything. ZioX 22:51, 21 March 2007 (UTC)
Changed it. ZioX 21:05, 22 March 2007 (UTC)
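
For reference, a minimal sketch of the statement the thread converges on, with T, \delta_0 and g(\theta) as assumed notation rather than the article's own: if T is complete and sufficient for \theta and \mathrm{E}_\theta[\delta_0] = g(\theta) for every \theta, then

    \delta_1 = \mathrm{E}[\,\delta_0 \mid T\,]

is unbiased, satisfies \mathrm{Var}_\theta(\delta_1) \le \mathrm{Var}_\theta(\delta_0) (Rao–Blackwell), and by completeness of T is the essentially unique minimum-variance unbiased estimator of g(\theta) (Lehmann–Scheffé).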

Example

Calculating delta_1 is not as trivial as it's being made out to be. At least not to the casual reader. Perhaps something should be said about X_1|sum(X_i) ~ Bin(sum(X_i),1/n)? ZioX 22:56, 21 March 2007 (UTC)
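
Assuming the article's example is the usual Poisson one (i.i.d. X_1, \dots, X_n \sim \mathrm{Poisson}(\lambda), crude estimator \delta_0 = \mathbf{1}\{X_1 = 0\} of e^{-\lambda}), here is a sketch of the missing step under that assumption, not necessarily the article's wording: conditionally on the sufficient statistic T = \sum_{i=1}^n X_i,

    X_1 \mid T = t \;\sim\; \mathrm{Binomial}\!\left(t, \tfrac{1}{n}\right),

so

    \delta_1 = \mathrm{E}[\,\delta_0 \mid T\,] = \Pr(X_1 = 0 \mid T) = \left(1 - \tfrac{1}{n}\right)^{T}.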