Stein's lemma

Stein's lemma, named in honor of Charles Stein, is a theorem of probability theory that is of interest primarily because of its application to statistical inference — in particular, its application to James-Stein estimation and empirical Bayes methods.

Statement of the lemma

Suppose X is a normally distributed random variable with expectation μ and variance σ². Further suppose g is a differentiable function for which the two expectations E( g(X) (X − μ) ) and E( g′(X) ) both exist (the existence of the expectation of any random variable is equivalent to the finiteness of the expectation of its absolute value). Then

E\bigl(g(X)(X-\mu)\bigr)=\sigma^2 E\bigl(g'(X)\bigr).
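
The identity can also be checked numerically. Below is a minimal Monte Carlo sketch, assuming NumPy is available; the particular values of μ and σ and the choice g(x) = x³ are arbitrary illustrations, not part of the lemma.

import numpy as np

# Monte Carlo check of E[g(X)(X - mu)] = sigma^2 * E[g'(X)].
# The values of mu and sigma and the choice g(x) = x**3 are illustrative assumptions only.
rng = np.random.default_rng(0)
mu, sigma = 1.5, 2.0

def g(x):
    return x**3

def g_prime(x):
    return 3 * x**2

x = rng.normal(mu, sigma, size=1_000_000)
lhs = np.mean(g(x) * (x - mu))        # estimate of E[g(X)(X - mu)]
rhs = sigma**2 * np.mean(g_prime(x))  # estimate of sigma^2 * E[g'(X)]
print(lhs, rhs)                       # the two estimates agree up to Monte Carlo error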

In general, suppose X and Y are jointly normally distributed. Then

\operatorname{Cov}(g(X),Y)=E(g'(X)) \operatorname{Cov}(X,Y).
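
For example, taking g(x) = x², so that g′(x) = 2x and E( g′(X) ) = 2μ where μ = E(X), the lemma gives

\operatorname{Cov}(X^2,Y)=2\mu\,\operatorname{Cov}(X,Y).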

In order to prove the univariate version of this lemma, recall that the probability density function for the normal distribution with expectation 0 and variance 1 is

\varphi(x)={1 \over \sqrt{2\pi}}e^{-x^2/2}

and that for a normal distribution with expectation μ and variance σ² is

{1\over\sigma}\varphi\left({x-\mu \over \sigma}\right).

Then use integration by parts, together with the fact that the standard normal density satisfies φ′(x) = −x φ(x).
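
Written out for the case μ = 0 and σ = 1, a sketch of the computation is

E\bigl(g(X)X\bigr)=\int_{-\infty}^{\infty} g(x)\,x\,\varphi(x)\,dx=-\int_{-\infty}^{\infty} g(x)\,\varphi'(x)\,dx=\Bigl[-g(x)\varphi(x)\Bigr]_{-\infty}^{\infty}+\int_{-\infty}^{\infty} g'(x)\,\varphi(x)\,dx=E\bigl(g'(X)\bigr),

where the boundary term vanishes (this can be shown using the assumption that E( g′(X) ) exists). The general case follows by applying this result to the function z ↦ g(μ + σz) with Z standard normal, or equivalently by the substitution x = μ + σz in the integral.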