Sum of normally distributed random variables

In probability theory, if X and Y are independent random variables that are normally distributed, then X + Y is also normally distributed.

In more formal notation: if

X \sim N(\mu, \sigma^2)\,

and

Y \sim N(\nu, \tau^2)\,

and X and Y are independent, then

Z = X + Y \sim N(\mu + \nu, \sigma^2 + \tau^2).\,
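
This proposition is easy to illustrate numerically. The following sketch, in Python with NumPy, uses arbitrary parameter values chosen only for illustration; it draws independent samples of X and Y and compares the empirical mean and variance of X + Y with μ + ν and σ² + τ²:

import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative parameters (not part of the statement).
mu, nu = 1.0, -2.0        # means of X and Y
sigma, tau = 0.5, 1.5     # standard deviations of X and Y

n = 1_000_000
x = rng.normal(mu, sigma, n)   # X ~ N(mu, sigma^2)
y = rng.normal(nu, tau, n)     # Y ~ N(nu, tau^2)
z = x + y

print(z.mean())   # close to mu + nu = -1.0
print(z.var())    # close to sigma^2 + tau^2 = 0.25 + 2.25 = 2.5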

Proofs

This proposition may be proved by any of several methods.

Proof using convolutions

By the law of total probability, we have

f_Z(z) = \iint_{x\,y} f_{X,Y,Z}(x,y,z)\, dx\,dy

and since X and Y are independent, we get

f_Z(z) = \iint_{x\,y} f_X(x) f_Y(y) f_Z(z|x,y)\, dx\,dy.

But the conditional density f_Z(z|x,y) is trivially equal to δ(z − (x + y)), where δ is the Dirac delta function, because Z = X + Y is completely determined by x and y. Substituting this gives

f_Z(z) = \iint_{x\,y} f_X(x) f_Y(y) \delta(z - (x+y))\, dx\,dy

Carrying out the integration over y, the delta function selects the point y = z − x, and we obtain

f_Z(z) = \int_{x} f_X(x) f_Y(z-x)\, dx

which we recognize as a convolution of f_X with f_Y.

Therefore the probability density function of the sum of two independent random variables X and Y with probability density functions f and g is the convolution

(f*g)(x)=\int_{-\infty}^\infty f(u) g(x-u)\,du.\,
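
The convolution can also be evaluated numerically. A minimal sketch with NumPy and SciPy (the grid and parameter values are arbitrary illustrative choices) convolves two zero-mean normal densities on a grid and compares the result with the density of a normal distribution with variance σ² + τ²:

import numpy as np
from scipy.stats import norm

sigma, tau = 1.0, 2.0           # arbitrary illustrative values
u = np.linspace(-20, 20, 4001)  # grid wide enough that both densities vanish at the ends
du = u[1] - u[0]

f = norm.pdf(u, scale=sigma)    # density of N(0, sigma^2)
g = norm.pdf(u, scale=tau)      # density of N(0, tau^2)

# Riemann-sum approximation of the convolution integral on the same grid.
fg = np.convolve(f, g, mode="same") * du

target = norm.pdf(u, scale=np.sqrt(sigma**2 + tau**2))  # density of N(0, sigma^2 + tau^2)
print(np.max(np.abs(fg - target)))  # small: agreement up to discretization error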

No generality is lost by assuming that the two expected values μ and ν are zero: replacing X and Y by X − μ and Y − ν merely shifts the sum by the constant μ + ν without changing the shape of its distribution. Thus the two densities are

f(x) = {1 \over \sigma\sqrt{2\pi}} \exp\left({-x^2 \over 2\sigma^2}\right)

and

g(x) = {1 \over \tau\sqrt{2\pi}} \exp\left({-x^2 \over 2\tau^2}\right).

The convolution is

[\mathrm{constant}]\cdot\int_{-\infty}^\infty  \exp\left({-u^2 \over 2\sigma^2}\right) \exp\left({-(x-u)^2 \over 2\tau^2}\right)\,du
=[\mathrm{constant}]\cdot\int_{-\infty}^\infty \exp\left({-(\tau^2 u^2 + \sigma^2(x-u)^2) \over 2\sigma^2 \tau^2} \right)\,du.

In simplifying this expression it helps to recall a simple fact that is easy to lose sight of in context: the integral

\int_{-\infty}^\infty \exp(-(u-A)^2)\,du

does not depend on A. This is seen by the simple substitution w = u − A, dw = du: the integrand becomes exp(−w²), and the bounds of integration remain −∞ and +∞.
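
This invariance, too, is easy to check numerically (a SciPy sketch; the values of A are arbitrary):

import numpy as np
from scipy.integrate import quad

# Each integral equals sqrt(pi) ≈ 1.7724538509, independently of A.
for A in (0.0, 3.0, -7.5):
    val, _ = quad(lambda u, A=A: np.exp(-(u - A)**2), -np.inf, np.inf)
    print(val)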

Completing the square in u, we now have

[\mathrm{constant}]\cdot\int_{-\infty}^\infty \exp\left({-(\tau^2 u^2 + \sigma^2(x-u)^2) \over 2\sigma^2 \tau^2} \right)\,du
=[\mathrm{constant}]\cdot\int_{-\infty}^\infty \exp\left({-(\tau^2+\sigma^2)(u-{\sigma^2 \over \sigma^2+\tau^2}x)^2 \over 2\sigma^2\tau^2} + {-x^2 \over 2(\sigma^2 + \tau^2)}\right) \,du
=[\mathrm{constant}]\cdot \exp\left({-x^2 \over 2(\sigma^2 + \tau^2)}\right) \cdot \int_{-\infty}^\infty \exp\left({-(\tau^2+\sigma^2)(u-{\sigma^2 \over \sigma^2+\tau^2}x)^2 \over 2\sigma^2\tau^2}\right) \,du
=[\mathrm{constant}]\cdot \exp\left({-x^2 \over 2(\sigma^2 + \tau^2)}\right) \cdot [\mathrm{constant}],

where "constant" in this context means not depending on x. The last integral does not depend on x because of the "obvious fact" mentioned above.

A probability density function that is a constant multiple of

\exp\left({-x^2 \over 2(\sigma^2 + \tau^2)}\right)

is the density of a normal distribution with variance σ² + τ². Although the constant was not tracked explicitly in this derivation, it must equal the normalizing constant 1/√(2π(σ² + τ²)), since a probability density integrates to 1.

Proof using characteristic functions

The characteristic function

\varphi_{X+Y}(t) = \operatorname{E}\left(e^{it(X+Y)}\right)\,

of the sum of two independent random variables X and Y is just the product of the two separate characteristic functions:

\varphi_X (t) = \operatorname{E}\left(e^{itX}\right)\,

and

\varphi_Y(t) = \operatorname{E}\left(e^{itY}\right)\,

of X and Y. Indeed, since X and Y are independent, the random variables exp(itX) and exp(itY) are independent as well, so the expectation of their product factors into the product of their expectations.

The characteristic function of the normal distribution with expected value μ and variance σ² is

\varphi_X(t) = \exp\left(it\mu - {\sigma^2 t^2 \over 2}\right).
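
This formula can be checked against the empirical characteristic function of a simulated sample (a NumPy sketch; the parameter values and the evaluation point t are arbitrary):

import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 2.0                 # arbitrary illustrative values
x = rng.normal(mu, sigma, 1_000_000)

t = 0.7                              # an arbitrary evaluation point
empirical = np.mean(np.exp(1j * t * x))                  # sample estimate of E(exp(itX))
closed_form = np.exp(1j * t * mu - sigma**2 * t**2 / 2)
print(empirical)
print(closed_form)                   # the two agree up to Monte Carlo error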

So

\varphi_{X+Y}(t)=\varphi_X(t) \varphi_Y(t) =\exp\left(it\mu - {\sigma^2 t^2 \over 2}\right) \cdot \exp\left(it\nu - {\tau^2 t^2 \over 2}\right)
=\exp\left(it(\mu+\nu) - {(\sigma^2 + \tau^2) t^2 \over 2}\right).

This is the characteristic function of the normal distribution with expected value μ + ν and variance σ² + τ².
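
The algebra in this step can be confirmed symbolically as well (a SymPy sketch; the symbol names are ours):

import sympy as sp

t, mu, nu = sp.symbols('t mu nu', real=True)
sigma, tau = sp.symbols('sigma tau', positive=True)

phi_X = sp.exp(sp.I * t * mu - sigma**2 * t**2 / 2)
phi_Y = sp.exp(sp.I * t * nu - tau**2 * t**2 / 2)
phi_Z = sp.exp(sp.I * t * (mu + nu) - (sigma**2 + tau**2) * t**2 / 2)

print(sp.simplify(phi_X * phi_Y - phi_Z))   # 0: the product has the claimed form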

Finally, recall that two distinct probability distributions can never have the same characteristic function, so the distribution of X + Y must be exactly this normal distribution.