List of convolutions of probability distributions

In probability theory, the probability distribution of the sum of two or more independent random variables is the convolution of their individual distributions. The term is motivated by the fact that the probability mass function or probability density function of a sum of independent random variables is the convolution of their corresponding probability mass functions or probability density functions respectively. Many well-known distributions have simple convolutions. The following is a list of such convolutions. Each statement is of the form

\sum_{i=1}^n X_i \sim Y

where X_1, X_2, \dots, X_n are independent random variables (identically distributed except where the parameters carry an index i). In place of X_i and Y, the names of the corresponding distributions and their parameters are given.

Discrete distributions

  • \sum_{i=1}^n \mathrm{Bernoulli}(p) \sim \mathrm{Binomial}(n,p) \qquad 0<p<1 \quad n=1,2,\dots
  • \sum_{i=1}^n \mathrm{Binomial}(n_i,p) \sim \mathrm{Binomial}(\sum_{i=1}^n n_i,p) \qquad 0<p<1 \quad n_i=1,2,\dots
  • \sum_{i=1}^n \mathrm{NegativeBinomial}(n_i,p) \sim \mathrm{NegativeBinomial}(\sum_{i=1}^n n_i,p) \qquad 0<p<1 \quad n_i=1,2,\dots
  • \sum_{i=1}^n \mathrm{Geometric}(p) \sim \mathrm{NegativeBinomial}(n,p) \qquad 0<p<1 \quad n=1,2,\dots
  • \sum_{i=1}^n \mathrm{Poisson}(\lambda_i) \sim \mathrm{Poisson}(\sum_{i=1}^n \lambda_i) \qquad \lambda_i>0
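
For example, the Poisson relation can be verified with the moment-generating-function argument used in the example proof below: the moment generating function of \mathrm{Poisson}(\lambda) is e^{\lambda(e^t-1)}, and for independent summands the moment generating functions multiply, so

\prod_{i=1}^n e^{\lambda_i(e^t-1)} = e^{\left(\sum_{i=1}^n \lambda_i\right)(e^t-1)}

which is the moment generating function of a \mathrm{Poisson}(\sum_{i=1}^n \lambda_i) random variable.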

Continuous distributions

  • \sum_{i=1}^n \mathrm{Normal}(\mu_i,\sigma_i^2) \sim \mathrm{Normal}(\sum_{i=1}^n \mu_i, \sum_{i=1}^n \sigma_i^2) \qquad -\infty<\mu_i<\infty \quad \sigma_i^2>0
  • \sum_{i=1}^n \mathrm{Gamma}(\alpha_i,\beta) \sim \mathrm{Gamma}(\sum_{i=1}^n \alpha_i,\beta) \qquad \alpha_i>0 \quad \beta>0
  • \sum_{i=1}^n \mathrm{Exponential}(\theta) \sim \mathrm{Gamma}(n,\theta) \qquad \theta>0 \quad n=1,2,\dots
  • \sum_{i=1}^n \chi^2(r_i) \sim \chi^2(\sum_{i=1}^n r_i) \qquad r_i=1,2,\dots
  • \sum_{i=1}^r N_i^2(0,1) \sim \chi^2_r \qquad r=1,2,\dots
  • \sum_{i=1}^n (X_i - \bar X)^2 \sim \sigma^2 \chi^2_{n-1} \qquad \mathrm{where} \quad X_i \sim N(\mu,\sigma^2) \quad \mathrm{and} \quad \bar X = \frac{1}{n} \sum_{i=1}^n X_i
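
Similarly, reading \beta as a rate parameter (the argument is identical for a scale parameter, since \beta is common to all summands), the moment generating function of \mathrm{Gamma}(\alpha,\beta) is (1-t/\beta)^{-\alpha} for t<\beta, so

\prod_{i=1}^n (1-t/\beta)^{-\alpha_i} = (1-t/\beta)^{-\sum_{i=1}^n \alpha_i} \qquad t<\beta

which is the moment generating function of \mathrm{Gamma}(\sum_{i=1}^n \alpha_i, \beta).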

Example proof

There are various ways to prove the above relations. A straightforward technique is to use the moment generating function, which uniquely determines a distribution when it exists in a neighborhood of zero.
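
The technique rests on the fact that the moment generating function of a sum of independent random variables factors into the product of their moment generating functions:

M_{X+Y}(t)=E(e^{t(X+Y)})=E(e^{tX}e^{tY})=E(e^{tX})E(e^{tY})=M_X(t)M_Y(t)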

Proof that \sum_{i=1}^n \mathrm{Bernoulli}(p) \sim \mathrm{Binomial}(n,p)

Let
X_i \sim \mathrm{Bernoulli}(p) \quad 0<p<1 \quad 1 \le i \le n
Y=\sum_{i=1}^n X_i
Z \sim \mathrm{Binomial}(n,p)

The moment generating function of each X_i and of Z is

M_{X_i}(t)=1-p+pe^t \qquad M_Z(t)=(1-p+pe^t)^n

where t is within some neighborhood of zero.
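
The Bernoulli moment generating function follows directly from the definition of expectation over the two outcomes 0 and 1:

M_{X_i}(t)=E(e^{tX_i})=(1-p)e^{t\cdot 0}+pe^{t\cdot 1}=1-p+pe^t

The moment generating function of Y can then be computed directly: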

M_Y(t)=E\left(e^{t\sum_{i=1}^n X_i}\right)=E\left(\prod_{i=1}^n e^{tX_i}\right)=\prod_{i=1}^n E\left(e^{tX_i}\right)=\prod_{i=1}^n (1-p+pe^t)=(1-p+pe^t)^n=M_Z(t)

The expectation of the product equals the product of the expectations because the X_i are independent. Since Y and Z have the same moment generating function, they must have the same distribution.
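
A relation of this kind can also be checked numerically. The following is a minimal simulation sketch (not part of the original list; the values of n, p, and the number of trials are illustrative choices) comparing the empirical distribution of a sum of independent Bernoulli(p) draws with the Binomial(n,p) probability mass function:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)   # fixed seed for reproducibility
n, p, trials = 10, 0.3, 100_000  # illustrative parameter choices

# Draw `trials` rows of n independent Bernoulli(p) variables and sum each row.
sums = rng.binomial(1, p, size=(trials, n)).sum(axis=1)

# Compare empirical frequencies with the Binomial(n, p) pmf.
for k in range(n + 1):
    empirical = np.mean(sums == k)
    exact = stats.binom.pmf(k, n, p)
    print(f"k={k:2d}  empirical={empirical:.4f}  Binomial pmf={exact:.4f}")

As the number of trials grows, the empirical frequencies converge to the Binomial probabilities, in line with the identity proved above.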
