List of convolutions of probability distributions
From Wikipedia, the free encyclopedia
In probability theory, the probability distribution of the sum of two or more independent random variables is the convolution of their individual distributions. The term is motivated by the fact that the probability mass function or probability density function of a sum of independent random variables is the convolution of their corresponding probability mass functions or probability density functions respectively. Many well-known distributions have simple convolutions. The following is a list of these convolutions. Each statement is of the form
$$\sum_{i=1}^n X_i \sim Y,$$
where $X_1, X_2, \dots, X_n$ are independent and identically distributed random variables. In place of $X_i$ and $Y$ the names of the corresponding distributions and their parameters have been indicated.
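The statement that the pmf of a sum is the convolution of the pmfs can be checked numerically. The following sketch (not part of the original article; the rate values 2.0 and 3.0 are arbitrary choices for illustration) convolves two Poisson pmfs and compares the result with the pmf of a Poisson whose rate is the sum of the two rates:

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return exp(-lam) * lam**k / factorial(k)

def convolve_pmfs(pmf_a, pmf_b, n):
    """Discrete convolution: P(X + Y = n) = sum_k P(X = k) * P(Y = n - k)."""
    return sum(pmf_a(k) * pmf_b(n - k) for k in range(n + 1))

lam1, lam2 = 2.0, 3.0
for n in range(10):
    direct = poisson_pmf(n, lam1 + lam2)  # pmf of Poisson(lam1 + lam2)
    conv = convolve_pmfs(lambda k: poisson_pmf(k, lam1),
                         lambda k: poisson_pmf(k, lam2), n)
    # The convolution of Poisson(lam1) and Poisson(lam2) pmfs agrees with
    # the Poisson(lam1 + lam2) pmf, up to floating-point error.
    assert abs(direct - conv) < 1e-12
```

Here the sum of two independent Poisson variables is again Poisson, so the convolved pmf and the direct pmf agree at every point.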
Discrete distributions
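The table of discrete convolutions did not survive in this copy of the article. The following are a few standard identities of the kind the list contains, stated for independent summands (with the usual parametrizations; the Bernoulli and geometric entries additionally assume identical distribution):

```latex
\begin{aligned}
\sum_{i=1}^n \mathrm{Bernoulli}(p) &\sim \mathrm{Binomial}(n,\,p) \\
\sum_{i=1}^n \mathrm{Binomial}(n_i,\,p) &\sim \mathrm{Binomial}\!\left(\textstyle\sum_{i=1}^n n_i,\,p\right) \\
\sum_{i=1}^n \mathrm{Geometric}(p) &\sim \mathrm{NegativeBinomial}(n,\,p) \\
\sum_{i=1}^n \mathrm{Poisson}(\lambda_i) &\sim \mathrm{Poisson}\!\left(\textstyle\sum_{i=1}^n \lambda_i\right)
\end{aligned}
```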
Continuous distributions
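The table of continuous convolutions likewise did not survive extraction. A few standard identities of the kind the list contains, stated for independent summands (gamma and exponential distributions are written with a rate parameter; the exponential entry additionally assumes identical distribution):

```latex
\begin{aligned}
\sum_{i=1}^n \mathrm{Exponential}(\lambda) &\sim \mathrm{Gamma}(n,\,\lambda) \\
\sum_{i=1}^n \mathrm{Gamma}(\alpha_i,\,\beta) &\sim \mathrm{Gamma}\!\left(\textstyle\sum_{i=1}^n \alpha_i,\,\beta\right) \\
\sum_{i=1}^n \chi^2(r_i) &\sim \chi^2\!\left(\textstyle\sum_{i=1}^n r_i\right) \\
\sum_{i=1}^n \mathrm{Normal}(\mu_i,\,\sigma_i^2) &\sim \mathrm{Normal}\!\left(\textstyle\sum_{i=1}^n \mu_i,\,\textstyle\sum_{i=1}^n \sigma_i^2\right)
\end{aligned}
```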
Example proof
There are various ways to prove the above relations. A straightforward technique is to use the moment generating function, which uniquely determines a distribution when it is finite on a neighborhood of zero.
Proof that $\sum_{i=1}^n \mathrm{Bernoulli}(p) \sim \mathrm{Binomial}(n,\,p)$
Let $Z = \sum_{i=1}^n X_i$, where each $X_i \sim \mathrm{Bernoulli}(p)$ independently, and let $Y \sim \mathrm{Binomial}(n,\,p)$. The moment generating function of each $X_i$ is
$$M_{X_i}(t) = \mathrm{E}\!\left[e^{tX_i}\right] = 1 - p + pe^t,$$
where $t$ is within some neighborhood of zero. The moment generating function of $Z$ is
$$M_Z(t) = \mathrm{E}\!\left[e^{t\sum_{i=1}^n X_i}\right] = \mathrm{E}\!\left[\prod_{i=1}^n e^{tX_i}\right] = \prod_{i=1}^n \mathrm{E}\!\left[e^{tX_i}\right] = \left(1 - p + pe^t\right)^n,$$
where the expectation of the product is the product of the expectations because the $X_i$ are independent. This is the moment generating function of the binomial distribution with parameters $n$ and $p$; since $Y$ and $Z$ have the same moment generating function, they must have the same distribution.
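As a concrete numeric illustration of the MGF technique (not part of the original article; the Bernoulli/binomial pair and the values $n = 5$, $p = 0.3$ are chosen for the example), the product of $n$ Bernoulli MGFs can be compared against the binomial MGF computed directly from the pmf:

```python
from math import exp, comb

def bernoulli_mgf(t, p):
    """E[e^{tX}] for X ~ Bernoulli(p): (1 - p)*e^0 + p*e^t."""
    return 1 - p + p * exp(t)

def binomial_mgf(t, n, p):
    """E[e^{tY}] for Y ~ Binomial(n, p), summed directly over the pmf."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) * exp(t * k)
               for k in range(n + 1))

n, p = 5, 0.3
for t in (-1.0, -0.1, 0.0, 0.1, 1.0):
    # By independence, the MGF of the sum is the product of the MGFs,
    # i.e. (1 - p + p*e^t)^n, which matches the binomial MGF.
    assert abs(bernoulli_mgf(t, p)**n - binomial_mgf(t, n, p)) < 1e-12
```

Agreement at every $t$ in a neighborhood of zero is exactly what the uniqueness argument in the proof exploits.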
See also
- Uniform distribution
- Bernoulli distribution
- Binomial distribution
- Geometric distribution
- Negative binomial distribution
- Poisson distribution
- Exponential distribution
- Beta distribution
- Gamma distribution
- Chi-square distribution
- Normal distribution
References
- Hogg, Robert V.; McKean, Joseph W.; Craig, Allen T. (2005). Introduction to Mathematical Statistics (6th ed.). Pearson Prentice Hall. ISBN 0-13-008507-3.