Poisson distribution

[Figure: plot of the Poisson probability mass function. The horizontal axis is the index k. The function is defined only at integer values of k; the connecting lines are only guides for the eye.]

[Figure: plot of the Poisson cumulative distribution function. The horizontal axis is the index k. The CDF is discontinuous at the integers and flat everywhere else, because a Poisson-distributed variable takes only integer values.]
Parameters \lambda \in [0,\infty)
Support k \in \{0,1,2,\ldots\}
Probability mass function (pmf) \frac{e^{-\lambda} \lambda^k}{k!}\!
Cumulative distribution function (cdf) \frac{\Gamma(\lfloor k+1\rfloor, \lambda)}{\lfloor k\rfloor!}\!\text{ for }k\ge 0

(where \Gamma(x, y) is the upper incomplete gamma function and \lfloor k\rfloor is the floor function)

Mean \lambda
Median \approx\lfloor\lambda+1/3-0.02/\lambda\rfloor
Mode \lfloor\lambda\rfloor (and also \lambda-1 if \lambda is an integer)
Variance \lambda
Skewness \lambda^{-1/2}\,
Excess kurtosis \lambda^{-1}\,
Entropy \lambda[1\!-\!\log(\lambda)]\!+\!e^{-\lambda}\sum_{k=0}^\infty \frac{\lambda^k\log(k!)}{k!}

(for large \lambda) \frac{1}{2}\log(2 \pi e \lambda) - \frac{1}{12 \lambda} - \frac{1}{24 \lambda^2} - \frac{19}{360 \lambda^3} + O(\frac{1}{\lambda^4})

Moment-generating function (mgf) \exp(\lambda (e^t-1))\,
Characteristic function \exp(\lambda (e^{it}-1))\,

In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a number of events occurring in a fixed period of time if these events occur with a known average rate and independently of the time since the last event. The Poisson distribution can also be used for the number of events in other specified intervals such as distance, area or volume.

The distribution was discovered by Siméon-Denis Poisson (1781–1840) and published, together with his probability theory, in 1838 in his work Recherches sur la probabilité des jugements en matière criminelle et en matière civile ("Research on the Probability of Judgments in Criminal and Civil Matters"). The work focused on certain random variables N that count, among other things, the number of discrete occurrences (sometimes called "arrivals") that take place during a time interval of given length. If the expected number of occurrences in this interval is λ, then the probability that there are exactly k occurrences (k being a non-negative integer, k = 0, 1, 2, ...) is equal to

f(k; \lambda)=\frac{\lambda^k e^{-\lambda}}{k!},\,\!

where e is the base of the natural logarithm (e = 2.71828...), k! is the factorial of k, and λ is a positive real number equal to the expected number of occurrences during the given interval.

As a function of k, this is the probability mass function. The Poisson distribution can be derived as a limiting case of the binomial distribution.
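
For illustration, the pmf can be evaluated directly from this formula. The following is a minimal Python sketch; poisson_pmf is a name chosen here for exposition, not a standard library function.

    from math import exp, factorial

    def poisson_pmf(k, lam):
        """Probability of exactly k events when the expected count is lam."""
        return exp(-lam) * lam**k / factorial(k)

    # With an average of 4 occurrences per interval, the probability of
    # observing exactly 2 occurrences is 8*exp(-4), about 0.1465.
    print(poisson_pmf(2, 4.0))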

The Poisson distribution can be applied to systems with a large number of possible events, each of which is rare. A classic example is the nuclear decay of atoms.

The Poisson distribution is sometimes called a Poissonian, analogous to the term Gaussian for a Gaussian or normal distribution.

Poisson noise and characterizing small occurrences

The parameter λ is not only the mean number of occurrences \scriptstyle\langle k \rangle, but also its variance \scriptstyle\sigma_k^2 \ =\ \langle k^{2} \rangle - \langle k \rangle^{2} (see Table). Thus, the number of observed occurrences fluctuates about its mean λ with a standard deviation \scriptstyle\sigma_{k}\, =\, \sqrt{\lambda}. These fluctuations are known as Poisson noise or (particularly in electronics) as shot noise.

The correlation of the mean and standard deviation in counting independent, discrete occurrences is useful scientifically. By monitoring how the fluctuations vary with the mean signal, one can estimate the contribution of a single occurrence, even if that contribution is too small to be detected directly. For example, the charge e on an electron can be estimated by correlating the magnitude of an electric current with its shot noise. If N electrons pass a point in a given time t on average, the mean current is I = eN / t; since the current fluctuations should be of the order \scriptstyle\sigma_{I} = e\sqrt{N}/t (i.e. the standard deviation of the Poisson process), the charge e can be estimated from the ratio \scriptstyle\sigma_{I}^{2}/I. An everyday example is the graininess that appears as photographs are enlarged; the graininess is due to Poisson fluctuations in the number of reduced silver grains, not to the individual grains themselves. By correlating the graininess with the degree of enlargement, one can estimate the contribution of an individual grain (which is otherwise too small to be seen unaided). Many other molecular applications of Poisson noise have been developed, e.g., estimating the number density of receptor molecules in a cell membrane.
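
The following simulation sketch illustrates this estimation idea; the parameter values are illustrative assumptions, not measurements from the text:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical setup: N electrons pass per unit time t on average, and
    # the measured current is I = e*N/t; the unit charge e is treated as
    # unknown and is recovered from the noise.
    e_true, t, mean_electrons = 1.602e-19, 1.0, 1.0e6
    N = rng.poisson(mean_electrons, size=100_000)
    I = e_true * N / t

    # Var(I) = e^2 * lambda / t^2 and E[I] = e * lambda / t, so the ratio
    # Var(I)/E[I] equals e/t; multiplying by the known t recovers e.
    print(I.var() / I.mean() * t)  # close to 1.602e-19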

If the occurrences follow a Poisson process with rate λ, the number of occurrences N_t in a time interval of length t is Poisson distributed with parameter λt:

\Pr(N_t=k)=f(k;\lambda t)=\frac{e^{-\lambda t} (\lambda t)^k}{k!}\,\!

Related distributions

For sufficiently large values of λ, the normal distribution with mean λ and variance λ is a good approximation to the Poisson distribution:

F_\mathrm{Poisson}(x;\lambda) \approx F_\mathrm{normal}(x;\mu=\lambda,\sigma^2=\lambda)\,
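
This approximation is easy to check numerically, assuming SciPy is available; the continuity correction of +0.5 used below is a standard refinement not shown in the formula above:

    from scipy.stats import norm, poisson

    lam = 100.0
    for x in (80, 100, 120):
        exact = poisson.cdf(x, lam)
        # +0.5 is the usual continuity correction for a discrete variable.
        approx = norm.cdf(x + 0.5, loc=lam, scale=lam**0.5)
        print(x, round(exact, 4), round(approx, 4))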

Occurrence

The Poisson distribution arises in connection with Poisson processes. It applies to various phenomena of discrete nature (that is, those that may happen 0, 1, 2, 3, ... times during a given period of time or in a given area) whenever the probability of the phenomenon happening is constant in time or space. Examples of events that may be modelled as a Poisson distribution include the number of telephone calls arriving at a switchboard per hour, the number of radioactive decays in a given time interval, and the number of mutations in a given stretch of DNA.

[Note: the waiting times between successive Poisson events are not themselves Poisson distributed; they follow the exponential distribution. Examples include the lifetime of a light bulb or the waiting time between buses.]
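
This relationship can be illustrated by simulation (a sketch under arbitrarily chosen parameters): counting exponentially spaced arrivals that fall in a unit window yields Poisson-distributed counts.

    import numpy as np

    rng = np.random.default_rng(1)
    lam, trials = 3.0, 50_000

    counts = np.empty(trials, dtype=int)
    for i in range(trials):
        # Gaps between events are Exponential with mean 1/lam; 30 gaps are
        # far more than enough to cover the unit interval [0, 1).
        arrivals = np.cumsum(rng.exponential(1.0 / lam, size=30))
        counts[i] = np.searchsorted(arrivals, 1.0)

    # For a Poisson(3) count, both the mean and the variance are 3.
    print(counts.mean(), counts.var())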

How does this distribution arise? — The law of rare events

In several of the above examples—for example, the number of mutations in a given sequence of DNA—the events being counted are actually the outcomes of discrete trials, and would more precisely be modelled using the binomial distribution. However, the binomial distribution with parameters n and λ/n, i.e., the probability distribution of the number of successes in n trials, with probability λ/n of success on each trial, approaches the Poisson distribution with expected value λ as n approaches infinity. This provides a means by which to approximate random variables using the Poisson distribution rather than the more cumbersome binomial distribution.
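
A numeric sketch of this limit (the values of λ, k, and n below are arbitrary illustrations):

    from math import comb, exp, factorial

    lam, k = 4.0, 2
    poisson_limit = exp(-lam) * lam**k / factorial(k)  # about 0.146525

    # Binomial(n, lam/n) probabilities of k successes approach the Poisson
    # value as n grows.
    for n in (10, 100, 1000, 10_000):
        p = lam / n
        print(n, comb(n, k) * p**k * (1 - p)**(n - k))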

This limit is sometimes known as the law of rare events, since each of the individual Bernoulli events rarely occurs. The name may be misleading because the total count of successes in a Poisson process need not be rare if the parameter λ is not small. For example, the number of telephone calls to a busy switchboard in one hour follows a Poisson distribution with the events appearing frequent to the operator, but they are rare from the point of view of the average member of the population, who is very unlikely to make a call to that switchboard in that hour.

The proof may proceed as follows. First, recall from calculus

\lim_{n\to\infty}\left(1-{\lambda \over n}\right)^n=e^{-\lambda},

and the definition of the binomial distribution

P(X=k)={n \choose k} p^k (1-p)^{n-k}.

Setting p = \lambda/n, we can evaluate the limit of P(X=k) as n grows large:

\lim_{n\to\infty} P(X=k)=\lim_{n\to\infty}{n \choose k} p^k (1-p)^{n-k}
=\lim_{n\to\infty}{n! \over (n-k)!k!} \left({\lambda \over n}\right)^k \left(1-{\lambda\over n}\right)^{n-k}
=\lim_{n\to\infty}
\underbrace{\left[\frac{n!}{n^k\left(n-k\right)!}\right]}_F
\left(\frac{\lambda^k}{k!}\right)
\underbrace{\left(1-\frac{\lambda}{n}\right)^n}_{\approx\exp\left(-\lambda\right)}
\underbrace{\left(1-\frac{\lambda}{n}\right)^{-k}}_{\approx 1} =
F\exp\left(-\lambda\right)\left(\frac{\lambda^k}{k!}\right)

To evaluate the F term, we first take its logarithm


\log\left(F\right) =
\log\left(n!\right) - k\log\left(n\right) - \log\left[\left(n-k\right)!\right].

Using Stirling's approximation


\log\left(n!\right) \approx n\log\left(n\right) - n \quad\text{for large } n,

the expression for \log\left(F\right) can be further simplified to


\log\left(F\right) \approx
\left[n\log\left(n\right) - n\right] -
\left[k\log\left(n\right)\right] -
\left[\left(n-k\right)\log\left(n-k\right)-\left(n-k\right)\right]

= \left(n-k\right)\log\left(\frac{n}{n-k}\right) - k

= \underbrace{-\left(1-\frac{k}{n}\right)}_{\to\, -1}
 \underbrace{n\log\left(1-\frac{k}{n}\right)}_{\to\, -k}
 - k
\to k - k = 0 \quad\text{as } n\to\infty.

Therefore \lim_{n\to\infty}F = \exp\left(0\right) = 1.

Consequently, the limit of the distribution becomes

{\lambda^k \exp\left(-\lambda\right) \over k!},

which is the probability mass function of the Poisson distribution.

More generally, whenever a sequence of independent binomial random variables with parameters n and p_n is such that

\lim_{n\rightarrow\infty} np_n = \lambda,

the sequence converges in distribution to a Poisson random variable with mean λ (see, e.g. law of rare events).

Properties

If X_i \sim \mathrm{Pois}(\lambda_i)\, follow a Poisson distribution with parameter \lambda_i\, and the X_i are independent, then Y = \sum_{i=1}^N X_i \sim \mathrm{Pois}\left(\sum_{i=1}^N \lambda_i\right)\, also follows a Poisson distribution whose parameter is the sum of the component parameters.

The moment-generating function follows directly from the probability mass function:

\mathrm{E}\left(e^{tX}\right)=\sum_{k=0}^\infty e^{tk} f(k;\lambda)=\sum_{k=0}^\infty e^{tk} {\lambda^k e^{-\lambda} \over k!} =e^{\lambda(e^t-1)}.

The Kullback–Leibler divergence between two Poisson distributions with parameters \lambda and \lambda_0 is

D_{KL}(\lambda\|\lambda_0) = \lambda_0 - \lambda + \lambda \log \frac{\lambda}{\lambda_0}.
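
These properties can be checked numerically. A sketch using NumPy, with parameter values chosen arbitrarily for illustration:

    import numpy as np

    rng = np.random.default_rng(2)

    # Additivity: a Pois(1.5) draw plus an independent Pois(2.5) draw
    # behaves like a single Pois(4.0) draw, so mean and variance are
    # both near 4.0.
    s = rng.poisson(1.5, size=100_000) + rng.poisson(2.5, size=100_000)
    print(s.mean(), s.var())

    # Kullback-Leibler divergence from the closed form above.
    lam, lam0 = 3.0, 4.0
    print(lam0 - lam + lam * np.log(lam / lam0))  # about 0.137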

Generating Poisson-distributed random variables

A simple algorithm to generate random Poisson-distributed numbers is given by Knuth (see References below).

algorithm poisson random number (Knuth):
    init:
         Let L ← exp(−λ), k ← 0 and p ← 1.
    do:
         k ← k + 1.
         Generate uniform random number u in [0,1] and let p ← p × u.
    while p ≥ L.
    return k − 1. 

While simple, this algorithm has complexity linear in λ. There are many other algorithms that overcome this; some are given in Ahrens & Dieter (see References below).
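
The pseudocode above translates directly into Python. The following is an illustrative sketch (knuth_poisson is a name chosen here; the standard random module is assumed to supply the uniform draws):

    import math
    import random

    def knuth_poisson(lam):
        """Draw one Poisson(lam) variate by multiplying uniform numbers
        until the running product drops below exp(-lam)."""
        L = math.exp(-lam)
        k, p = 0, 1.0
        while True:
            k += 1
            p *= random.random()
            if p < L:
                return k - 1

    # Sanity check: the sample mean should be close to lam.
    samples = [knuth_poisson(4.0) for _ in range(100_000)]
    print(sum(samples) / len(samples))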

Parameter estimation

Maximum likelihood

Given a sample of n measured values k_i, we wish to estimate the value of the parameter λ of the Poisson population from which the sample was drawn. To calculate the maximum likelihood value, we form the log-likelihood function

L(\lambda) = \log \prod_{i=1}^n f(k_i \mid \lambda) \!
= \sum_{i=1}^n \log\!\left(\frac{e^{-\lambda}\lambda^{k_i}}{k_i!}\right) \!
= -n\lambda + \left(\sum_{i=1}^n k_i\right) \log(\lambda) - \sum_{i=1}^n \log(k_i!). \!

Take the derivative of L with respect to λ and equate it to zero:

\frac{\mathrm{d}}{\mathrm{d}\lambda} L(\lambda) = 0
\iff -n + \left(\sum_{i=1}^n k_i\right) \frac{1}{\lambda} = 0. \!

Solving for λ yields the maximum-likelihood estimate of λ:

\widehat{\lambda}_\mathrm{MLE}=\frac{1}{n}\sum_{i=1}^n k_i. \!

Since each observation has expectation λ, so does the sample mean; the estimator is therefore unbiased. It is also efficient: its estimation variance achieves the Cramér–Rao lower bound (CRLB).
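
In code, the estimate is simply the sample mean. A minimal sketch on synthetic data (the true λ of 2.7 is an arbitrary choice for illustration):

    import numpy as np

    rng = np.random.default_rng(3)
    k = rng.poisson(2.7, size=500)  # synthetic sample; true lambda is 2.7

    lambda_mle = k.mean()  # the closed-form estimate derived above
    print(lambda_mle)      # close to 2.7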

Bayesian inference

In Bayesian inference, the conjugate prior for the rate parameter λ of the Poisson distribution is the Gamma distribution. Let

\lambda \sim \mathrm{Gamma}(\alpha, \beta) \!

denote that λ is distributed according to the Gamma density g parameterized in terms of a shape parameter α and an inverse scale parameter β:

 g(\lambda \mid \alpha,\beta) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \; \lambda^{\alpha-1} \; e^{-\beta\,\lambda} \qquad \mbox{for}\ \lambda>0 \,\!.

Then, given the same sample of n measured values k_i as before, and a prior of Gamma(α, β), the posterior distribution is

\lambda \sim \mathrm{Gamma}(\alpha + \sum_{i=1}^n k_i, \beta + n). \!

The posterior mean E[λ] approaches the maximum likelihood estimate \widehat{\lambda}_\mathrm{MLE} in the limit as \alpha\to 0,\ \beta\to 0.

The posterior predictive distribution of additional data is a Gamma-Poisson (i.e. negative binomial) distribution.
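
A sketch of this conjugate update, assuming SciPy is available; nbinom is parameterized by the number of successes n and success probability p, and the prior hyperparameters below are arbitrary illustrations:

    import numpy as np
    from scipy.stats import nbinom

    rng = np.random.default_rng(4)
    k = rng.poisson(2.7, size=50)  # synthetic observed counts

    alpha, beta = 1.0, 1.0  # arbitrary Gamma(shape, rate) prior
    alpha_post, beta_post = alpha + k.sum(), beta + len(k)
    print(alpha_post / beta_post)  # posterior mean of lambda

    # Posterior predictive for one future count: negative binomial with
    # n = alpha_post and success probability beta_post/(beta_post + 1);
    # its mean matches the posterior mean of lambda, as expected.
    print(nbinom.mean(alpha_post, beta_post / (beta_post + 1)))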

The "law of small numbers"

The word law is sometimes used as a synonym of probability distribution, and convergence in law means convergence in distribution. Accordingly, the Poisson distribution is sometimes called the law of small numbers because it is the probability distribution of the number of occurrences of an event that happens rarely but has very many opportunities to happen. The Law of Small Numbers is a book by Ladislaus Bortkiewicz about the Poisson distribution, published in 1898. Some historians of mathematics have argued that the Poisson distribution should have been called the Bortkiewicz distribution.[3]

Notes

  1. NIST/SEMATECH, "6.3.3.1. Counts Control Charts", e-Handbook of Statistical Methods, accessed 25 October 2006.
  2. McCullagh, Peter; Nelder, John (1989). Generalized Linear Models. London: Chapman and Hall. ISBN 0-412-31760-5. Page 196 gives the approximation and the subsequent terms.
  3. See, e.g., I. J. Good, "Some statistical applications of Poisson's work", Statistical Science 1 (2) (1986), 157–180.

References
