Poisson distribution

From Wikipedia, the free encyclopedia

Poisson

Probability mass function: [plot of the Poisson pmf. The horizontal axis is the index k. The function is only non-zero at integer values of k; the connecting lines are only guides for the eye and do not indicate continuity.]
Cumulative distribution function: [plot of the Poisson CDF. The horizontal axis is the index k.]

Parameters: \lambda \in (0,\infty)
Support: k \in \{0,1,2,\ldots\}
Probability mass function (pmf): \frac{e^{-\lambda} \lambda^k}{k!}
Cumulative distribution function (cdf): \frac{\Gamma(k+1, \lambda)}{k!}
Mean: \lambda
Median: approximately \lfloor\lambda+1/3-0.02/\lambda\rfloor
Mode: \lfloor\lambda\rfloor
Variance: \lambda
Skewness: \lambda^{-1/2}
Excess kurtosis: \lambda^{-1}
Entropy: \lambda[1-\ln(\lambda)]+e^{-\lambda}\sum_{k=0}^\infty \frac{\lambda^k\ln(k!)}{k!}
Moment-generating function (mgf): \exp(\lambda(e^t-1))
Characteristic function: \exp(\lambda(e^{it}-1))

In probability theory and statistics, the Poisson distribution is a discrete probability distribution. It expresses the probability of a number of events occurring in a fixed period of time if these events occur with a known average rate, and are independent of the time since the last event.

The distribution was discovered by Siméon-Denis Poisson (1781–1840) and published, together with his probability theory, in 1838 in his work Recherches sur la probabilité des jugements en matière criminelle et en matière civile ("Research on the Probability of Judgments in Criminal and Civil Matters"). The work focused on certain random variables N that count, among other things, a number of discrete occurrences (sometimes called "arrivals") that take place during a time interval of given length. The probability that there are exactly k occurrences (k being a non-negative integer, k = 0, 1, 2, ...) is

f(k;\lambda)=\frac{e^{-\lambda} \lambda^k}{k!},\,\!

where

  • e is the base of the natural logarithm (e = 2.71828...),
  • k! is the factorial of k,
  • λ is a positive real number, equal to the expected number of occurrences during the given interval. For instance, if the events occur on average every 4 minutes, and you are interested in the number of events occurring in a 10-minute interval, you would use as a model a Poisson distribution with λ = 10/4 = 2.5.

As a function of k, this is the probability mass function. The Poisson distribution can be derived as a limiting case of the binomial distribution.
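The pmf above is simple to evaluate directly. A minimal Python sketch (the function name `poisson_pmf` is ours), using the λ = 10/4 = 2.5 example from the list above:

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Events arrive on average every 4 minutes; for a 10-minute window, lam = 10/4.
lam = 2.5
for k in range(4):
    print(k, poisson_pmf(k, lam))

# The pmf sums to 1 over k = 0, 1, 2, ...
print(sum(poisson_pmf(k, lam) for k in range(100)))
```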

The Poisson distribution is sometimes called a Poissonian, analogous to the term Gaussian for a Gauss or normal distribution.

Poisson noise and characterizing small occurrences

The parameter λ is not only the mean number of occurrences \langle k \rangle, but also its variance \sigma_{k}^{2} \ \stackrel{\mathrm{def}}{=}\ \langle k^{2} \rangle - \langle k \rangle^{2} (see the table above). Thus, the number of observed occurrences fluctuates about its mean λ with a standard deviation \sigma_{k} = \sqrt{\lambda}. These fluctuations are known as Poisson noise or (particularly in electronics) as shot noise.

The correlation of the mean and standard deviation in counting independent, discrete occurrences is useful scientifically. By monitoring how the fluctuations vary with the mean signal, one can estimate the contribution of a single occurrence, even if that contribution is too small to be detected directly. For example, the charge e on an electron can be estimated by correlating the magnitude of an electric current with its shot noise. If N electrons pass a point in a given time t on average, the mean current is I = eN / t; since the current fluctuations should be of the order \sigma_{I} = e\sqrt{N/t}, the charge e can be estimated from the ratio \sigma_{I}^{2}/I.

An everyday example is the graininess that appears as photographs are enlarged; the graininess is due to Poisson fluctuations in the number of reduced silver grains, not to the individual grains themselves. By correlating the graininess with the degree of enlargement, one can estimate the contribution of an individual grain (which is otherwise too small to be seen unaided).

Albert Einstein used Poisson noise to show that matter was composed of discrete atoms and to estimate Avogadro's number; he also used Poisson noise in treating blackbody radiation to demonstrate that electromagnetic radiation was composed of discrete photons. Many other molecular applications of Poisson noise have been developed, e.g., estimating the number density of receptor molecules in a cell membrane.
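The variance-to-mean argument above can be illustrated numerically. The sketch below is ours, not from the source: it draws Poisson counts with a multiplication-of-uniforms sampler, scales them by an assumed per-event contribution q, and recovers q as the ratio of the variance to the mean of the measured totals.

```python
import math
import random

def sample_poisson(lam, rng):
    """Multiply uniforms until the product drops below e^{-lam} (Knuth's method)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p >= L:
        k += 1
        p *= rng.random()
    return k - 1

def estimate_quantum(signals):
    """For S = q*N with N ~ Poisson(lam): Var(S)/E(S) = q^2*lam / (q*lam) = q."""
    n = len(signals)
    mean = sum(signals) / n
    var = sum((s - mean) ** 2 for s in signals) / n
    return var / mean

rng = random.Random(0)
q = 1.6e-19          # assumed per-event contribution (illustrative, e.g. one electron charge)
lam = 100.0          # mean number of events per counting window
signals = [q * sample_poisson(lam, rng) for _ in range(5000)]
print(estimate_quantum(signals))   # should be close to q
```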

Poisson processes

Sometimes λ is taken to be the rate, i.e., the average number of occurrences per unit time. In that case, if N_t is the number of occurrences before time t, then we have

\Pr(N_t=k)=f(k;\lambda t)=\frac{e^{-\lambda t} (\lambda t)^k}{k!},\,\!

and the waiting time T until the first occurrence is a continuous random variable with an exponential distribution (with parameter λ). This probability distribution may be deduced from the fact that

\Pr(T>t)=\Pr(N_t=0)=e^{-\lambda t}.\,

When time becomes involved, we have a one-dimensional Poisson process, which involves both the discrete Poisson-distributed random variables that count the number of arrivals in each time interval and the continuous, Erlang-distributed waiting times. There are also Poisson processes of dimension higher than one.
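A one-dimensional Poisson process can be simulated directly from its exponential waiting times; a small illustrative sketch (names are ours):

```python
import random

def count_arrivals(rate, t, rng):
    """Number of arrivals of a rate-`rate` Poisson process in [0, t],
    generated from exponential inter-arrival times."""
    elapsed, k = 0.0, 0
    while True:
        elapsed += rng.expovariate(rate)   # waiting time ~ Exponential(rate)
        if elapsed > t:
            return k
        k += 1

rng = random.Random(42)
rate, t = 3.0, 2.0
counts = [count_arrivals(rate, t, rng) for _ in range(20000)]
print(sum(counts) / len(counts))   # ≈ rate * t = 6, matching Pr(N_t = k) = f(k; λt)
```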

Related distributions

  • If X_1 \sim \mathrm{Poi}(\lambda_1)\, and X_2 \sim \mathrm{Poi}(\lambda_2)\, are independent, and Y = X_1 + X_2, then the distribution of X_1 conditional on Y = y is binomial. Specifically, X_1|(Y=y) \sim \mathrm{Binom}(y, \lambda_1/(\lambda_1+\lambda_2))\,.
  • The Poisson distribution can be derived as a limiting case of the binomial distribution as the number of trials goes to infinity while the expected number of successes remains fixed. It can therefore be used as an approximation of the binomial distribution if n is sufficiently large and p is sufficiently small. A common rule of thumb is that the Poisson distribution is a good approximation of the binomial distribution if n is at least 20 and p is at most 0.05; the approximation is excellent if n\ge 100 and np\le 10. [1]
  • For sufficiently large values of \lambda\, (say \lambda\, > 1000), the normal distribution with mean \lambda\, and variance \lambda\, is an excellent approximation to the Poisson distribution. If \lambda\, is greater than about 10, then the normal distribution is a good approximation if an appropriate continuity correction is performed, i.e., P(X ≤ x), where (lower-case) x is a non-negative integer, is replaced by P(X ≤ x + 0.5).
f_\mathrm{Poisson}(x;\lambda) \approx f_\mathrm{normal}(x;\mu=\lambda,\sigma^2=\lambda)\,
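The normal approximation with continuity correction can be checked numerically. A short sketch (helper names are ours) compares the exact CDF with the corrected normal CDF at λ = 15:

```python
import math

def poisson_cdf(x, lam):
    """Exact P(X <= x) by summing the pmf term by term."""
    total, term = 0.0, math.exp(-lam)
    for k in range(int(x) + 1):
        total += term
        term *= lam / (k + 1)   # next pmf value from the recurrence
    return total

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

lam, x = 15.0, 18
exact = poisson_cdf(x, lam)
# Continuity correction: P(X <= x) is replaced by P(X <= x + 0.5).
approx = normal_cdf((x + 0.5 - lam) / math.sqrt(lam))
print(exact, approx)   # the two values agree to a few parts in a thousand
```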

Occurrence

The Poisson distribution arises in connection with Poisson processes. It applies to various phenomena of discrete nature (that is, those that may happen 0, 1, 2, 3, ... times during a given period of time or in a given area) whenever the probability of the phenomenon happening is constant in time or space. Examples of events that can be modelled as Poisson distributions include:

  • The number of cars that pass through a certain point on a road (sufficiently distant from traffic lights) during a given period of time.
  • The number of spelling mistakes one makes while typing a single page.
  • The number of phone calls at a call center per minute.
  • The number of times a web server is accessed per minute.
    • For instance, the number of edits per hour recorded on Wikipedia's Recent Changes page follows an approximately Poisson distribution.
  • The number of roadkill (animals killed) found per unit length of road.
  • The number of mutations in a given stretch of DNA after a certain amount of radiation.
  • The number of unstable nuclei that decayed within a given period of time in a piece of radioactive substance. The radioactivity of the substance will weaken with time, so the total time interval used in the model should be significantly less than the mean lifetime of the substance.
  • The number of pine trees per unit area of mixed forest.
  • The number of stars in a given volume of space.
  • The number of soldiers killed by horse-kicks each year in each corps in the Prussian cavalry. This example was made famous by a book of Ladislaus Josephovich Bortkiewicz (1868–1931).
  • The distribution of visual receptor cells in the retina of the human eye.
  • The number of V2 rocket attacks per area in England, according to the fictionalized account in Thomas Pynchon's Gravity's Rainbow.
  • The number of light bulbs that burn out in a certain amount of time.
  • The number of viruses that can infect a cell in cell culture.

How does this distribution arise? The law of rare events

In several of the above examples—for example, the number of mutations in a given sequence of DNA—the events being counted are actually the outcomes of discrete trials, and would more precisely be modelled using the binomial distribution. However, the binomial distribution with parameters n and λ/n, i.e., the probability distribution of the number of successes in n trials, with probability λ/n of success on each trial, approaches the Poisson distribution with expected value λ as n approaches infinity. This limit is sometimes known as the law of rare events. It provides a means by which to approximate random variables using the Poisson distribution rather than the more-cumbersome binomial distribution.

Here are the details. First, recall from calculus that

\lim_{n\to\infty}\left(1-{\lambda \over n}\right)^n=e^{-\lambda}.

Let p = λ/n. Then we have

\lim_{n\to\infty} \Pr(X=k)=\lim_{n\to\infty}{n \choose k} p^k (1-p)^{n-k} =\lim_{n\to\infty}{n! \over (n-k)!k!} \left({\lambda \over n}\right)^k \left(1-{\lambda\over n}\right)^{n-k}
=\lim_{n\to\infty} \underbrace{\left({n \over n}\right)\left({n-1 \over n}\right)\left({n-2 \over n}\right) \cdots \left({n-k+1 \over n}\right)}\ \underbrace{\left({\lambda^k \over k!}\right)}\ \underbrace{\left(1-{\lambda \over n}\right)^n}\ \underbrace{\left(1-{\lambda \over n}\right)^{-k}}.

As n approaches ∞, the expression over the first underbrace approaches 1; the second remains constant since n does not appear in it at all; the third approaches e^{-\lambda}; and the fourth expression approaches 1.

Consequently the limit is

{\lambda^k e^{-\lambda} \over k!}.\,\!

More generally, whenever a sequence of binomial random variables with parameters n and pn is such that

\lim_{n\rightarrow\infty} np_n = \lambda,

the sequence converges in distribution to a Poisson random variable with mean λ (see, e.g., law of rare events).
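The convergence of Binomial(n, λ/n) to Poisson(λ) is easy to observe numerically; an illustrative sketch (function names are ours):

```python
import math

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

lam, k = 4.0, 3
# Hold np = lam fixed and let n grow; the binomial pmf approaches the Poisson pmf.
for n in (10, 100, 1000, 10000):
    print(n, binom_pmf(k, n, lam / n))
print("limit", poisson_pmf(k, lam))
```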

Properties

  • The mode of a Poisson-distributed random variable with non-integer λ is equal to \lfloor \lambda \rfloor, which is the largest integer less than or equal to λ. This is also written as floor(λ). When λ is a positive integer, the modes are λ and λ − 1.
  • Sums of Poisson-distributed random variables:
If X_i \sim \mathrm{Poi}(\lambda_i)\, follow a Poisson distribution with parameter \lambda_i\, and Xi are independent, then Y = \sum_{i=1}^N X_i \sim \mathrm{Poi}\left(\sum_{i=1}^N \lambda_i\right)\, also follows a Poisson distribution whose parameter is the sum of the component parameters.
  • The moment-generating function of the Poisson distribution with expected value λ is
\mathrm{E}\left(e^{tX}\right)=\sum_{k=0}^\infty e^{tk} f(k;\lambda)=\sum_{k=0}^\infty e^{tk} {\lambda^k e^{-\lambda} \over k!} =e^{\lambda(e^t-1)}.
  • All of the cumulants of the Poisson distribution are equal to the expected value λ. The nth factorial moment of the Poisson distribution is λ^n.
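The additivity property can be verified by convolving two Poisson pmfs (a short sketch; the function name is ours):

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

lam1, lam2, k = 2.0, 3.0, 4
# P(X1 + X2 = k) by convolving the two pmfs over all splits of k.
conv = sum(poisson_pmf(j, lam1) * poisson_pmf(k - j, lam2) for j in range(k + 1))
print(conv)
print(poisson_pmf(k, lam1 + lam2))   # the two values agree
```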

Generating Poisson-distributed random variables

A simple way to generate Poisson-distributed random numbers is given by Knuth (see References below).

algorithm poisson random number (Knuth):

    init:
         Let L ← e^(−λ), k ← 0 and p ← 1.
    do:
         k ← k + 1.
         Generate uniform random number u in [0,1] and let p ← p × u.
    while p ≥ L.
    return k − 1.

While simple, this algorithm has expected running time linear in λ. There are many other algorithms that overcome this; some are given by Ahrens and Dieter (see References below).
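For reference, the algorithm above translates directly into Python (this is our transcription, not Knuth's own code):

```python
import math
import random

def knuth_poisson(lam, rng=random):
    """Knuth's algorithm: multiply uniforms until the product drops
    below e^{-lam}. Expected running time is O(lam) per sample."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p >= L:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(1)
samples = [knuth_poisson(5.0, rng) for _ in range(20000)]
print(sum(samples) / len(samples))   # ≈ 5.0, the requested mean
```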

Parameter estimation

Maximum likelihood

Given a sample of n measured values ki we wish to estimate the value of the parameter λ of the Poisson population from which the sample was drawn. To calculate the maximum likelihood value, we form the log-likelihood function

L(\lambda) = \log \prod_{i=1}^n f(k_i;\lambda) \!
= \sum_{i=1}^n \log\!\left(\frac{e^{-\lambda}\lambda^{k_i}}{k_i!}\right) \!
= -n\lambda + \left(\sum_{i=1}^n k_i\right) \log(\lambda) - \sum_{i=1}^n \log(k_i!). \!

Take the derivative of L with respect to λ and equate it to zero:

\frac{\mathrm{d}}{\mathrm{d}\lambda} L(\lambda) = 0 \iff -n + \left(\sum_{i=1}^n k_i\right) \frac{1}{\lambda} = 0 \!

Solving for λ yields the maximum-likelihood estimate of λ:

\widehat{\lambda}_\mathrm{MLE}=\frac{1}{n}\sum_{i=1}^n k_i. \!

Since each observation has expectation λ, so does the sample mean; the estimator is therefore unbiased. It is also an efficient estimator, i.e., its estimation variance achieves the Cramér–Rao lower bound (CRLB).
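The closed-form estimate can be sanity-checked against the log-likelihood directly; a small sketch with a hypothetical sample (names and data are ours):

```python
import math

def log_likelihood(lam, ks):
    """Poisson log-likelihood: sum over -lam + k*log(lam) - log(k!)."""
    return sum(-lam + k * math.log(lam) - math.lgamma(k + 1) for k in ks)

ks = [2, 4, 1, 3, 0, 5, 2, 3]      # hypothetical sample
mle = sum(ks) / len(ks)            # closed form: the sample mean
print(mle)
# The log-likelihood at the sample mean beats nearby values.
print(log_likelihood(mle, ks) > log_likelihood(mle + 0.1, ks))
print(log_likelihood(mle, ks) > log_likelihood(mle - 0.1, ks))
```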

Bayesian inference

In Bayesian inference, the conjugate prior for the rate parameter λ of the Poisson distribution is the Gamma distribution. Let

\lambda \sim \mathrm{Gamma}(\alpha, \beta) \!

denote that λ is distributed according to the Gamma density g parameterized in terms of a shape parameter α and an inverse scale parameter β:

g(\lambda;\alpha,\beta) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \; \lambda^{\alpha-1} \; e^{-\beta\,\lambda} \qquad \mbox{for}\ \lambda>0 \,\!

Then, given the same sample of n measured values ki as before, and a prior of Gamma(α, β), the posterior distribution is

\lambda \sim \mathrm{Gamma}(\alpha + \sum_{i=1}^n k_i, \beta + n). \!

The posterior mean E[λ] approaches the maximum likelihood estimate \widehat{\lambda}_\mathrm{MLE} in the limit as \alpha\to 0,\ \beta\to 0.

The posterior predictive distribution of additional data is a Gamma-Poisson (i.e. negative binomial) distribution.
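The conjugate update above is a one-liner in practice; a sketch with a hypothetical weak prior (values are illustrative):

```python
def gamma_poisson_update(alpha, beta, ks):
    """Conjugate update: a Gamma(alpha, beta) prior on lam with Poisson data ks
    yields a Gamma(alpha + sum(ks), beta + n) posterior (beta is an inverse scale)."""
    return alpha + sum(ks), beta + len(ks)

alpha0, beta0 = 1.0, 1.0               # hypothetical weak prior
ks = [2, 4, 1, 3, 0, 5, 2, 3]          # hypothetical sample
a, b = gamma_poisson_update(alpha0, beta0, ks)
print(a / b)                # posterior mean of lam
print(sum(ks) / len(ks))    # MLE; the posterior mean approaches it as the prior vanishes
```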

The "law of small numbers"

The word law is sometimes used as a synonym of probability distribution, and convergence in law means convergence in distribution. Accordingly, the Poisson distribution is sometimes called the law of small numbers because it is the probability distribution of the number of occurrences of an event that happens rarely but has very many opportunities to happen. The Law of Small Numbers is a book by Ladislaus Bortkiewicz about the Poisson distribution, published in 1898. Some historians of mathematics have argued that the Poisson distribution should have been called the Bortkiewicz distribution.

See also

  • Compound Poisson distribution
  • Poisson process
  • Erlang distribution, which describes the waiting time until n events have occurred. For temporally distributed events, the Poisson distribution gives the number of events that occur within a preset time, while the Erlang distribution gives the amount of time until the nth event.
  • Skellam distribution, the distribution of the difference of two Poisson variates, not necessarily from the same parent distribution.
  • Incomplete gamma function used to calculate the CDF.
  • Dobinski's formula (on combinatorial interpretation of the moments of the Poisson distribution)

References

  • Knuth, D. E. The Art of Computer Programming, Vol. II: Seminumerical Algorithms. Addison-Wesley, 1969.
  • Ahrens, J. H. and Dieter, U. Computer Methods for Sampling from Gamma, Beta, Poisson and Binomial Distributions. Computing 12, 223–246 (1974).

  1. ^ NIST/SEMATECH, '6.3.3.1. Counts Control Charts', e-Handbook of Statistical Methods, <http://www.itl.nist.gov/div898/handbook/pmc/section3/pmc331.htm> [accessed 25 October 2006]
