Probability mass function

In probability theory, a probability mass function (abbreviated pmf) gives the probability that a discrete random variable is exactly equal to some value. A probability mass function differs from a probability density function in that the latter is defined only for continuous random variables and its values are not themselves probabilities; rather, its integral over a set of possible values of the random variable is a probability.

Mathematical description

Suppose that X is a discrete random variable, taking values in some countable subset S ⊆ ℝ. Then the probability mass function f_X(x) for X is given by

f_X(x) = \begin{cases} \Pr(X = x), & x \in S, \\ 0, & x \in \mathbb{R}\setminus S. \end{cases}

Note that this explicitly defines f_X(x) for all real numbers, including all values in ℝ that X could never take; indeed, it assigns such values a probability of zero. (Alternatively, think of Pr(X = x) as 0 when x ∈ ℝ\S.)
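As a concrete, purely illustrative sketch, the following Python function implements a pmf of this form for a fair six-sided die; the die, the name f_X, and the use of exact fractions are assumptions made for the example, not part of the definition.

```python
from fractions import Fraction

# Illustrative sketch: the pmf of a fair six-sided die, defined for every
# real x. Values in the support S = {1, ..., 6} get probability 1/6, and
# every x in R \ S gets probability 0, mirroring the case split above.
def f_X(x):
    S = {1, 2, 3, 4, 5, 6}
    return Fraction(1, 6) if x in S else Fraction(0)

assert sum(f_X(k) for k in range(1, 7)) == 1  # probabilities over S sum to 1
assert f_X(2.5) == 0                          # a value X can never take
```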

The discontinuity of probability mass functions reflects the fact that the cumulative distribution function of a discrete random variable is also discontinuous. Where it is differentiable (i.e., where x ∈ ℝ\S), the derivative is zero, just as the probability mass function is zero at all such points.
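To see this concretely, here is a hedged sketch of the cumulative distribution function of the die from the sketch above: a step function that jumps only at points of S = {1, ..., 6} and is flat, with derivative zero, everywhere else.

```python
import math

# Sketch of the CDF of a fair six-sided die: F_X(x) = Pr(X <= x).
# It is discontinuous exactly at the support points 1, ..., 6 and is
# constant between them, so its derivative is zero wherever it exists.
def F_X(x):
    if x < 1:
        return 0.0
    if x >= 6:
        return 1.0
    return math.floor(x) / 6

h = 1e-9
assert (F_X(3.5 + h) - F_X(3.5 - h)) / (2 * h) == 0.0  # flat between jumps
assert F_X(3.0) - F_X(3.0 - h) > 0.1                   # jump of 1/6 at x = 3
```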

Examples

A simple example of a probability mass function is the following. Suppose that X is the outcome of a single coin toss, assigning 0 to tails and 1 to heads. The probability that X = x is 0.5 for each x in the state space {0, 1} (this makes X a Bernoulli random variable), and hence the probability mass function is

f_X(x) = \begin{cases} \frac{1}{2}, & x \in \{0, 1\}, \\ 0, & x \in \mathbb{R}\setminus\{0, 1\}. \end{cases}
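Expressed as code, this pmf might look like the following minimal sketch (the function name is an illustrative choice):

```python
# Sketch of the fair-coin pmf: 1/2 on the state space {0, 1}, zero elsewhere.
def coin_pmf(x):
    return 0.5 if x in (0, 1) else 0.0

assert coin_pmf(0) + coin_pmf(1) == 1.0  # total probability is 1
assert coin_pmf(0.5) == 0.0              # impossible outcomes get probability 0
```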

Probability mass functions can likewise be defined for any other discrete random variable, including constant, binomial (of which the Bernoulli above is a special case), negative binomial, Poisson, geometric, and hypergeometric random variables.
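In practice such pmfs are often evaluated with a numerical library rather than written by hand; for instance, SciPy's scipy.stats module (assuming SciPy is available) exposes each of these distributions with a pmf method:

```python
from scipy import stats

# Evaluating the pmf of several standard discrete distributions at a point.
print(stats.bernoulli.pmf(1, p=0.5))            # fair coin: Pr(X = 1) = 0.5
print(stats.binom.pmf(3, n=10, p=0.5))          # 3 successes in 10 trials
print(stats.nbinom.pmf(2, n=5, p=0.5))          # negative binomial
print(stats.poisson.pmf(2, mu=1.0))             # Poisson with mean 1
print(stats.geom.pmf(4, p=0.25))                # first success on trial 4
print(stats.hypergeom.pmf(2, M=20, n=7, N=12))  # hypergeometric
```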