Stability (probability)

In probability theory and statistics, the stability of a family of probability distributions is the property that a linear combination of independent random variates whose distributions belong to the family again has a distribution in the family. Here the family in question is a location-scale family, consisting of probability distributions that differ from one another only in location and scale, and a random variate is "in the family" when its distribution function is a member of that family.

The importance of a stable family of probability distributions is that its members serve as "attractors" for sums of independent random variates drawn from distributions outside the family. The most familiar example is the family of normal distributions: by the classical central limit theorem, the suitably normalized sum of a set of independent random variates, each with finite variance, tends towards a normal distribution as the number of variates increases.
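
As an illustrative sketch (assuming a Python environment with NumPy and SciPy; the uniform summands and the sample sizes are arbitrary choices), this attraction towards the normal family can be checked by simulation:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_terms, n_samples = 50, 10_000

# Each row is one realization of a sum of n_terms independent Uniform(0, 1) variates.
sums = rng.uniform(0.0, 1.0, size=(n_samples, n_terms)).sum(axis=1)

# Standardize using the exact mean and standard deviation of the sum.
z = (sums - n_terms * 0.5) / np.sqrt(n_terms / 12.0)

# A Kolmogorov-Smirnov test against the standard normal should not reject.
print(stats.kstest(z, "norm"))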

Another family of stable distributions is that of the Cauchy distributions. In this case, the generalization of the central limit theorem due to Gnedenko and Kolmogorov states that the suitably normalized sum of independent random variates whose tail probability falls off as 1/|x| will tend to a Cauchy distribution.

Finally, every continuous stable distribution can be obtained by a suitable choice of α and β in the Lévy skew alpha-stable distribution. The generalized central limit theorem likewise states that the suitably normalized sum of independent random variates whose tail probability falls off as 1/|x|^α, with 0 < α < 2, will tend to a Lévy skew alpha-stable distribution with that value of α, and with β determined by the asymmetry of the tails (β = 0 when the tails are symmetric).
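
A similar sketch (again assuming Python with NumPy and SciPy; the symmetric Pareto-type summands are an arbitrary choice with the required 1/|x| tails) illustrates the attraction towards the Cauchy distribution rather than the normal one:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_terms, n_samples = 200, 5_000

# Symmetric heavy-tailed variates: a random sign times a Pareto variate with
# tail index 1, so that P(|X| > x) falls off like 1/x (i.e. alpha = 1).
signs = rng.choice([-1.0, 1.0], size=(n_samples, n_terms))
magnitudes = stats.pareto.rvs(b=1.0, size=(n_samples, n_terms), random_state=rng)

# For alpha = 1 the appropriate normalization of the sum is by n, not sqrt(n).
y = (signs * magnitudes).sum(axis=1) / n_terms

# A fitted Cauchy describes the result far better than a fitted normal.
loc, scale = stats.cauchy.fit(y)
print("Cauchy fit:", stats.kstest(y, "cauchy", args=(loc, scale)))
print("Normal fit:", stats.kstest(y, "norm", args=(y.mean(), y.std())))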

Definition

A random variable represents the possible outcomes of a set of events or a process; if it is real-valued, any particular realization is a real number. We restrict ourselves here to continuous distributions (the results extend easily to discrete distributions). The probability that the value lies between x and x + dx is given by the probability density function (PDF):

\textrm{Prob}(x<X<x+dx)=f(x)\,dx

Given a particular distribution, we can define a family of such distributions, all related to one another by a shift and a change of scale. Suppose we pick a probability density function f(x) as the "standard" member of the family. Then any other density in the family may be written as:

f(x;\mu,c) = \frac{1}{c}\,f\!\left(\frac{x-\mu}{c}\right)

In other words, the density f(x;μ,c) is just the standard density f(x) with its argument shifted by the shift parameter μ and stretched by the scale factor c, with the prefactor 1/c keeping it normalized. If the name of this family of distributions is "Fam", then the notation

X \sim \textrm{Fam}(\mu,c)

means that the distribution of X is a member of the family "Fam". Suppose X1 and X2 are independent random variables whose distributions belong to the same family:

X_1 \sim \textrm{Fam}(\mu_1,c_1)
X_2 \sim \textrm{Fam}(\mu_2,c_2)

We can form a new random variable Y as a linear combination of X1 and X2:

Y=aX_1+bX_2.\,

Y will now have a distribution of its own. The distribution family of X1 and X2 is said to be stable if the distribution of Y belongs to that same family to within an additive constant; in other words, if there exist some μ and c such that

Y \sim \textrm{Fam}(\mu,c)+K

where K is a constant. If the constant K is always zero for any values of μ1, μ2, c1 and c2, then the distribution family is said to be strictly stable.
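
As a concrete check of this definition for the normal family (a minimal sketch assuming Python with NumPy and SciPy; the parameter values are arbitrary), one can simulate Y = aX1 + bX2 and test that it is again normal, with the combined parameters μ = aμ1 + bμ2 and σ² = a²σ1² + b²σ2² derived in the Examples section below:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a, b = 2.0, -0.5
mu1, sigma1 = 1.0, 2.0
mu2, sigma2 = -3.0, 0.7

# Two independent normal variates from the same location-scale family.
x1 = rng.normal(mu1, sigma1, size=100_000)
x2 = rng.normal(mu2, sigma2, size=100_000)
y = a * x1 + b * x2

# Stability of the normal family: Y should again be normal, with
# mu = a*mu1 + b*mu2 and sigma^2 = a^2*sigma1^2 + b^2*sigma2^2 (and K = 0,
# so the family is in fact strictly stable).
mu = a * mu1 + b * mu2
sigma = np.hypot(a * sigma1, b * sigma2)
print(stats.kstest(y, "norm", args=(mu, sigma)))  # should not reject normality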

Calculating the PDF for the linear combination

To determine whether a family is stable, we need to be able to calculate the PDF of the variable Y. The probability that Y takes a value between y and y + dy is the integral, over all x1 and x2 satisfying y = ax1 + bx2, of the probability that X1 lies between x1 and x1 + dx1 and X2 lies between x2 and x2 + dx2 (X1 and X2 being independent). In other words, it is a scaled convolution:

f(y;\mu,c) = \frac{1}{|b|}\int_{-\infty}^\infty f(x_1;\mu_1,c_1)\,f\!\left(\frac{y-ax_1}{b};\mu_2,c_2\right)dx_1 \!
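
The integral can be evaluated numerically; the following sketch (assuming Python with NumPy and SciPy, and using normal densities only because the answer is known in closed form) compares the convolution with the predicted member of the family:

import numpy as np
from scipy import stats
from scipy.integrate import quad

a, b = 1.5, 0.8
mu1, sigma1 = 0.0, 1.0
mu2, sigma2 = 2.0, 0.5

def f_y(y):
    """PDF of Y = a*X1 + b*X2 obtained from the convolution integral."""
    integrand = lambda x1: (stats.norm.pdf(x1, mu1, sigma1)
                            * stats.norm.pdf((y - a * x1) / b, mu2, sigma2) / abs(b))
    value, _ = quad(integrand, -np.inf, np.inf)
    return value

# Stability predicts that Y is normal with the combined location and scale.
mu = a * mu1 + b * mu2
sigma = np.sqrt((a * sigma1) ** 2 + (b * sigma2) ** 2)

for y in (-1.0, 1.0, 3.0):
    print(y, f_y(y), stats.norm.pdf(y, mu, sigma))  # the two values should agree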

A simpler way of checking whether a distribution family is stable is to use the characteristic function of the distribution, which is just the Fourier transform of the PDF. The characteristic function of a sum of independent random variables is the product of their individual characteristic functions, and rescaling a variable by a replaces the argument t by at. So if \varphi(t;\mu,c) is the characteristic function of the family, stability requires that, for some μ and c,

\varphi(t;\mu,c)=\varphi(at;\mu_1,c_1)\varphi(bt;\mu_2,c_2)
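
This factorization can also be checked empirically; the sketch below (assuming Python with NumPy; the normal variates are an arbitrary choice) estimates each characteristic function as a sample average of exp(itX):

import numpy as np

rng = np.random.default_rng(0)
a, b = 1.2, 0.4
x1 = rng.normal(1.0, 2.0, size=200_000)
x2 = rng.normal(-0.5, 1.5, size=200_000)
y = a * x1 + b * x2

def ecf(samples, t):
    """Empirical characteristic function, i.e. the sample mean of exp(i*t*X)."""
    return np.exp(1j * t * samples).mean()

# For independent X1 and X2, phi_Y(t) should equal phi_X1(a*t) * phi_X2(b*t);
# the small discrepancies printed below are sampling error only.
for t in (0.1, 0.3, 0.6):
    print(t, abs(ecf(y, t) - ecf(x1, a * t) * ecf(x2, b * t)))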

Examples

The most familiar stable distribution is the normal distribution with PDF

f(x;\mu,\sigma^2)=\frac1{\sqrt{2\pi\sigma^2}}\; \exp\left(-\frac{\left(x-\mu\right)^2}{2\sigma^2} \right)

and characteristic function:

\varphi(t;\mu,\sigma^2)=\exp\left(i\mu t-\frac{\sigma^2 t^2}{2}\right)

The stability property can be seen immediately by noting that the product of two such characteristic functions is again of the same form:

\varphi(at;\mu_1,\sigma_1^2)\varphi(bt;\mu_2,\sigma_2^2) =  \exp\left(i\mu t-\frac{\sigma^2 t^2}{2}\right)

where

\mu=a\mu_1+b\mu_2\,
\sigma^2=a^2\sigma_1^2+b^2\sigma_2^2
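
Multiplying the characteristic functions amounts to adding their exponents, so the identity above can be verified symbolically; a minimal sketch, assuming SymPy is available:

import sympy as sp

t, a, b = sp.symbols("t a b", real=True)
mu1, mu2 = sp.symbols("mu1 mu2", real=True)
s1, s2 = sp.symbols("sigma1 sigma2", positive=True)

def log_phi(arg, mu, sigma):
    """Exponent of the normal characteristic function evaluated at arg."""
    return sp.I * mu * arg - sigma**2 * arg**2 / 2

# Exponent of the product phi(a*t; mu1, sigma1^2) * phi(b*t; mu2, sigma2^2) ...
lhs = sp.expand(log_phi(a * t, mu1, s1) + log_phi(b * t, mu2, s2))

# ... compared with the exponent of a normal characteristic function having
# mu = a*mu1 + b*mu2 and sigma^2 = a^2*sigma1^2 + b^2*sigma2^2.
rhs = sp.expand(sp.I * (a * mu1 + b * mu2) * t
                - (a**2 * s1**2 + b**2 * s2**2) * t**2 / 2)

print(sp.simplify(lhs - rhs) == 0)  # True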

Relationships for μ and c

The most general stable distribution is the Lévy skew alpha-stable distribution, of which the normal distribution is a special case. Two other special cases have densities expressible in closed form: the Lévy distribution and the Cauchy distribution. In general, however, a Lévy skew alpha-stable distribution is known only through its characteristic function:

\varphi(t;\alpha,\beta,c,\mu) =  \exp\left[~it\mu\!-\!|c t|^\alpha\,(1\!-\!i \beta\,\textrm{sign}(t)\Phi(t))~\right]

where Φ = tan(πα/2) for all α except α = 1, in which case Φ = -(2/π)log|t|. As in the example above, if X1 and X2 are independent and distributed according to two Lévy skew alpha-stable distributions with the same α and β, then (for a, b > 0 and α ≠ 1) the distribution of Y is a Lévy skew alpha-stable distribution with the same α and β and with:

\mu = a\mu_1+b\mu_2\,
c^\alpha = (ac_1)^\alpha+(bc_2)^\alpha\,

The set of all Lévy skew alpha-stable distributions with a given α and β forms a family of stable distributions, and conversely every family of stable distributions is a special case of the Lévy skew alpha-stable distribution for a particular choice of α and β. Because every stable family arises in this way, the above relationships for μ and c apply to any stable distribution.
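
These relationships can be checked by simulation; the following sketch assumes SciPy's levy_stable distribution in its "S1" parameterization (which matches the characteristic function above for α ≠ 1), with positive a and b and arbitrary example parameters:

import numpy as np
from scipy import stats

# Use the "S1" parameterization, which corresponds to the characteristic
# function given above (for alpha != 1).
stats.levy_stable.parameterization = "S1"

rng = np.random.default_rng(0)
alpha, beta = 1.5, 0.3
a, b = 1.0, 2.0
mu1, c1 = 0.5, 1.0
mu2, c2 = -1.0, 0.7

x1 = stats.levy_stable.rvs(alpha, beta, loc=mu1, scale=c1, size=20_000, random_state=rng)
x2 = stats.levy_stable.rvs(alpha, beta, loc=mu2, scale=c2, size=20_000, random_state=rng)
y = a * x1 + b * x2

# Parameters of Y predicted by the relationships above (a, b > 0, alpha != 1).
mu = a * mu1 + b * mu2
c = ((a * c1) ** alpha + (b * c2) ** alpha) ** (1.0 / alpha)

# Empirical quantiles of Y against quantiles of the predicted stable distribution.
qs = [0.1, 0.25, 0.5, 0.75, 0.9]
print(np.quantile(y, qs))
print(stats.levy_stable.ppf(qs, alpha, beta, loc=mu, scale=c))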

External links and references

  • Information on stable distributions. Retrieved on July 13, 2005. - John P. Nolan's introduction to stable distributions, some papers on stable laws, and a free program to compute stable densities, cumulative distribution functions, quantiles, estimate parameters, etc.