Chi-square distribution

chi-square

[Figure: plot of the probability density function]
[Figure: plot of the cumulative distribution function]

Parameters: k > 0, the degrees of freedom
Support: x \in [0, +\infty)
Probability density function (pdf): \frac{(1/2)^{k/2}}{\Gamma(k/2)} x^{k/2 - 1} e^{-x/2}
Cumulative distribution function (cdf): \frac{\gamma(k/2,\,x/2)}{\Gamma(k/2)}
Mean: k
Median: approximately k - 2/3
Mode: k - 2 for k \geq 2
Variance: 2k
Skewness: \sqrt{8/k}
Excess kurtosis: 12/k
Entropy: \frac{k}{2} + \ln(2\Gamma(k/2)) + (1 - k/2)\psi(k/2)
Moment-generating function (mgf): (1 - 2t)^{-k/2} for t < 1/2
Characteristic function: (1 - 2it)^{-k/2}

In probability theory and statistics, the chi-square distribution (also chi-squared or \chi^2 distribution) is one of the most widely used theoretical probability distributions in inferential statistics, e.g. in statistical significance tests.[1][2][3][4] It is useful because, under reasonable assumptions, easily calculated quantities can be shown to have distributions that approximate the chi-square distribution when the null hypothesis is true.

The best-known situations in which the chi-square distribution is used are the common chi-square tests: for goodness of fit of an observed distribution to a theoretical one, and for the independence of two criteria of classification of qualitative data. Many other statistical tests also lead to this distribution, such as Friedman's analysis of variance by ranks.

Definition

If X_1, \ldots, X_k are independent, normally distributed random variables with mean 0 and variance 1, then the random variable

Q = \sum_{i=1}^k X_i^2

is distributed according to the chi-square distribution with k degrees of freedom. This is usually written

Q\sim\chi^2_k.\,

The chi-square distribution has one parameter, k, a positive integer that specifies the number of degrees of freedom (i.e. the number of X_i).

The chi-square distribution is a special case of the gamma distribution: \chi^2_k is a gamma distribution with shape parameter k/2 and scale parameter 2.

Characteristics

Probability density function

The probability density function of the chi-square distribution is


f(x;k)=
\begin{cases}\displaystyle
\frac{1}{2^{k/2}\Gamma(k/2)}\,x^{(k/2) - 1} e^{-x/2}&\text{for }x>0,\\
0&\text{for }x\le0,
\end{cases}

where \Gamma denotes the gamma function, which has closed-form values at the half-integers.
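The pdf above can be evaluated directly with the standard library; the function name chi2_pdf below is illustrative, and only the math module is used.

```python
import math

def chi2_pdf(x, k):
    """Chi-square density f(x; k) for k degrees of freedom, as given above."""
    if x <= 0:
        return 0.0
    return x ** (k / 2 - 1) * math.exp(-x / 2) / (2 ** (k / 2) * math.gamma(k / 2))

# For k = 2 the density reduces to (1/2) exp(-x/2), an exponential density.
print(chi2_pdf(2.0, 2))        # equals 0.5 * exp(-1)
print(0.5 * math.exp(-1))
```

For k = 1 the same function gives y^{-1/2} e^{-y/2} / \sqrt{2\pi}, matching the one-degree-of-freedom derivation later in the article.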

Cumulative distribution function

Its cumulative distribution function is:

F(x;k)=\frac{\gamma(k/2,x/2)}{\Gamma(k/2)} = P(k/2, x/2)

where \gamma(k/2, x/2) is the lower incomplete gamma function and P(k/2, x/2) is the regularized lower incomplete gamma function.

Tables of this distribution — usually in its cumulative form — are widely available and the function is included in many spreadsheets and all statistical packages.
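The Python standard library has no incomplete gamma function, so the sketch below implements the power series for P(s, x) itself; reg_lower_gamma and chi2_cdf are illustrative names, not library functions.

```python
import math

def reg_lower_gamma(s, x, tol=1e-12):
    """Regularized lower incomplete gamma P(s, x) = gamma(s, x) / Gamma(s),
    using the series gamma(s, x) = x^s e^{-x} * sum_n x^n / (s(s+1)...(s+n))."""
    if x <= 0:
        return 0.0
    term = 1.0 / s
    total = term
    n = 0
    while term > tol * total:
        n += 1
        term *= x / (s + n)
        total += term
    return total * math.exp(s * math.log(x) - x - math.lgamma(s))

def chi2_cdf(x, k):
    """Chi-square cdf F(x; k) = P(k/2, x/2)."""
    return reg_lower_gamma(k / 2, x / 2)

# For k = 2 the cdf reduces to 1 - exp(-x/2).
print(chi2_cdf(2.0, 2))        # close to 1 - exp(-1), about 0.6321
```

For k = 1 this agrees with F(x; 1) = erf(\sqrt{x/2}), which follows from the derivation for one degree of freedom below.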

Characteristic function

The characteristic function of the chi-square distribution is

\chi(t;k)=(1-2it)^{-k/2}.\,
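The characteristic function can be checked by Monte Carlo: averaging exp(itQ) over simulated draws of Q should approach (1 - 2it)^{-k/2}. A sketch with illustrative choices of k, t and sample size:

```python
import cmath
import random

random.seed(1)
k, t, n = 3, 0.2, 100_000

# Average exp(i t Q) over draws of Q = sum of k squared standard normals.
acc = 0 + 0j
for _ in range(n):
    q = sum(random.gauss(0, 1) ** 2 for _ in range(k))
    acc += cmath.exp(1j * t * q)
mc = acc / n

exact = (1 - 2j * t) ** (-k / 2)
print(mc, exact)   # the two agree to a couple of decimal places
```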

Expected value and variance

If X\sim\chi^2_k then

\mathrm{E}(X)=k
\mathrm{Var}(X)=2k
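Both moments can be verified by simulating Q directly from its definition as a sum of squared standard normals; k and the sample size below are illustrative.

```python
import random

random.seed(0)
k, n = 5, 100_000

# Draws of Q = X_1^2 + ... + X_k^2 with X_i standard normal.
samples = [sum(random.gauss(0, 1) ** 2 for _ in range(k)) for _ in range(n)]

mean = sum(samples) / n
var = sum((q - mean) ** 2 for q in samples) / n
print(mean)   # close to k = 5
print(var)    # close to 2k = 10
```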

Median

The median of X\sim\chi^2_k is given approximately by

k-\frac{2}{3}+\frac{4}{27k}-\frac{8}{729k^2}.
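The approximation can be compared against an exactly known case: for k = 2 the chi-square distribution is exponential with rate 1/2, whose median is 2 ln 2. A small check:

```python
import math

def chi2_median_approx(k):
    """Series approximation to the chi-square median given above."""
    return k - 2 / 3 + 4 / (27 * k) - 8 / (729 * k ** 2)

# For k = 2 the exact median is 2 ln 2; the series is within about 0.02 here,
# and its accuracy improves as k grows.
print(chi2_median_approx(2))   # about 1.4047
print(2 * math.log(2))         # about 1.3863
```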

Information entropy

The information entropy is given by

H = -\int_0^\infty f(x;k)\ln(f(x;k))\,dx = \frac{k}{2} + \ln\left(2\Gamma\left(\frac{k}{2}\right)\right) + \left(1 - \frac{k}{2}\right)\psi\left(\frac{k}{2}\right),

where \psi(x) is the digamma function.
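The entropy formula can be evaluated with the standard library by approximating the digamma function as a central difference of log-gamma (an approximation, since math has no digamma); for k = 2 the distribution is Exp(1/2), whose entropy is exactly 1 + ln 2.

```python
import math

def digamma(x, h=1e-5):
    """Numerical digamma psi(x) via a central difference of lgamma."""
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2 * h)

def chi2_entropy(k):
    """Entropy of the chi-square distribution, using the formula above."""
    return k / 2 + math.log(2 * math.gamma(k / 2)) + (1 - k / 2) * digamma(k / 2)

print(chi2_entropy(2))          # close to 1 + ln 2, about 1.6931
print(digamma(1.0))             # close to -0.5772, i.e. -gamma (Euler-Mascheroni)
```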

Derivation of the pdf for one degree of freedom

Let Y = X^2, where X \sim N(0,1). Then for y > 0,

P(Y < y) = P(X^2 < y) = P(-\sqrt{y} < X < \sqrt{y}) = F_X(\sqrt{y}) - F_X(-\sqrt{y}).

Differentiating with respect to y gives

f_Y(y) = f_X(\sqrt{y})\,\frac{\partial \sqrt{y}}{\partial y} - f_X(-\sqrt{y})\,\frac{\partial (-\sqrt{y})}{\partial y}
       = \frac{1}{\sqrt{2\pi}}e^{-y/2}\frac{1}{2\sqrt{y}} + \frac{1}{\sqrt{2\pi}}e^{-y/2}\frac{1}{2\sqrt{y}}
       = \frac{1}{2^{1/2}\,\Gamma(1/2)}\,y^{1/2 - 1}e^{-y/2},

using \Gamma(1/2) = \sqrt{\pi}. This is the chi-square pdf with one degree of freedom, so Y = X^2 \sim \chi^2_1.
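The cdf step of this derivation can be checked numerically: P(X^2 < y) = P(-\sqrt{y} < X < \sqrt{y}) = erf(\sqrt{y/2}) for standard normal X. A small simulation sketch:

```python
import math
import random

random.seed(2)
n, y = 200_000, 1.0

# Fraction of squared standard normals falling below y.
frac = sum(random.gauss(0, 1) ** 2 < y for _ in range(n)) / n

exact = math.erf(math.sqrt(y / 2))   # P(|X| < sqrt(y)) for X ~ N(0, 1)
print(frac, exact)   # both close to 0.6827, the one-sigma probability
```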

Related distributions and properties

The chi-square distribution has numerous applications in inferential statistics, for instance in chi-square tests and in estimating variances. It enters the problem of estimating the mean of a normally distributed population and the problem of estimating the slope of a regression line via its role in Student's t-distribution. It enters all analysis of variance problems via its role in the F-distribution, which is the distribution of the ratio of two independent chi-squared random variables divided by their respective degrees of freedom.
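The F connection can be illustrated by simulation: the ratio (Q_1/d_1)/(Q_2/d_2) of two independent chi-square variables over their degrees of freedom is F-distributed, with mean d_2/(d_2 - 2) for d_2 > 2. A sketch with illustrative d_1, d_2:

```python
import random

random.seed(3)
d1, d2, n = 5, 10, 50_000

def chi2_draw(k):
    """One chi-square draw with k df, as a sum of squared standard normals."""
    return sum(random.gauss(0, 1) ** 2 for _ in range(k))

# F-distributed samples: ratio of two independent chi-squares over their dfs.
samples = [(chi2_draw(d1) / d1) / (chi2_draw(d2) / d2) for _ in range(n)]
mean = sum(samples) / n
print(mean)   # close to d2 / (d2 - 2) = 1.25
```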

In the table below, X_1, \ldots, X_k are independent normal random variables with X_i \sim N(\mu_i, \sigma_i^2).

Name: Statistic
chi-square distribution: \sum_{i=1}^k \frac{\left(X_i-\mu_i\right)^2}{\sigma_i^2}
noncentral chi-square distribution: \sum_{i=1}^k \left(\frac{X_i}{\sigma_i}\right)^2
chi distribution: \sqrt{\sum_{i=1}^k \left(\frac{X_i-\mu_i}{\sigma_i}\right)^2}
noncentral chi distribution: \sqrt{\sum_{i=1}^k \left(\frac{X_i}{\sigma_i}\right)^2}

References

  1. Abramowitz, Milton; Stegun, Irene A., eds. (1965). "Chapter 26". Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables. New York: Dover. ISBN 0-486-61272-4.
  2. NIST (2006). Engineering Statistics Handbook - Chi-Square Distribution.
  3. Johnson, N.L.; Kotz, S.; Balakrishnan, N. (1994). Continuous Univariate Distributions (2nd ed., Vol. 1, Chapter 18). John Wiley and Sons. ISBN 0-471-58495-9.
  4. Mood, Alexander; Graybill, Franklin A.; Boes, Duane C. (1974). Introduction to the Theory of Statistics (3rd ed., pp. 241-246). McGraw-Hill. ISBN 0-07-042864-6.
