Chi-square distribution
From Wikipedia, the free encyclopedia
- This article is about the mathematics of the chi-square distribution. For its uses in statistics, see chi-square test.
Probability density function and cumulative distribution function (plots not shown)
Parameters | k ∈ {1, 2, 3, …} degrees of freedom |
---|---|
Support | x ∈ [0, ∞) |
Probability density function (pdf) | (1/(2^(k/2) Γ(k/2))) x^(k/2 − 1) e^(−x/2) |
Cumulative distribution function (cdf) | γ(k/2, x/2) / Γ(k/2) |
Mean | k |
Median | approximately k(1 − 2/(9k))³ |
Mode | k − 2 if k ≥ 2 |
Variance | 2k |
Skewness | √(8/k) |
Excess kurtosis | 12/k |
Entropy | k/2 + ln(2 Γ(k/2)) + (1 − k/2) ψ(k/2) |
mgf | (1 − 2t)^(−k/2) for t < 1/2 |
Char. func. | (1 − 2it)^(−k/2) |
In probability theory and statistics, the chi-square distribution (also chi-squared or χ² distribution) is one of the theoretical probability distributions most widely used in inferential statistics, i.e. in statistical significance tests. It is useful because, under reasonable assumptions, easily calculated quantities can be proven to have distributions that are approximately chi-square if the null hypothesis is true.
If the Xᵢ are k independent, normally distributed random variables with means μᵢ and variances σᵢ², then the random variable

Q = Σᵢ₌₁ᵏ ((Xᵢ − μᵢ)/σᵢ)²

is distributed according to the chi-square distribution. This is usually written

Q ~ χ²ₖ.

The chi-square distribution has one parameter: k, a positive integer that specifies the number of degrees of freedom (i.e. the number of Xᵢ).
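The defining sum of squared standardized normals can be checked empirically. The following sketch (assuming numpy is available; the means, standard deviations, and seed are arbitrary illustrative choices) simulates Q and compares its first two moments to the theoretical values k and 2k:

```python
import numpy as np

rng = np.random.default_rng(0)
k = 5                              # degrees of freedom
mu = np.linspace(-1.0, 1.0, k)     # illustrative means
sigma = np.linspace(0.5, 2.0, k)   # illustrative standard deviations

# Draw many replicates of Q = sum_i ((X_i - mu_i)/sigma_i)^2
n = 200_000
X = rng.normal(mu, sigma, size=(n, k))
Q = (((X - mu) / sigma) ** 2).sum(axis=1)

print(Q.mean())  # close to k = 5
print(Q.var())   # close to 2k = 10
```

The sample mean and variance should land near k and 2k, matching the moments listed in the table above.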
The chi-square distribution is a special case of the gamma distribution: χ²ₖ is a gamma distribution with shape parameter k/2 and scale parameter 2.
The best-known situations in which the chi-square distribution is used are the common chi-square tests for goodness of fit of an observed distribution to a theoretical one, and of the independence of two criteria of classification of qualitative data. However, many other statistical tests lead to a use of this distribution. One example is Friedman's analysis of variance by ranks.
Properties
The chi-square probability density function is

f(x; k) = (1/(2^(k/2) Γ(k/2))) x^(k/2 − 1) e^(−x/2)

where x ≥ 0, and f(x; k) = 0 for x < 0. Here Γ denotes the Gamma function. The cumulative distribution function is:

F(x; k) = γ(k/2, x/2) / Γ(k/2)

where γ(k, z) is the lower incomplete Gamma function.
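The density and distribution function above can be coded directly and checked against a library implementation. A minimal sketch, assuming scipy is available (note that scipy's `gammainc` is already the regularized incomplete gamma, so no division by Γ(k/2) is needed):

```python
import numpy as np
from scipy.special import gamma, gammainc
from scipy.stats import chi2

def chi2_pdf(x, k):
    """Chi-square density f(x; k) for x > 0."""
    return x ** (k / 2 - 1) * np.exp(-x / 2) / (2 ** (k / 2) * gamma(k / 2))

def chi2_cdf(x, k):
    """Chi-square CDF via the regularized lower incomplete gamma function."""
    return gammainc(k / 2, x / 2)

x, k = 3.0, 4
print(chi2_pdf(x, k), chi2.pdf(x, k))   # the two should agree
print(chi2_cdf(x, k), chi2.cdf(x, k))   # the two should agree
```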
Tables of this distribution — usually in its cumulative form — are widely available (see the External links below for online versions), and the function is included in many spreadsheets and all statistical packages.
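In software, the traditional table lookup is replaced by quantile and tail-probability functions. A short sketch using scipy (assumed available; the test statistic 9.2 is an illustrative value, not from the source):

```python
from scipy.stats import chi2

# Upper 5% critical value for k = 3 degrees of freedom,
# as one would read from a printed chi-square table.
k, alpha = 3, 0.05
critical = chi2.ppf(1 - alpha, k)
print(round(critical, 3))   # 7.815, the familiar table entry

# p-value (upper tail) for an observed test statistic of 9.2
p = chi2.sf(9.2, k)
print(round(p, 4))
```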
If X is the sum of the squares of k standard normal variables and p independent linear homogeneous constraints are imposed on these variables, the distribution of X conditional on these constraints is χ²ₖ₋ₚ, justifying the term "degrees of freedom". The characteristic function of the chi-square distribution is

φ(t) = (1 − 2it)^(−k/2).
The chi-square distribution has numerous applications in inferential statistics, for instance in chi-square tests and in estimating variances. It enters the problem of estimating the mean of a normally distributed population and the problem of estimating the slope of a regression line via its role in Student's t-distribution. It enters all analysis of variance problems via its role in the F-distribution, which is the distribution of the ratio of two independent chi-squared random variables divided by their respective degrees of freedom.
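The stated relation to the F-distribution can be verified by simulation. A sketch assuming numpy and scipy (degrees of freedom, sample size, and seed are arbitrary choices): form the ratio of two independent chi-square variables, each divided by its degrees of freedom, and compare the sample to F(k₁, k₂) with a Kolmogorov-Smirnov test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
k1, k2, n = 4, 7, 100_000

# Ratio of two independent chi-square variables, each divided by its df
F = (rng.chisquare(k1, n) / k1) / (rng.chisquare(k2, n) / k2)

# Compare the simulated ratio to the F(k1, k2) distribution
ks = stats.kstest(F, stats.f(k1, k2).cdf)
print(ks.statistic)   # small: the sample is consistent with F(4, 7)
```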
The normal approximation
If X ~ χ²ₖ, then as k tends to infinity, the distribution of X tends to normality. However, the tendency is slow (the skewness is √(8/k) and the excess kurtosis is 12/k) and two transformations are commonly considered, each of which approaches normality faster than X itself:

Fisher showed that √(2X) is approximately normally distributed with mean √(2k − 1) and unit variance.

Wilson and Hilferty showed in 1931 that (X/k)^(1/3) is approximately normally distributed with mean 1 − 2/(9k) and variance 2/(9k).
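The quality of the Wilson-Hilferty transformation can be seen by comparing the normal approximation of the CDF against the exact chi-square CDF. A sketch assuming scipy is available (k = 10 and the evaluation grid are illustrative choices):

```python
import numpy as np
from scipy.stats import chi2, norm

k = 10
x = np.linspace(0.5, 40, 400)

# Wilson-Hilferty: (X/k)^(1/3) is approximately N(1 - 2/(9k), 2/(9k))
m, v = 1 - 2 / (9 * k), 2 / (9 * k)
approx_cdf = norm.cdf(((x / k) ** (1 / 3) - m) / np.sqrt(v))

err = np.max(np.abs(approx_cdf - chi2.cdf(x, k)))
print(err)   # maximum absolute CDF error; small even for moderate k
```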
The expected value of a random variable having chi-square distribution with k degrees of freedom is k and the variance is 2k. The median is given approximately by

median ≈ k(1 − 2/(9k))³.
Note that 2 degrees of freedom leads to an exponential distribution with mean 2.
The information entropy is given by

H = k/2 + ln(2 Γ(k/2)) + (1 − k/2) ψ(k/2)

where ψ(x) is the Digamma function.
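The entropy formula can be cross-checked against a library value. A minimal sketch, assuming scipy is available (using `gammaln` for ln Γ and `psi` for the Digamma function):

```python
import numpy as np
from scipy.special import gammaln, psi
from scipy.stats import chi2

def chi2_entropy(k):
    """H = k/2 + ln(2 Gamma(k/2)) + (1 - k/2) psi(k/2)."""
    return k / 2 + np.log(2) + gammaln(k / 2) + (1 - k / 2) * psi(k / 2)

for k in (1, 2, 5, 10):
    print(chi2_entropy(k), chi2(k).entropy())   # the two should agree
```

For k = 2 this reduces to 1 + ln 2, the entropy of an exponential distribution with mean 2, matching the note above.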
Related distributions
- X ~ Exponential(λ = 1/2) is an exponential distribution (where λ is a survival parameter) if X ~ χ²₂ (with 2 degrees of freedom).
- Q ~ χ²ₖ is a chi-square distribution if Q = Σᵢ₌₁ᵏ Xᵢ² for independent Xᵢ ~ N(0, 1) that are normally distributed. If the Xᵢ have nonzero means, then Σ Xᵢ² is drawn from a noncentral chi-square distribution.
- F ~ F(k₁, k₂) is an F-distribution if F = (X₁/k₁)/(X₂/k₂) where X₁ ~ χ²ₖ₁ and X₂ ~ χ²ₖ₂ are independent with their respective degrees of freedom.
- Y ~ χ²ₖ is a chi-square distribution if Y = Σᵢ₌₁ⁿ Xᵢ where the Xᵢ ~ χ²ₖᵢ are independent and k = Σᵢ₌₁ⁿ kᵢ.
- if X is chi-square distributed, then √X is chi distributed.
- if Xᵢ are i.i.d. N(μ, σ²) random variables, then Σᵢ₌₁ⁿ (Xᵢ − X̄)² ~ σ² χ²ₙ₋₁ where X̄ = (1/n) Σᵢ₌₁ⁿ Xᵢ.
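The last relation, which underlies variance estimation, can be verified by simulation. A sketch assuming numpy and scipy (the sample size, mean, standard deviation, and seed are arbitrary illustrative choices):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, mu, sigma = 8, 3.0, 1.5

# Draw many samples and form sum((X_i - Xbar)^2) / sigma^2 for each
X = rng.normal(mu, sigma, size=(100_000, n))
S = ((X - X.mean(axis=1, keepdims=True)) ** 2).sum(axis=1) / sigma ** 2

# The statistic should follow chi-square with n - 1 degrees of freedom
ks = stats.kstest(S, stats.chi2(n - 1).cdf)
print(S.mean())        # close to n - 1 = 7
print(ks.statistic)    # small: consistent with chi-square(7)
```

The loss of one degree of freedom (n − 1 rather than n) reflects the single linear constraint introduced by centering at the sample mean, as discussed under Properties.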
Name | Statistic |
---|---|
chi-square distribution | Σᵢ₌₁ᵏ ((Xᵢ − μᵢ)/σᵢ)² |
noncentral chi-square distribution | Σᵢ₌₁ᵏ (Xᵢ/σᵢ)² |
chi distribution | √(Σᵢ₌₁ᵏ ((Xᵢ − μᵢ)/σᵢ)²) |
noncentral chi distribution | √(Σᵢ₌₁ᵏ (Xᵢ/σᵢ)²) |
See also
- Cochran's theorem
- Inverse-chi-square distribution
- Degrees of freedom (statistics)
- Fisher's method for combining independent tests of significance
External links
- SixSigmaFirst, on-line tutorials for Six Sigma and statistics
- On-line calculator for the significance of chi-square, in Richard Lowry's statistical website at Vassar College.
- Distribution Calculator: calculates probabilities and critical values for the normal, t-, chi-square, and F-distributions
- Chi-Square Calculator for critical values of Chi-Square in R. Webster West's applet website at University of South Carolina
- Chi-Square Calculator from GraphPad
- Chi-square tutorial