Quadratic form (statistics)
From Wikipedia, the free encyclopedia
If ε is a vector of n random variables and Λ is an n × n matrix, then the scalar quantity ε'Λε is known as a quadratic form in ε.
Expectation
It can be shown that

    E[ε'Λε] = tr(ΛΣ) + μ'Λμ,

where μ and Σ are the expected value and variance-covariance matrix of ε, respectively. This result depends only on the existence of μ and Σ; in particular, normality of ε is not required.
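The expectation formula is easy to check by simulation. The following sketch (not part of the original article; the dimension, mean, and matrices are arbitrary example values) compares a Monte Carlo estimate of E[ε'Λε] against tr(ΛΣ) + μ'Λμ:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
mu = np.array([1.0, -0.5, 2.0])          # example mean vector (arbitrary)
A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)          # a valid (positive definite) covariance matrix
Lam = rng.standard_normal((n, n))        # Λ need not be symmetric for the expectation

# Theoretical expectation: E[ε'Λε] = tr(ΛΣ) + μ'Λμ
expected = np.trace(Lam @ Sigma) + mu @ Lam @ mu

# Monte Carlo estimate over many draws of ε
eps = rng.multivariate_normal(mu, Sigma, size=200_000)
quad = np.einsum('ij,jk,ik->i', eps, Lam, eps)   # ε'Λε for each draw
print(quad.mean(), expected)                      # the two should be close
```

The result holds for any distribution with these first two moments; a normal distribution is used here only for convenience of sampling.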
Variance
In general, the variance of a quadratic form depends greatly on the distribution of ε. However, if ε does follow a multivariate normal distribution, the variance of the quadratic form becomes particularly tractable. Assume for the moment that Λ is a symmetric matrix. Then

    var[ε'Λε] = 2 tr(ΛΣΛΣ) + 4μ'ΛΣΛμ.
In fact, this can be generalized to the covariance between two quadratic forms in the same ε (once again, Λ₁ and Λ₂ must both be symmetric):

    cov[ε'Λ₁ε, ε'Λ₂ε] = 2 tr(Λ₁ΣΛ₂Σ) + 4μ'Λ₁ΣΛ₂μ.
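The variance formula for the normal case can likewise be checked numerically. This sketch (not part of the original article; the parameters are arbitrary example values) compares the sample variance of ε'Λε, with Λ symmetric and ε multivariate normal, against 2 tr(ΛΣΛΣ) + 4μ'ΛΣΛμ:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
mu = np.array([0.5, 1.0, -1.0])          # example mean vector (arbitrary)
A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)          # positive definite covariance matrix
B = rng.standard_normal((n, n))
Lam = (B + B.T) / 2                      # Λ must be symmetric for this formula

# Theoretical variance: var[ε'Λε] = 2 tr(ΛΣΛΣ) + 4μ'ΛΣΛμ
LS = Lam @ Sigma
var_theory = 2 * np.trace(LS @ LS) + 4 * mu @ Lam @ Sigma @ Lam @ mu

# Monte Carlo estimate
eps = rng.multivariate_normal(mu, Sigma, size=500_000)
quad = np.einsum('ij,jk,ik->i', eps, Lam, eps)   # ε'Λε for each draw
print(quad.var(), var_theory)                     # the two should be close
```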
Computing the variance in the non-symmetric case
Some texts incorrectly state the above variance or covariance results without requiring Λ to be symmetric. The case of general Λ can be derived by noting that ε'Λε is a scalar, and a scalar equals its own transpose, so

    ε'Λ'ε = (ε'Λε)' = ε'Λε,

and therefore

    ε'Λε = ε'((Λ + Λ')/2)ε.

But this is a quadratic form in the symmetric matrix (Λ + Λ')/2, so the mean and variance expressions are the same, provided Λ is replaced by (Λ + Λ')/2 therein.
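The symmetrization identity can be confirmed directly. This sketch (not part of the original article; dimensions and values are arbitrary) shows that a quadratic form in a non-symmetric Λ equals the quadratic form in its symmetric part:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
Lam = rng.standard_normal((n, n))   # a non-symmetric matrix
Lam_sym = (Lam + Lam.T) / 2         # its symmetric part (Λ + Λ')/2
e = rng.standard_normal(n)          # an arbitrary vector ε

q1 = e @ Lam @ e                    # ε'Λε
q2 = e @ Lam_sym @ e                # ε'((Λ + Λ')/2)ε
print(np.isclose(q1, q2))           # → True: the two quadratic forms agree
```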
Examples of quadratic forms
In the setting where one has a set of observations y and an operator matrix H, the residual sum of squares can be written as a quadratic form in y:

    RSS = y'(I − H)'(I − H)y.
For procedures where the matrix H is symmetric and idempotent, so that (I − H)'(I − H) = I − H and RSS = y'(I − H)y, and the errors are Gaussian with covariance matrix σ²I, RSS/σ² has a noncentral chi-square distribution with k degrees of freedom and noncentrality parameter λ, where

    k = tr(I − H),    λ = μ'(I − H)μ/σ²

may be found by matching the first two central moments of a noncentral chi-square random variable to the expressions given in the first two sections. If Hy estimates μ with no bias, then the noncentrality λ is zero and RSS/σ² follows a central chi-square distribution with k degrees of freedom.
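As a concrete instance, the hat matrix H = X(X'X)⁻¹X' of an ordinary least-squares fit is symmetric and idempotent. The following sketch (not part of the original article; the design matrix is an arbitrary example) checks that the quadratic form y'(I − H)y equals the residual sum of squares computed directly, and that tr(I − H) gives the familiar n − p degrees of freedom:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 50, 3
X = rng.standard_normal((n, p))              # example design matrix
H = X @ np.linalg.solve(X.T @ X, X.T)        # hat matrix: symmetric, idempotent

y = rng.standard_normal(n)                   # example observations
M = np.eye(n) - H                            # I − H
rss_quadratic = y @ M @ y                    # RSS as a quadratic form in y
rss_direct = np.sum((y - H @ y) ** 2)        # RSS computed from the residuals

print(np.isclose(rss_quadratic, rss_direct))  # → True
print(round(np.trace(M)))                     # → 47, i.e. k = n − p
```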