Covariance

For the physics topic, see covariant transformation; for the use of the term in relativity, see covariance in special relativity; for the computer science topic, see parameter covariance.

In probability theory and statistics, covariance is a measure of how much two random variables vary together (as distinct from variance, which measures how much a single variable varies). If two variables tend to vary together (that is, when one of them is above its expected value, the other tends to be above its expected value as well), then the covariance between the two variables is positive.

Conversely, if one variable tends to be above its expected value when the other is below its expected value, then the covariance between the two variables is negative.
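As a numerical illustration (a minimal sketch using NumPy's sample covariance estimator, which is not defined in this article; the data are hypothetical):

    # When y tends to be above its mean exactly when x is, covariance is positive.
    import numpy as np
    x = np.array([1.0, 2.0, 3.0, 4.0])
    y = np.array([2.0, 4.0, 5.0, 9.0])    # tends to increase with x
    print(np.cov(x, y)[0, 1])             # 11/3, positive
    print(np.cov(x, -y)[0, 1])            # -11/3, negative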

The covariance between two real-valued random variables X and Y, with expected values E(X) = μ and E(Y) = ν, is defined as

\operatorname{cov}(X, Y) = \operatorname{E}((X - \mu) (Y - \nu)), \,

where E is the expected value operator. This can also be written:

\operatorname{cov}(X, Y) = \operatorname{E}(X \cdot Y) - \mu \nu. \,
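Expanding the product and using the linearity of the expected value operator makes the equivalence of the two forms explicit:

\operatorname{E}((X - \mu)(Y - \nu)) = \operatorname{E}(X \cdot Y) - \nu \operatorname{E}(X) - \mu \operatorname{E}(Y) + \mu\nu = \operatorname{E}(X \cdot Y) - \mu\nu. \,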

If X and Y are independent, then their covariance is zero. This follows because under independence,

\operatorname{E}(X \cdot Y) = \operatorname{E}(X) \cdot \operatorname{E}(Y) = \mu\nu.

Recalling the second form of the covariance given above, and substituting, we get

\operatorname{cov}(X, Y) = \mu \nu - \mu \nu = 0.

The converse, however, is not true: if X and Y have covariance zero, they need not be independent.
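A standard counterexample: let X take the values −1, 0, and 1, each with probability 1/3, and let Y = X². Then E(X · Y) = E(X³) = 0 and E(X) = 0, so cov(X, Y) = 0, yet Y is completely determined by X. A quick numerical check (a minimal sketch using NumPy; the distribution is the hypothetical one just described):

    # Zero covariance without independence: X uniform on {-1, 0, 1}, Y = X^2.
    import numpy as np
    x = np.array([-1.0, 0.0, 1.0])   # values of X, each with probability 1/3
    y = x**2                         # Y = X^2 is a deterministic function of X
    p = np.full(3, 1/3)              # uniform probabilities
    cov = np.sum(p * x * y) - np.sum(p * x) * np.sum(p * y)
    print(cov)                       # 0.0, even though Y depends on X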

The units of measurement of the covariance cov(X, Y) are those of X times those of Y. By contrast, correlation, which depends on the covariance, is a dimensionless measure of linear dependence.
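Explicitly, the correlation is the covariance divided by the product of the standard deviations,

\operatorname{corr}(X, Y) = \frac{\operatorname{cov}(X, Y)}{\sigma_X \sigma_Y},

so the units of X and Y cancel and the result is dimensionless.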

Random variables whose covariance is zero are called uncorrelated.

Properties

If X and Y are real-valued random variables and a and b are constants ("constant" in this context means non-random), then the following facts are a consequence of the definition of covariance:

\operatorname{cov}(X, X) = \operatorname{var}(X)\,
\operatorname{cov}(X, Y) = \operatorname{cov}(Y, X)\,
\operatorname{cov}(aX, bY) = ab\, \operatorname{cov}(X, Y)\,
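As an illustrative numerical check of the scaling property (a minimal sketch with hypothetical data; NumPy's sample covariance shares the bilinearity of the definition above):

    import numpy as np
    x = np.array([1.0, 2.0, 3.0, 4.0])
    y = np.array([2.0, 4.0, 5.0, 9.0])
    a, b = 2.0, -3.0
    lhs = np.cov(a * x, b * y)[0, 1]   # cov(aX, bY)
    rhs = a * b * np.cov(x, y)[0, 1]   # ab cov(X, Y)
    print(np.isclose(lhs, rhs))        # True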

For sequences X1, ..., Xn and Y1, ..., Ym of random variables, we have

\operatorname{cov}\left(\sum_{i=1}^n {X_i}, \sum_{j=1}^m{Y_j}\right) =    \sum_{i=1}^n{\sum_{j=1}^m{\operatorname{cov}\left(X_i, Y_j\right)}}.\,

For a sequence X1, ..., Xn of random variables, we have

\operatorname{var}\left(\sum_{i=1}^n X_i \right) = \sum_{i=1}^n \operatorname{var}(X_i) + 2\sum_{i,j\,:\,i<j} \operatorname{cov}(X_i,X_j).
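For example, with n = 2 this reduces to

\operatorname{var}(X_1 + X_2) = \operatorname{var}(X_1) + \operatorname{var}(X_2) + 2\operatorname{cov}(X_1, X_2),

so for uncorrelated random variables the variance of a sum is the sum of the variances.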

Relationship to inner products

Many of the properties of covariance can be extracted elegantly by observing that it satisfies the abstract properties of an inner product:

(1) bilinear: for constants a and b and random variables X, Y, and U, Cov(aX + bY, U) = a Cov(X, U) + b Cov(Y, U)
(2) symmetric: Cov(X, Y) = Cov(Y, X)
(3) positive definite: Var(X) = Cov(X, X) ≥ 0, and Cov(X, X) = 0 implies that X is a constant random variable (equal to some constant K)

It follows that covariance is an inner product over a vector space of "random variables", with aX and X + Y defined in the usual pointwise way. "Random variables" is in quotes because X and X + K have the same covariance with every other random variable for any constant K, even though they are different random variables; the elements of this space are really equivalence classes of random variables that differ by a constant. As long as these three basic properties of covariance hold, the analogues of theorems about inner products that depend only on those properties remain valid.
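For instance, the Cauchy–Schwarz inequality carries over directly:

|\operatorname{cov}(X, Y)| \le \sqrt{\operatorname{var}(X)\operatorname{var}(Y)},

which is precisely what guarantees that the correlation always lies between −1 and 1.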

Covariance matrices

For column-vector-valued random variables X and Y, with m and n scalar components respectively and with respective expected values μ and ν, the covariance is defined to be the m×n matrix

\operatorname{cov}(X, Y) = \operatorname{E}((X-\mu)(Y-\nu)^\top).\,

For vector-valued random variables, cov(X, Y) and cov(Y, X) are each other's transposes.
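A minimal numerical sketch (hypothetical data; the sample cross-covariance below estimates the matrix defined above from N joint draws of X and Y):

    import numpy as np
    rng = np.random.default_rng(0)
    N, m, n = 1000, 2, 3
    X = rng.normal(size=(N, m))            # N samples of an m-component vector
    Y = rng.normal(size=(N, n))            # N samples of an n-component vector
    Xc = X - X.mean(axis=0)                # center each component
    Yc = Y - Y.mean(axis=0)
    cov_XY = Xc.T @ Yc / (N - 1)           # m x n cross-covariance estimate
    cov_YX = Yc.T @ Xc / (N - 1)           # n x m
    print(cov_XY.shape)                    # (2, 3)
    print(np.allclose(cov_XY, cov_YX.T))   # True: transposes of each other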

The covariance is sometimes called a measure of "linear dependence" between the two random variables. That phrase does not mean the same thing as in the formal linear-algebraic setting (see linear dependence), although the two notions are related. The correlation is a closely related concept used to measure the degree of linear dependence between two variables.
