Omitted-variable bias

In statistics, omitted-variable bias (OVB) occurs when a statistical model leaves out one or more relevant variables. The "bias" is created when the model compensates for the missing variable by over- or underestimating the effect of one of the other included variables.

More specifically, OVB is the bias that appears in the estimates of parameters in a regression analysis, when the assumed specification is incorrect in that it omits an independent variable that is correlated with both the dependent variable and one or more included independent variables.

Omitted-variable bias in linear regression

Intuition

Two conditions must hold true for omitted-variable bias to exist in linear regression:

* the omitted variable must be a determinant of the dependent variable (i.e., its true regression coefficient must not be zero); and
* the omitted variable must be correlated with one or more of the included independent variables (i.e., its covariance with at least one included regressor must not be zero).

Suppose the true cause-and-effect relationship is given by

y = a + bx + cz + u

with parameters a, b, c, dependent variable y, independent variables x and z, and error term u. We wish to know the effect of x itself upon y (that is, we wish to obtain an estimate of b). But suppose that we omit z from the regression, and suppose the relation between x and z is given by

z = d + fx + e

with parameters d, f and error term e. Substituting the second equation into the first gives

y = (a + cd) + (b + cf)x + (u + ce).

If a regression of y is conducted upon x only, this last equation is what is estimated, and the regression coefficient on x is actually an estimate of (b + cf), giving not simply an estimate of the desired direct effect of x upon y (which is b), but rather of its sum with the indirect effect (the effect f of x on z times the effect c of z on y). Thus by omitting the variable z from the regression, we have estimated the total derivative of y with respect to x rather than its partial derivative with respect to x. These differ if both c and f are non-zero.
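For illustration, the following minimal simulation sketch (the parameter values and the use of NumPy are illustrative assumptions, not part of the result above) regresses y on x alone and recovers approximately b + cf rather than the direct effect b:

```python
# Illustrative sketch: omitting z from the regression of y on x yields
# an estimate of b + c*f rather than b. All numbers are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
a, b, c = 1.0, 2.0, 3.0          # true model: y = a + b*x + c*z + u
d, f = 0.5, 0.8                  # relation:   z = d + f*x + e

x = rng.normal(size=n)
z = d + f * x + rng.normal(size=n)            # error term e
y = a + b * x + c * z + rng.normal(size=n)    # error term u

# "Short" regression of y on x only (with an intercept), omitting z.
X_short = np.column_stack([np.ones(n), x])
intercept, slope = np.linalg.lstsq(X_short, y, rcond=None)[0]

print("estimated slope on x:", slope)   # close to b + c*f = 2 + 3*0.8 = 4.4
print("direct effect b     :", b)
```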

Detailed analysis

As an example, consider a linear model of the form

y_i = x_i \beta + z_i \delta + u_i,\qquad i = 1,\dots,n

where

* y_i is the dependent variable observed for the i-th observation;
* x_i is a 1×p row vector of the values of the p included independent variables for the i-th observation;
* \beta is a p×1 column vector of parameters to be estimated;
* z_i is the value of the omitted variable for the i-th observation (here taken to be a scalar) and \delta is its coefficient; and
* u_i is the unobservable error term for the i-th observation.

We collect the observations of all variables subscripted i = 1, ..., n, and stack them one below another, to obtain the matrix X and the vectors Y, Z, and U:

 X = \left[ \begin{array}{c} x_1 \\  \vdots \\ x_n \end{array} \right] \in \mathbb{R}^{n\times p},

and

 Y = \left[ \begin{array}{c} y_1 \\  \vdots \\ y_n \end{array} \right],\quad  Z = \left[ \begin{array}{c} z_1 \\  \vdots \\ z_n \end{array} \right],\quad  U = \left[ \begin{array}{c} u_1 \\  \vdots \\ u_n \end{array} \right] \in \mathbb{R}^{n\times 1}.
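As a concrete (hypothetical) illustration of this stacking, each row vector x_i becomes one row of X, and each scalar y_i, z_i, u_i becomes one entry of the corresponding n × 1 column vector:

```python
# Hypothetical stacking of n = 3 observations with p = 2 regressors.
import numpy as np

x_rows = [np.array([1.0, 2.0]),
          np.array([3.0, 4.0]),
          np.array([5.0, 6.0])]          # each x_i is a 1 x p row vector
X = np.vstack(x_rows)                    # X has shape (n, p) = (3, 2)
Y = np.array([[0.5], [1.5], [2.5]])      # y_i stacked into an n x 1 column vector
print(X.shape, Y.shape)                  # (3, 2) (3, 1)
```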

If the independent variable z is omitted from the regression, then the estimated values of the response parameters of the other independent variables will be given by the usual least squares calculation,

\hat{\beta} = (X'X)^{-1}X'Y\,

(where the "prime" notation denotes the matrix transpose and the -1 superscript denotes matrix inversion).
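A direct transcription of this formula (on simulated data of assumed dimensions, purely for illustration; no variable is omitted in this toy example, so the estimate is close to the true coefficients) is:

```python
# Illustrative computation of beta_hat = (X'X)^{-1} X'Y via the normal equations.
import numpy as np

rng = np.random.default_rng(1)
n, p = 500, 3
X = rng.normal(size=(n, p))                  # included regressors
beta_true = np.array([1.0, -2.0, 0.5])       # arbitrary illustrative coefficients
Y = X @ beta_true + rng.normal(size=n)       # no omitted variable in this toy example

# Solve (X'X) beta_hat = X'Y rather than forming the inverse explicitly.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
print(beta_hat)                              # close to beta_true
```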

Substituting for Y based on the assumed linear model,


\begin{align}
\hat{\beta} & = (X'X)^{-1}X'(X\beta+Z\delta+U) \\
& =(X'X)^{-1}X'X\beta + (X'X)^{-1}X'Z\delta + (X'X)^{-1}X'U \\
& =\beta + (X'X)^{-1}X'Z\delta + (X'X)^{-1}X'U.
\end{align}

On taking expectations conditional on X, the contribution of the final term is zero; this follows from the assumption that the error term U has mean zero given the regressors X, so that E[(X'X)^{-1}X'U | X] = (X'X)^{-1}X'E[U | X] = 0. On simplifying the remaining terms:


\begin{align}
E[ \hat{\beta} | X ] & = \beta + (X'X)^{-1}X'Z\delta \\
& = \beta + \text{bias}.
\end{align}

The second term after the equal sign is the omitted-variable bias in this case, which is non-zero if the omitted variable z is correlated with any of the included variables in the matrix X (that is, if X'Z does not equal a vector of zeroes). Note that the bias is equal to the weighted portion of z_i which is "explained" by x_i.
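This expression can be checked numerically. In the sketch below (dimensions, coefficients, and the correlation between Z and X are illustrative assumptions), X and Z are held fixed while the error term U is redrawn many times, and the average estimation error is compared with (X'X)^{-1}X'Zδ:

```python
# Illustrative check: with X and Z fixed, the average of beta_hat - beta over
# repeated draws of U approximates the bias term (X'X)^{-1} X'Z delta.
import numpy as np

rng = np.random.default_rng(2)
n, p = 1_000, 2
beta = np.array([1.0, 2.0])
delta = 1.5

X = rng.normal(size=(n, p))
Z = 0.7 * X[:, [0]] + rng.normal(size=(n, 1))     # Z correlated with the first regressor

predicted_bias = np.linalg.solve(X.T @ X, X.T @ Z).ravel() * delta

estimates = []
for _ in range(2_000):                            # redraw only the error term U
    U = rng.normal(size=n)
    Y = X @ beta + Z.ravel() * delta + U
    estimates.append(np.linalg.solve(X.T @ X, X.T @ Y))

print("average bias over draws of U:", np.mean(estimates, axis=0) - beta)
print("(X'X)^{-1} X'Z delta        :", predicted_bias)
```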

Effects on ordinary least squares

The Gauss–Markov theorem states that regression models which fulfill the classical linear regression model assumptions provide the best linear unbiased estimators. With respect to ordinary least squares, the relevant assumption of the classical linear regression model is that the error term is uncorrelated with the regressors.

The presence of omitted-variable bias violates this particular assumption. The violation causes the OLS estimator to be biased and inconsistent. The direction of the bias depends on the sign of the omitted variable's coefficient as well as on the sign of the covariance between the included regressors and the omitted variable. A positive covariance of the omitted variable with both a regressor and the dependent variable will lead the OLS estimate of the included regressor's coefficient to be greater than the true value of that coefficient. This effect can be seen by taking the expectation of the parameter estimate, as shown in the previous section.
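The sign pattern can be illustrated with a short simulation (all values are arbitrary illustrative choices): with a positive coefficient on the omitted variable, a positive covariance between the included regressor and the omitted variable biases the estimate upward, while a negative covariance biases it downward.

```python
# Illustrative sketch of the direction of the bias in a one-regressor model.
import numpy as np

rng = np.random.default_rng(3)
n, b_true, delta = 200_000, 1.0, 2.0     # delta > 0: omitted variable raises y

for cov_sign in (+0.5, -0.5):
    x = rng.normal(size=n)
    z = cov_sign * x + rng.normal(size=n)          # cov(x, z) takes the chosen sign
    y = b_true * x + delta * z + rng.normal(size=n)
    slope = np.polyfit(x, y, 1)[0]                 # OLS slope of y on x alone
    print(f"cov(x, z) sign {cov_sign:+.1f}: estimated slope {slope:.2f} (true b = {b_true})")
```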

