Explained sum of squares

In statistics, the explained sum of squares (ESS) is the sum of the squares of the deviations of the predicted values from the mean of the response variable, in a standard regression model (for example y_{i} = a + bx_{i} + \varepsilon_{i}), where y_{i} is the response variable, x_{i} is the explanatory variable, a and b are coefficients, i indexes the observations from 1 to n, and \varepsilon_{i} is the error term.

If \hat{a} and \hat{b} are the estimated coefficients, then

\hat{y}_{i}=\hat{a}+\hat{b}x_{i}

is the i-th predicted (fitted) value of the response variable. The ESS is the sum of the squares of the differences between the predicted values and the mean \bar{y} of the observed responses:

\text{ESS}=\sum_{i=1}^{n}\left(\hat{y}_{i}-\bar{y}\right)^2
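
For concreteness, the following minimal Python sketch fits the model above by ordinary least squares and computes the ESS; the data and variable names are invented purely for illustration:

# Invented toy data for illustration.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

# Sample means of x and y.
x_bar = sum(x) / n
y_bar = sum(y) / n

# Standard ordinary-least-squares estimators of b and a.
b_hat = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
        / sum((xi - x_bar) ** 2 for xi in x)
a_hat = y_bar - b_hat * x_bar

# Predicted (fitted) values \hat{y}_i = \hat{a} + \hat{b} x_i.
y_hat = [a_hat + b_hat * xi for xi in x]

# ESS: squared deviations of the fitted values from the mean of y.
ess = sum((yh - y_bar) ** 2 for yh in y_hat)
print(f"ESS = {ess:.4f}")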

For a regression fitted by ordinary least squares with an intercept term, the sums of squares satisfy: total sum of squares (TSS) = explained sum of squares (ESS) + residual sum of squares (RSS).
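
To see why this identity holds, note that each deviation from the mean splits into an explained part and a residual part:

y_{i}-\bar{y}=\left(\hat{y}_{i}-\bar{y}\right)+\left(y_{i}-\hat{y}_{i}\right)

Squaring and summing over the observations gives

\sum_{i=1}^{n}\left(y_{i}-\bar{y}\right)^2=\sum_{i=1}^{n}\left(\hat{y}_{i}-\bar{y}\right)^2+\sum_{i=1}^{n}\left(y_{i}-\hat{y}_{i}\right)^2+2\sum_{i=1}^{n}\left(\hat{y}_{i}-\bar{y}\right)\left(y_{i}-\hat{y}_{i}\right)

and the normal equations of least squares make the cross term vanish whenever the model includes an intercept, leaving exactly the decomposition above.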