Explained sum of squares
In statistics, the explained sum of squares (ESS) is the sum of the squared deviations of the predicted values from the mean of the response variable in a standard regression model (for example $y_i = a + bx_i + \varepsilon_i$), where $y_i$ is the response variable, $x_i$ is the explanatory variable, $a$ and $b$ are coefficients, $i$ indexes the observations from 1 to $n$, and $\varepsilon_i$ is the error term.
If $\hat{a}$ and $\hat{b}$ are the estimated coefficients, then

$$\hat{y}_i = \hat{a} + \hat{b}x_i$$

is the predicted value of the response variable. The ESS is the sum of the squares of the differences between the predicted values and the grand mean of the response variable:

$$\mathrm{ESS} = \sum_{i=1}^{n} \left(\hat{y}_i - \bar{y}\right)^2 .$$
In general: total sum of squares = explained sum of squares + residual sum of squares, that is,

$$\sum_{i=1}^{n} \left(y_i - \bar{y}\right)^2 = \sum_{i=1}^{n} \left(\hat{y}_i - \bar{y}\right)^2 + \sum_{i=1}^{n} \left(y_i - \hat{y}_i\right)^2 .$$

This partition holds, in particular, for ordinary least squares regression with an intercept term.
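A minimal numerical sketch of these quantities, assuming hypothetical data and using NumPy's least-squares polynomial fit to estimate the coefficients (the data values and variable names are illustrative only, not taken from the article):

```python
import numpy as np

# Hypothetical data: n = 5 paired observations (x_i, y_i).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Least-squares estimates of the slope b and intercept a (degree-1 fit).
b_hat, a_hat = np.polyfit(x, y, 1)

# Predicted values: y_hat_i = a_hat + b_hat * x_i.
y_hat = a_hat + b_hat * x

y_bar = y.mean()                      # grand mean of the response variable
ess = np.sum((y_hat - y_bar) ** 2)    # explained sum of squares
rss = np.sum((y - y_hat) ** 2)        # residual sum of squares
tss = np.sum((y - y_bar) ** 2)        # total sum of squares

# For a least-squares fit with an intercept, TSS = ESS + RSS.
print(f"ESS = {ess:.4f}, RSS = {rss:.4f}, TSS = {tss:.4f}")
print("TSS == ESS + RSS:", np.isclose(tss, ess + rss))
```

Running the sketch prints the three sums of squares and confirms that the total sum of squares equals the explained plus residual sums of squares for this fit.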