Residual sum of squares

In statistics, the residual sum of squares (RSS) is the sum of the squares of residuals, the deviations of the observed values from the values predicted by an estimation model. It is a measure of the discrepancy between the data and the model: the smaller the RSS, the more closely the model fits the data.

RSS = \sum_{i=1}^n (y_i - f(x_i))^2,

where y_i is the i-th observed value and f(x_i) is the value that the model predicts for the corresponding x_i.
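
As a concrete illustration of the formula above, the following sketch computes the RSS for a small set of made-up observations and model predictions (the arrays y_obs and y_pred are illustrative assumptions, and NumPy is assumed to be available):

    import numpy as np

    # Illustrative data: observed values y_i and model predictions f(x_i)
    y_obs = np.array([2.1, 3.9, 6.2, 7.8])
    y_pred = np.array([2.0, 4.0, 6.0, 8.0])

    # RSS = sum over i of (y_i - f(x_i))^2
    rss = np.sum((y_obs - y_pred) ** 2)
    print(rss)  # approximately 0.10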

A standard simple linear regression model has the form y_i = a + bx_i + \varepsilon_i, where a and b are coefficients, y and x are the regressand and the regressor, respectively, and \varepsilon is the error term. In this model, the residual sum of squares is the sum of the squares of the estimates of \varepsilon_i, that is,

RSS = \sum_{i=1}^n (y_i - (\hat{a} + \hat{b}x_i))^2,

where \hat{a} and \hat{b} are the least-squares estimates of a and b.
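
A minimal sketch of this case, assuming NumPy: the slope and intercept are estimated by ordinary least squares with np.polyfit, and the RSS is computed from the resulting residuals (the data arrays x and y are made up for illustration only):

    import numpy as np

    # Illustrative data
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.2, 2.8, 4.5, 3.7, 5.5])

    # Least-squares estimates of the slope b and intercept a
    b_hat, a_hat = np.polyfit(x, y, deg=1)

    # Residuals are the estimates of epsilon_i
    residuals = y - (a_hat + b_hat * x)
    rss = np.sum(residuals ** 2)
    print(a_hat, b_hat, rss)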

For ordinary least squares regression with an intercept term, the total sum of squares decomposes as: total sum of squares = explained sum of squares + residual sum of squares.
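
This decomposition can be checked numerically. The sketch below, again assuming NumPy and reusing made-up data, fits a line by ordinary least squares and verifies that the total sum of squares equals the explained plus the residual sum of squares (up to floating-point error):

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.2, 2.8, 4.5, 3.7, 5.5])

    # Ordinary least-squares fit with an intercept
    b_hat, a_hat = np.polyfit(x, y, deg=1)
    y_hat = a_hat + b_hat * x

    tss = np.sum((y - y.mean()) ** 2)      # total sum of squares
    ess = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares
    rss = np.sum((y - y_hat) ** 2)         # residual sum of squares

    print(np.isclose(tss, ess + rss))  # True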
