Hannan–Quinn information criterion

Information criteria are often used as a guide in model selection (see, for example, Grasa 1989). The Kullback–Leibler divergence of a fitted model from the "true" model is estimated by means of the log-likelihood function. The notion of an information criterion is to provide a measure that balances this goodness of fit against the parsimony of the model specification; the various information criteria differ in how they strike this balance.

In statistics, the Hannan–Quinn information criterion (HQC) is an alternative to the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). It is given by

 \mathrm{HQC} = n \ln\left(\frac{\mathrm{RSS}}{n}\right) + 2k \ln\ln(n),

where k is the number of parameters, n is the number of observations, and RSS is the residual sum of squares from a fitted linear regression.
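
As a minimal sketch of how the criterion can be computed in practice (assuming an ordinary-least-squares regression; the data, the function name hqc, and all variable names here are illustrative):

 import numpy as np

 def hqc(rss, n, k):
     # Hannan–Quinn criterion from the residual sum of squares:
     # n * ln(RSS / n) + 2k * ln(ln(n))
     return n * np.log(rss / n) + 2 * k * np.log(np.log(n))

 # Illustrative data: a linear relationship with Gaussian noise.
 rng = np.random.default_rng(0)
 n = 100
 x = rng.normal(size=n)
 y = 1.5 * x + rng.normal(size=n)

 # Fit a linear regression (intercept plus slope) by least squares.
 X = np.column_stack([np.ones(n), x])
 beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
 rss = float(np.sum((y - X @ beta) ** 2))

 print(hqc(rss, n=n, k=X.shape[1]))  # lower values indicate a preferred model

Compared with the AIC's penalty term of 2k and the BIC's k ln(n), the HQC penalty 2k ln ln(n) grows with the sample size, but very slowly.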