BHHH algorithm

BHHH is an optimization algorithm in econometrics similar to the Gauss–Newton algorithm. It is named after its four originators: Ernst R. Berndt, Bronwyn Hall, Robert Hall, and Jerry Hausman.

Usage

If a nonlinear model is fitted to the data, one often needs to estimate its coefficients through optimization. A number of optimization algorithms have the following general structure. Suppose that the function to be optimized is Q(β). Then the algorithms are iterative, defining a sequence of approximations \beta_{k} given by

\beta_{k+1}=\beta_{k}-\lambda_{k}A_{k}\frac{\partial Q}{\partial \beta}(\beta_{k}) ,

where \beta_{k} is the coefficient at step k, and \lambda_{k} is a parameter which partly determines the particular algorithm. For the BHHH algorithm \lambda_{k} is determined by calculations within a given iterative step, involving a line search until a point \beta_{k+1} is found satisfying certain criteria. In addition, for the BHHH algorithm, Q has the form

Q = \sum_{i=1}^{N} Q_i

and A_{k} is calculated using

A_{k}=\left[\frac{1}{N}\sum_{i=1}^{N}\frac{\partial \ln Q_i}{\partial \beta}(\beta_{k})\,\frac{\partial \ln Q_i}{\partial \beta}(\beta_{k})'\right]^{-1} .
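As a concrete illustration, the following Python sketch applies the BHHH update under the assumption that the objective being maximized is the sample log-likelihood \sum_{i} \ln Q_i(\beta), with the Q_i interpreted as per-observation likelihood contributions. The names bhhh_maximize, loglik and scores, as well as the Poisson-regression test case, are hypothetical and serve only to illustrate the iteration; because the log-likelihood is maximized here, the step is taken in the ascent direction +\lambda_k A_k \partial Q/\partial\beta, which is the update above applied to −Q.

import numpy as np


def bhhh_maximize(loglik, scores, beta0, max_iter=200, tol=1e-8):
    """Sketch of a BHHH iteration for maximizing Q(beta) = sum_i ln Q_i(beta).

    loglik(beta) -> scalar value of the sample log-likelihood
    scores(beta) -> (N, p) array whose i-th row is d ln Q_i / d beta at beta
    """
    beta = np.asarray(beta0, dtype=float)
    for _ in range(max_iter):
        G = scores(beta)                      # per-observation score vectors
        g = G.sum(axis=0)                     # total gradient dQ/dbeta at beta_k
        A = np.linalg.inv(G.T @ G / len(G))   # A_k = [1/N sum_i g_i g_i']^{-1}
        step = A @ g                          # A_k dQ/dbeta, an ascent direction
        # step-halving line search for lambda_k: accept the first lambda that improves Q
        lam, q0 = 1.0, loglik(beta)
        while loglik(beta + lam * step) < q0 and lam > 1e-10:
            lam *= 0.5
        beta_new = beta + lam * step
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta


# Illustrative test case: Poisson regression, ln Q_i(beta) = y_i x_i'beta - exp(x_i'beta) - ln(y_i!)
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
y = rng.poisson(np.exp(X @ np.array([0.5, -0.3])))

loglik = lambda b: float(np.sum(y * (X @ b) - np.exp(X @ b)))   # additive ln(y_i!) term dropped
scores = lambda b: (y - np.exp(X @ b))[:, None] * X             # rows are (y_i - exp(x_i'b)) x_i

print(bhhh_maximize(loglik, scores, beta0=np.zeros(2)))

A practical feature of this construction is that A_k is built entirely from first derivatives of the individual \ln Q_i, so no second derivatives of Q need to be computed or coded.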

In other cases, e.g. Newton–Raphson, A_{k} can have other forms. The BHHH algorithm has the advantage that, if certain conditions apply, convergence of the iterative procedure is guaranteed.
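For comparison, the Newton–Raphson choice takes A_{k} to be the inverse of the Hessian matrix of Q,

A_{k}=\left[\frac{\partial^{2} Q}{\partial \beta \, \partial \beta'}(\beta_{k})\right]^{-1} ,

which requires second derivatives of Q, whereas the BHHH matrix above uses only first derivatives of the Q_i.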

Literature