QR algorithm

In numerical linear algebra, the QR algorithm is an eigenvalue algorithm: a procedure to calculate the eigenvalues and eigenvectors of a matrix. The basic idea is to perform a QR decomposition, writing the matrix as a product of an orthogonal matrix and an upper triangular matrix, multiply the factors in the reverse order, and iterate.

Formally, let A be the matrix whose eigenvalues we want to compute, and let A_0 := A. At the k-th step (starting with k = 0), we write A_k as the product of an orthogonal matrix Q_k and an upper triangular matrix R_k, and we form A_{k+1} = R_k Q_k. Note that

 A_{k+1} = R_k Q_k = Q_k^T Q_k R_k Q_k = Q_k^T A_k Q_k = Q_k^{-1} A_k Q_k,

so all the A_k are similar and hence they have the same eigenvalues. The algorithm is numerically stable because it proceeds by orthogonal similarity transforms.

Under certain conditions,[1] the matrices A_k converge to a triangular matrix. The eigenvalues of a triangular matrix are listed on the diagonal, and the eigenvalue problem is solved. In testing for convergence it is impractical to require exact zeros, but the Gershgorin circle theorem provides a bound on the error.
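
A minimal NumPy sketch of this basic iteration (the function name, tolerance, and test matrix are illustrative, and convergence is only guaranteed under the conditions cited above):

    import numpy as np

    def qr_algorithm(A, tol=1e-10, max_iter=1000):
        # Unshifted QR iteration: factor A_k = Q_k R_k, then form A_{k+1} = R_k Q_k.
        Ak = np.array(A, dtype=float)
        for _ in range(max_iter):
            Q, R = np.linalg.qr(Ak)
            Ak = R @ Q
            # Stop once the strictly lower triangle is negligible; by the
            # Gershgorin circle theorem, small off-diagonal entries bound
            # the error in taking the diagonal as eigenvalue estimates.
            if np.linalg.norm(np.tril(Ak, -1)) < tol:
                break
        return np.diag(Ak)

    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    print(qr_algorithm(A))          # compare with np.linalg.eigvals(A)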

In this crude form the iterations are relatively expensive. This can be mitigated by first bringing the matrix A to upper Hessenberg form (which costs (5/3)n^3 + O(n^2) operations using Householder reduction) with a finite sequence of orthogonal similarity transforms, much like a QR decomposition. Determining the QR decomposition of an upper Hessenberg matrix costs 6n^2 + O(n).
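
A sketch of this preprocessing step using SciPy's Householder-based reduction (the random test matrix is illustrative):

    import numpy as np
    from scipy.linalg import hessenberg

    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5))

    # H = Q^T A Q is upper Hessenberg (zero below the first subdiagonal) and
    # has the same eigenvalues as A, since the reduction is an orthogonal
    # similarity transform.
    H, Q = hessenberg(A, calc_q=True)
    print(np.allclose(Q @ H @ Q.T, A))       # True: A = Q H Q^T
    print(np.allclose(np.tril(H, -2), 0.0))  # True: Hessenberg structure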

If the original matrix is symmetric, then the upper Hessenberg matrix is also symmetric and thus tridiagonal, and so are all the A_k. This procedure costs (2/3)n^3 + O(n^2) operations using Householder reduction. Determining the QR decomposition of a tridiagonal matrix costs O(n).
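
A quick numerical check of this structural fact, reusing SciPy's Hessenberg reduction on a symmetric test matrix (the matrix is illustrative):

    import numpy as np
    from scipy.linalg import hessenberg

    rng = np.random.default_rng(1)
    S = rng.standard_normal((5, 5))
    A = S + S.T                              # symmetric test matrix

    # For symmetric A the Hessenberg form is itself symmetric, hence tridiagonal.
    T, _ = hessenberg(A, calc_q=True)
    off_tridiagonal = np.tril(T, -2) + np.triu(T, 2)
    print(np.allclose(off_tridiagonal, 0.0)) # True: T is tridiagonal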

The rate of convergence depends on the separation between eigenvalues, so a practical algorithm will use shifts, either explicit or implicit, to increase separation and accelerate convergence. A typical symmetric QR algorithm isolates each eigenvalue (then reduces the size of the matrix) with only one or two iterations, making it efficient as well as robust.
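
A toy sketch of an explicitly shifted QR iteration with deflation, using the bottom-right diagonal entry as the shift (production codes instead use Wilkinson shifts and implicit bulge-chasing steps on Hessenberg form; all names here are illustrative):

    import numpy as np

    def shifted_qr_eigvals(A, tol=1e-12, max_iter=10000):
        # Explicitly shifted QR with deflation. The shift mu is subtracted
        # before factoring and added back afterwards, which preserves
        # similarity: A_{k+1} = R Q + mu I = Q^T A_k Q.
        Ak = np.array(A, dtype=float)
        n = Ak.shape[0]
        eigs = []
        for _ in range(max_iter):
            if n == 1:
                break
            mu = Ak[n - 1, n - 1]                 # Rayleigh-quotient shift
            Q, R = np.linalg.qr(Ak - mu * np.eye(n))
            Ak = R @ Q + mu * np.eye(n)
            if abs(Ak[n - 1, n - 2]) < tol:       # last subdiagonal entry tiny:
                eigs.append(Ak[n - 1, n - 1])     # deflate one eigenvalue and
                Ak = Ak[:n - 1, :n - 1]           # continue on the submatrix
                n -= 1
        eigs.append(Ak[0, 0])
        return np.array(eigs)

    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    print(np.sort(shifted_qr_eigvals(A)))    # compare with np.linalg.eigvalsh(A)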

Motivation

The QR algorithm can be seen as a more sophisticated variation of the basic 'power' eigenvalue algorithm. Recall that the power algorithm repeatedly multiplies a single vector by A, normalizing after each iteration; the vector converges to an eigenvector of the largest-magnitude eigenvalue. The QR algorithm instead works with a complete basis of vectors, using QR decomposition to renormalize (and orthogonalize) them. For a symmetric matrix A, upon convergence, AQ = QΛ, where Λ is the diagonal matrix of eigenvalues to which A converged, and where Q is a composite of all the orthogonal similarity transforms required to get there. Thus the columns of Q are the eigenvectors.
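
A small numerical check of this claim, accumulating the orthogonal factors across unshifted QR steps (the matrix and iteration count are illustrative):

    import numpy as np

    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    Ak = A.copy()
    Q_total = np.eye(3)
    for _ in range(200):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q
        Q_total = Q_total @ Q       # composite of all the similarity transforms
    Lam = np.diag(np.diag(Ak))      # Lambda: the converged diagonal
    # Columns of Q_total approximate eigenvectors of A: A Q = Q Lambda.
    print(np.allclose(A @ Q_total, Q_total @ Lam, atol=1e-8))   # True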

References

  1. ^ Golub, G. H. and Van Loan, C. F.: Matrix Computations, 3rd ed., Johns Hopkins University Press, Baltimore, 1996, ISBN 0-8018-5414-8.