Perron–Frobenius theorem
In mathematics, the Perron–Frobenius theorem, named after Oskar Perron and Ferdinand Georg Frobenius, is a theorem in matrix theory about the eigenvalues and eigenvectors of a real positive n×n matrix:
Let A = (aij) be a real n×n matrix with positive entries aij > 0. Then the following statements hold:
- there is a positive real eigenvalue r of A such that any other eigenvalue λ satisfies | λ | < r.
- the eigenvalue r is simple: r is a simple root of the characteristic polynomial of A. In particular, both the right and the left eigenspace associated with r are one-dimensional.
- there is a left (respectively right) eigenvector associated with r having positive entries. This means that there exist a row-vector v and a column-vector w with positive entries such that vA = rv and Aw = rw. The vector v (resp. w) is then called a left (resp. right) eigenvector associated with r. In particular there exist uniquely determined positive left and right eigenvectors vnorm and wnorm associated with r whose entries sum to 1 (sometimes also called "stochastic" eigenvectors).
- one has the eigenvalue estimate min_i Σ_j aij ≤ r ≤ max_i Σ_j aij, i.e. the Perron root lies between the smallest and the largest row sum of A (these statements are illustrated numerically in the sketch below).
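The statements above can be checked numerically. The following is a minimal NumPy sketch, using an arbitrarily chosen positive matrix (not taken from the references): it computes the Perron root, verifies strict dominance over the other eigenvalues, normalizes the right eigenvector so its entries sum to 1, and checks the row-sum estimate.

```python
import numpy as np

# An arbitrary positive matrix (every entry > 0).
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 3.0, 1.0],
              [1.0, 1.0, 4.0]])

eigvals, eigvecs = np.linalg.eig(A)

# Perron root: the eigenvalue of maximal modulus; it is real and positive.
r = max(eigvals, key=abs).real
print("Perron root r =", r)
print("all other |lambda| < r:",
      all(abs(lam) < r - 1e-12 for lam in eigvals if not np.isclose(lam, r)))

# Right eigenvector for r, scaled so its entries sum to 1 ("stochastic" eigenvector).
w = eigvecs[:, np.argmax(np.abs(eigvals))].real
w_norm = w / w.sum()
print("positive right eigenvector w_norm =", w_norm)

# Row-sum estimate: min row sum <= r <= max row sum.
row_sums = A.sum(axis=1)
print("row-sum bounds hold:", row_sums.min() <= r <= row_sums.max())
```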
The first statement says that the spectral radius of the matrix A coincides with r. The theorem applies in particular to a positive stochastic matrix. A right (respectively left) stochastic matrix A is a non-negative real matrix whose row sums (respectively column sums) are all equal to 1. In this case the Perron–Frobenius theorem asserts that (provided all entries are strictly positive) the eigenvalue λ = 1 is simple and all other eigenvalues of A satisfy | λ | < 1. Also, in this case there exists a vector having positive entries, summing to 1, which is a right (resp. left) positive eigenvector associated with the eigenvalue λ = 1. Both properties can then be used in combination to show that the limit A∞ of the powers A^k as k → ∞ exists and is a positive stochastic matrix of rank one. If A is left (resp. right) stochastic then A∞ is again left (resp. right) stochastic. Its entries are determined by the stochastic left resp. right eigenvectors vnorm and wnorm introduced above: if A is right (resp. left) stochastic then the (i,j) entry of A∞ is equal to the jth entry of vnorm (resp. the ith entry of wnorm).
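A short sketch of the stochastic case follows; the 3×3 right stochastic matrix is an arbitrary illustrative choice, not taken from the references. It recovers the normalized left eigenvector vnorm for the eigenvalue 1 and checks that a high power of A already agrees with the rank-one limit whose rows all equal vnorm.

```python
import numpy as np

# An arbitrary right stochastic matrix: positive entries, each row sums to 1.
A = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Left eigenvectors of A are right eigenvectors of A^T; pick the one for eigenvalue 1
# and normalize its entries to sum to 1 (this is vnorm).
eigvals, eigvecs = np.linalg.eig(A.T)
v = eigvecs[:, np.argmin(np.abs(eigvals - 1))].real
v_norm = v / v.sum()

# The limit of A^k is rank one: every row equals vnorm.
A_inf = np.linalg.matrix_power(A, 50)
print("rows of A^50:\n", A_inf)
print("v_norm:", v_norm)
print("agree:", np.allclose(A_inf, np.ones((3, 1)) @ v_norm.reshape(1, 3)))
```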
This result has a natural interpretation in the theory of finite Markov chains (where it is the matrix-theoretic equivalent of the convergence of a finite Markov chain, formulated in terms of the transition matrix of the chain; see, for example, the article on the subshift of finite type). More generally, it is frequently applied in the theory of transfer operators, where it is commonly known as the Ruelle–Perron–Frobenius theorem (named after David Ruelle). In this case, the leading eigenvalue corresponds to the thermodynamic equilibrium of a dynamical system, and the lesser eigenvalues to the decay modes of a system that is not in equilibrium.
The Perron–Frobenius theorem can be further generalized to the class of block-indecomposable non-negative matrices (called "irreducible" in reference [1] below, also called regular in the stochastic case). In particular it also holds if some power B = A^k, k > 0, of the non-negative matrix A has positive entries.
This generalization of the Perron–Frobenius theorem has particular use in algebraic graph theory. The "underlying graph" of a non-negative real n×n matrix is the graph with vertices 1, …, n and an arc from i to j if and only if aij > 0. If the underlying graph of such a matrix is strongly connected, then the matrix is irreducible, and thus the generalized Perron–Frobenius theorem applies. In particular, the adjacency matrix of a connected graph is irreducible.
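As a sketch of the graph criterion (the matrices below are illustrative choices, not from the references), irreducibility of a non-negative n×n matrix can be tested with the standard criterion that (I + A)^(n−1) is entrywise positive, which is equivalent to strong connectivity of the underlying graph.

```python
import numpy as np

def is_irreducible(A):
    """Test irreducibility of a non-negative n x n matrix via the criterion
    that (I + A)^(n-1) is entrywise positive, i.e. the underlying directed
    graph is strongly connected."""
    n = A.shape[0]
    M = np.linalg.matrix_power(np.eye(n) + (A > 0), n - 1)
    return bool((M > 0).all())

# Adjacency matrix of a directed 4-cycle: strongly connected, hence irreducible.
cycle = np.array([[0, 1, 0, 0],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1],
                  [1, 0, 0, 0]], dtype=float)

# A reducible example: there is no arc from vertex 1 back to vertex 0.
reducible = np.array([[1.0, 1.0],
                      [0.0, 1.0]])

print(is_irreducible(cycle))      # True
print(is_irreducible(reducible))  # False
```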
Perron–Frobenius theorem for non-negative matrices
Let A = (aij) be a real n×n matrix with non-negative entries aij ≥ 0. Then the following statements hold:
- there is a non-negative real eigenvalue r of A such that any other eigenvalue λ satisfies | λ | ≤ r. This property may also be stated more concisely by saying that the spectral radius of A is an eigenvalue.
- there is a left (respectively right) eigenvector associated with r having non-negative entries.
- one has the eigenvalue estimate min_i Σ_j aij ≤ r ≤ max_i Σ_j aij; the examples below illustrate how these statements differ from the positive case.
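A minimal illustration of the weaker conclusions, using two small matrices chosen for contrast (not taken from the references): a reducible matrix whose Perron root is not a simple root and whose eigenvector is not strictly positive, and an irreducible permutation matrix for which another eigenvalue has the same modulus as r.

```python
import numpy as np

# Reducible non-negative matrix: eigenvalue 1 has algebraic multiplicity 2
# (not simple), and its only eigenvector (1, 0) is non-negative but not
# strictly positive.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
print(np.linalg.eigvals(A))   # [1. 1.] -- spectral radius 1, not a simple root

# Irreducible but not primitive: the 2-cycle permutation matrix.
# The spectral radius 1 is attained, but -1 has the same modulus,
# so the strict inequality |lambda| < r of the positive case fails.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(np.linalg.eigvals(P))   # eigenvalues 1 and -1 (order may vary)
```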
Compared with the theorem above for positive matrices, the left and right eigenvectors associated with the Perron root r are no longer guaranteed to be positive, only non-negative. Furthermore, the Perron root is no longer necessarily simple. If the matrix A is required to be irreducible (its associated directed graph is strongly connected) as well as non-negative, then the eigenvector has strictly positive entries. Note that a positive matrix is irreducible (its associated graph is complete), but the converse is not necessarily true. If A is primitive, i.e. A^k > 0 entrywise for some k ≥ 1, then all of the results above for positive matrices apply.
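A quick sketch of the primitivity condition; the symmetric 0–1 matrix below is just an illustrative choice. Its square is entrywise positive, so it is primitive, and the strict conclusions of the positive case (a simple, strictly dominant Perron root and a strictly positive eigenvector) hold.

```python
import numpy as np

# B is non-negative but not positive; B^2 is entrywise positive, so B is primitive.
B = np.array([[1.0, 1.0],
              [1.0, 0.0]])
print(np.linalg.matrix_power(B, 2))   # [[2. 1.] [1. 1.]] -- all entries > 0

eigvals, eigvecs = np.linalg.eig(B)
r = eigvals.max()                      # Perron root = (1 + sqrt(5)) / 2
print(r, abs(eigvals.min()) < r)       # strict dominance over the other eigenvalue

w = eigvecs[:, eigvals.argmax()]
print(w / w.sum())                     # strictly positive right eigenvector
```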
References
- R.A. Horn and C.R. Johnson, Matrix Analysis, Cambridge University Press, 1990 (chapter 8).
- A. Graham, Nonnegative Matrices and Applicable Topics in Linear Algebra, John Wiley & Sons, New York, 1987.
- Henryk Minc, Nonnegative Matrices, John Wiley & Sons, New York, 1988, ISBN 0-471-83966-3.
- C. Godsil and G. Royle, Algebraic Graph Theory, Springer, 2001 (chapter 8).
- A. Berman and R. J. Plemmons, Nonnegative Matrices in the Mathematical Sciences, Academic Press, 1979 (chapter 2), ISBN 0-12-092250-9