Sylvester's matrix theorem

From Wikipedia, the free encyclopedia

In matrix theory, Sylvester's matrix theorem allows one to evaluate a function of a matrix directly from the matrix's eigenvalues and eigenvectors.

Suppose A is a square matrix of size n\times n with n distinct eigenvalues \lambda_i, i=1,\dots,n. If the corresponding row and column eigenvectors r_i and c_i are normalized so that r_ic_i=1, then the theorem states that

f(A)=\sum_{i=1}^n f(\lambda_i)c_ir_i
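As a numerical sketch of the formula (assuming A is diagonalizable; the helper name `sylvester_apply` and the use of NumPy are illustrative, not part of the theorem):

```python
import numpy as np

def sylvester_apply(f, A):
    """Evaluate f(A) via Sylvester's theorem: f(A) = sum_i f(lam_i) c_i r_i.

    The c_i are the column eigenvectors (columns of V) and the r_i the
    matching row eigenvectors (rows of V^{-1}), so r_i c_i = 1 holds
    automatically.
    """
    lam, V = np.linalg.eig(A)   # eigenvalues and column eigenvectors
    R = np.linalg.inv(V)        # rows are the normalized row eigenvectors
    return sum(f(l) * np.outer(V[:, i], R[i, :]) for i, l in enumerate(lam))

A = np.array([[1.0, 3.0], [4.0, 2.0]])
print(sylvester_apply(lambda x: x**2, A))  # agrees with A @ A
```

Note that f is applied only to the scalar eigenvalues; the matrix work is a single eigendecomposition shared by every choice of f.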

Example

Consider a two-by-two matrix:

A=\left(\begin{array}{cc}1&3\\4&2\end{array}\right)

and the function f(x)=x^2 that squares its argument, so that f(A) = AA = A^2.

Matrix A has eigenvalues 5 and −2. The row eigenvectors are r_1=(1/7, 1/7) and r_2=(4, −3); the corresponding column eigenvectors are c_1=(3, 4)^T and c_2=(1/7, −1/7)^T; the factors of 1/7 normalize each pair so that r_ic_i=1.

Thus c_1r_1=\left(\begin{array}{cc}3/7&3/7\\4/7&4/7\end{array}\right) and c_2r_2=\left(\begin{array}{cc}4/7&-3/7\\-4/7&3/7\end{array}\right).

Sylvester's matrix theorem states that

f(A)= f(\lambda_1)c_1r_1 +f(\lambda_2)c_2r_2= 25\left(\begin{array}{cc}3/7&3/7\\4/7&4/7\end{array}\right)+4\left(\begin{array}{cc}4/7&-3/7\\-4/7&3/7\end{array}\right) =\left(\begin{array}{cc}13&9\\12&16\end{array}\right)= A^2

as required.
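The whole example can be checked numerically; this sketch simply re-enters the eigenvectors given above (NumPy is used for illustration):

```python
import numpy as np

A = np.array([[1.0, 3.0], [4.0, 2.0]])

# Row and column eigenvectors from the example, normalized so r_i c_i = 1.
r1, c1 = np.array([1/7, 1/7]), np.array([3.0, 4.0])
r2, c2 = np.array([4.0, -3.0]), np.array([1/7, -1/7])

assert np.isclose(r1 @ c1, 1.0) and np.isclose(r2 @ c2, 1.0)          # normalization
assert np.allclose(A @ c1, 5 * c1) and np.allclose(r2 @ A, -2 * r2)   # eigenpairs

# f(lambda_1) c_1 r_1 + f(lambda_2) c_2 r_2 with f(x) = x^2:
fA = 25 * np.outer(c1, r1) + 4 * np.outer(c2, r2)
print(fA)  # [[13, 9], [12, 16]], i.e. A @ A
```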

Sylvester's theorem is useful for evaluating computationally demanding matrix functions such as the matrix exponential e^A, because the function is applied only to the n scalar eigenvalues rather than to the matrix itself.
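For instance, e^A for the example matrix can be obtained by exponentiating the two eigenvalues (a minimal NumPy sketch, cross-checked here against a truncated Taylor series of the exponential):

```python
import numpy as np

# e^A via Sylvester's theorem: apply exp to the eigenvalues only.
A = np.array([[1.0, 3.0], [4.0, 2.0]])
lam, V = np.linalg.eig(A)
R = np.linalg.inv(V)     # rows are the normalized row eigenvectors
expA = sum(np.exp(l) * np.outer(V[:, i], R[i, :]) for i, l in enumerate(lam))

# Cross-check against the truncated power series sum_k A^k / k!
series, term = np.eye(2), np.eye(2)
for k in range(1, 30):
    term = term @ A / k
    series += term
print(np.allclose(expA, series))  # True
```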
