Matrix function

From Wikipedia, the free encyclopedia

A matrix function usually denotes a function which maps a matrix to a matrix.

Extending scalar functions to matrix functions

There are several techniques for lifting a real function to a square matrix function such that interesting properties are maintained. All of the following techniques yield the same matrix function, but the domains on which the function is defined may differ.

Power series

If the real function f has the Taylor expansion

f(x) = f(0) + f'(0)\cdot x + f''(0)\cdot \frac{x^2}{2} + \cdots

then a matrix function can be defined by substituting the matrix A for x: the powers become matrix powers, the additions become matrix sums, and the multiplications by the Taylor coefficients become scalar multiplications of matrices. If the real series converges for | x | < r, then the corresponding matrix series converges for a matrix argument A whenever \|A\| < r for some matrix norm \|\cdot\| satisfying the submultiplicative property \|AB\|\leq \|A\|\cdot\|B\|.
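As a concrete sketch of this construction (illustrative NumPy code, not an efficient or numerically robust algorithm), the truncated Taylor series can be evaluated by Horner's scheme, here with the coefficients of exp:

```python
import numpy as np
from math import factorial

def matrix_function_series(coeffs, A):
    """Evaluate sum_k coeffs[k] * A^k by Horner's scheme.

    coeffs[k] is the k-th Taylor coefficient f^(k)(0)/k! of f.
    """
    n = A.shape[0]
    result = np.zeros((n, n))
    for c in reversed(coeffs):
        result = result @ A + c * np.eye(n)
    return result

# Taylor coefficients of exp: 1/k!
coeffs = [1.0 / factorial(k) for k in range(30)]

A = np.array([[0.0, 1.0], [-1.0, 0.0]])  # generator of plane rotations
expA = matrix_function_series(coeffs, A)
# Since A^2 = -I, exp(A) = cos(1) I + sin(1) A, a rotation matrix
print(np.allclose(expA, [[np.cos(1), np.sin(1)], [-np.sin(1), np.cos(1)]]))
```

Here \|A\|_2 = 1, so 30 terms already push the truncation error far below machine precision.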

Jordan decomposition

If the matrix A is diagonalizable, then we can find a matrix P and a diagonal matrix D such that A = P\cdot D\cdot P^{-1}. Applying the power series definition to this decomposition, we find that f(A) is defined by

f(A) = P \begin{bmatrix}
f(d_1) & \dots & 0 \\
\vdots & \ddots & \vdots \\
0 & \dots & f(d_n)
\end{bmatrix} P^{-1},

where d_1, \dots, d_n denote the diagonal entries of D.
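For diagonalizable A this translates directly into code. The helper below is an illustrative sketch using NumPy's eigendecomposition; it assumes A is diagonalizable with a well-conditioned eigenvector matrix:

```python
import numpy as np

def funm_diag(f, A):
    """Apply a scalar function f to a diagonalizable matrix A
    via the eigendecomposition A = P @ diag(d) @ inv(P)."""
    d, P = np.linalg.eig(A)
    return P @ np.diag(f(d)) @ np.linalg.inv(P)

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # symmetric, hence diagonalizable
sqrtA = funm_diag(np.sqrt, A)           # eigenvalues 1 and 3, both positive
print(np.allclose(sqrtA @ sqrtA, A))    # sqrtA is indeed a square root of A
```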

All matrices, whether they are diagonalizable or not, have a Jordan decomposition A = P\cdot J\cdot P^{-1}, where the matrix J consists of Jordan blocks. Consider these blocks separately and apply the power series to a Jordan block:

 f \left( \begin{bmatrix}
\lambda & 1 & 0 & \ldots & 0 \\
0 & \lambda & 1 & \ldots & 0 \\
\vdots & \ddots & \ddots & \ddots & \vdots \\
0 & \ldots & 0 & 0 & \lambda
\end{bmatrix} \right) = \begin{bmatrix}
\frac{f(\lambda)}{0!} & \frac{f'(\lambda)}{1!} & \frac{f''(\lambda)}{2!} & \ldots & \frac{f^{(n)}(\lambda)}{n!} \\
0 & \frac{f(\lambda)}{0!} & \frac{f'(\lambda)}{1!} & \ldots & \frac{f^{(n-1)}(\lambda)}{(n-1)!} \\
\vdots & \ddots & \ddots & \ddots & \vdots \\
0 & \ldots & 0 & 0 & \frac{f(\lambda)}{0!}
\end{bmatrix}.

This definition can be used to extend the domain of the matrix function beyond the set of matrices with spectral radius smaller than the radius of convergence of the power series. Note that there is also a connection to divided differences.

Cauchy integral

Cauchy's integral formula from complex analysis can also be used to generalize scalar functions to matrix functions. Cauchy's integral formula states that for any function f analytic on an open set D \subset \mathbb{C}, one has

f(x) = \frac{1}{2\pi i} \oint_C {\frac{f(z)}{z-x}}\, \mathrm{d}z,

where C is a closed curve inside the domain D enclosing x. Now replace x by a matrix A and consider a path C inside D that encloses all eigenvalues of A. One possibility to achieve this is to let C be a circle around the origin with radius larger than \|A\| for an arbitrary matrix norm \|\cdot\|, since the spectral radius of A is at most \|A\|. Then f(A) is defined by

f(A) = \frac{1}{2\pi i}  \oint_C {f(z)(zI-A)^{-1}}\, \mathrm{d}z.

This integral can readily be evaluated numerically using the trapezium rule, which in this case converges exponentially: the number of correct digits roughly doubles when the number of nodes is doubled.
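A minimal sketch of this approach (illustrative, assuming all eigenvalues of A lie strictly inside the chosen circle): parametrizing the circle as z = r e^{i\theta} turns the contour integral into an integral of a periodic function, and the trapezium rule with N equispaced nodes reduces to a plain average over scaled roots of unity:

```python
import numpy as np

def funm_cauchy(f, A, radius, nodes=64):
    """Approximate f(A) by the trapezium rule applied to the Cauchy
    integral over the circle |z| = radius.

    With z_j = radius * exp(2*pi*i*j/N), substituting dz = i*z dtheta
    reduces the rule to f(A) ~ (1/N) * sum_j z_j f(z_j) inv(z_j I - A).
    """
    n = A.shape[0]
    total = np.zeros((n, n), dtype=complex)
    for j in range(nodes):
        z = radius * np.exp(2j * np.pi * j / nodes)
        total += z * f(z) * np.linalg.inv(z * np.eye(n) - A)
    return total / nodes

A = np.array([[0.0, 1.0], [-1.0, 0.0]])     # eigenvalues +i and -i
expA = funm_cauchy(np.exp, A, radius=2.0)   # circle safely encloses them
# exp(A) is the rotation matrix [[cos 1, sin 1], [-sin 1, cos 1]]
print(np.allclose(expA.real, [[np.cos(1), np.sin(1)], [-np.sin(1), np.cos(1)]]))
```

The error decays like (rho(A)/radius)^N, so 64 nodes here are far more than enough.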

This idea, applied to bounded linear operators on a Banach space (which can be seen as infinite matrices), leads to the holomorphic functional calculus.
