Hankel matrix

In linear algebra, a Hankel matrix, named after Hermann Hankel, is a square matrix with constant (positive-sloping) skew-diagonals, for example:

\begin{bmatrix} a & b & c & d & e \\ b & c & d & e & f \\ c & d & e & f & g \\  d & e & f & g & h \\ e & f & g & h & i \\ \end{bmatrix}

In mathematical terms:

a_{i,j} = a_{i-1,j+1}

The Hankel matrix is closely related to the Toeplitz matrix (a Hankel matrix is an upside-down Toeplitz matrix).
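
For a concrete illustration, a finite Hankel matrix is determined by its first column and last row, and every entry then satisfies the identity above. The following sketch uses NumPy and SciPy's hankel helper; the numeric values simply stand in for the entries a, b, ..., i of the example matrix.

import numpy as np
from scipy.linalg import hankel

# First column and last row determine a finite Hankel matrix;
# scipy.linalg.hankel(c, r) takes c as the first column and r as the last row.
first_col = np.array([1, 2, 3, 4, 5])   # stands in for a, b, c, d, e
last_row = np.array([5, 6, 7, 8, 9])    # stands in for e, f, g, h, i
H = hankel(first_col, last_row)

# Every ascending skew-diagonal is constant: H[i, j] == H[i - 1, j + 1].
assert all(H[i, j] == H[i - 1, j + 1]
           for i in range(1, H.shape[0])
           for j in range(H.shape[1] - 1))
print(H)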

A Hankel operator on a Hilbert space is one whose matrix with respect to an orthonormal basis is an infinite Hankel matrix (a_{i,j})_{i,j \ge 0}, where a_{i,j} depends only on i + j.

Hankel transform

The Hankel transform is the name sometimes given to the transformation of a sequence whose terms are the determinants of the Hankel matrices formed from the original sequence. That is, the sequence {h_n} is the Hankel transform of the sequence {b_n} when

h_n = \det (b_{i+j})_{0 \le i,j \le n}

Here, a_{i,j} = b_{i+j} is the Hankel matrix of the sequence {b_n}. The Hankel transform is invariant under the binomial transform of a sequence. That is, if one writes

c_n = \sum_{k=0}^n {n \choose k} b_k

as the binomial transform of the sequence {b_n}, then one has

\det (b_{i+j})_{0 \le i,j \le n} = \det (c_{i+j})_{0 \le i,j \le n}
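
As a check of both the definition and the invariance claim, the sketch below computes the first few Hankel determinants of a sequence and of its binomial transform. The Catalan numbers are used here only as a convenient example, since their Hankel transform is the all-ones sequence; the helper names are illustrative choices.

import numpy as np
from math import comb

def hankel_transform(b, n_max):
    """h_n = det(b_{i+j}) for 0 <= i, j <= n, computed for n = 0 .. n_max."""
    return [round(np.linalg.det(np.array(
                [[b[i + j] for j in range(n + 1)] for i in range(n + 1)],
                dtype=float)))
            for n in range(n_max + 1)]

def binomial_transform(b):
    """c_n = sum_k C(n, k) * b_k."""
    return [sum(comb(n, k) * b[k] for k in range(n + 1)) for n in range(len(b))]

# Catalan numbers, whose Hankel transform is the all-ones sequence.
catalan = [1, 1, 2, 5, 14, 42, 132, 429, 1430]
print(hankel_transform(catalan, 3))                      # [1, 1, 1, 1]
print(hankel_transform(binomial_transform(catalan), 3))  # [1, 1, 1, 1] -- invariance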

Hankel matrices for system identification

Hankel matrices are formed when, given a sequence of output data, a realization of an underlying state-space or hidden Markov model is desired. The singular value decomposition of the Hankel matrix provides a means of computing the A, B, and C matrices which define the state-space realization.
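
A minimal sketch of this idea, in the style of the Ho-Kalman algorithm, is given below. It assumes the output data are the Markov (impulse-response) parameters of a discrete-time system; the particular 2-state system, matrix sizes, and variable names are hypothetical choices for illustration, not an algorithm prescribed by this article.

import numpy as np

# Hypothetical 2-state system used only to generate Markov parameters
# h_{k+1} = C A^k B; in practice these would come from measured impulse-response data.
A_true = np.array([[0.9, 0.2], [0.0, 0.5]])
B_true = np.array([[1.0], [1.0]])
C_true = np.array([[1.0, 0.0]])
markov = [(C_true @ np.linalg.matrix_power(A_true, k) @ B_true).item()
          for k in range(20)]

# Hankel matrix of Markov parameters: H[i, j] = h_{i+j+1} = C A^(i+j) B.
rows, cols, n_states = 8, 8, 2
H = np.array([[markov[i + j] for j in range(cols)] for i in range(rows)])

# Truncated SVD factors H into extended observability and controllability matrices.
U, s, Vt = np.linalg.svd(H)
Obs = U[:, :n_states] * np.sqrt(s[:n_states])          # [C; CA; CA^2; ...]
Ctr = np.sqrt(s[:n_states])[:, None] * Vt[:n_states]   # [B, AB, A^2 B, ...]

A = np.linalg.pinv(Obs[:-1]) @ Obs[1:]   # shift structure of the observability matrix
B = Ctr[:, :1]
C = Obs[:1, :]

# The recovered (A, B, C) reproduce the Markov parameters up to numerical error.
recovered = [(C @ np.linalg.matrix_power(A, k) @ B).item() for k in range(20)]
assert np.allclose(markov, recovered)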

Orthogonal polynomials on the real line

Positive Hankel matrices and the Hamburger moment problem

Tridiagonal model of positive Hankel operators
