Hamburger moment problem

In mathematics, the Hamburger moment problem, named after Hans Ludwig Hamburger, is formulated as follows: given a sequence { m_n : n = 0, 1, 2, ... }, does there exist a positive Borel measure μ on the real line such that

m_n = \int_{-\infty}^{\infty} x^n \, d\mu(x)\ ?

In other words, an affirmative answer to the problem means that { m_n : n = 0, 1, 2, ... } is the sequence of moments of some positive Borel measure μ.

The Stieltjes moment problem, the Vorobyev moment problem, and the Hausdorff moment problem are similar, but replace the real line by [0, +∞) (Stieltjes and Vorobyev; Vorobyev formulates the problem in terms of matrix theory) or by a bounded interval (Hausdorff).

Characterization

The Hamburger moment problem is solvable (that is, {m_n} is a sequence of moments) if and only if the corresponding Hankel kernel on the nonnegative integers

A = \left( \begin{matrix} m_0 & m_1 & m_2 & \cdots \\ m_1 & m_2 & m_3 & \cdots \\ m_2 & m_3 & m_4 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{matrix} \right)

is positive definite, i.e.,

\sum_{j,k \geq 0} m_{j+k} c_j \bar{c}_k \geq 0

for an arbitrary sequence {c_j}_{j ≥ 0} of complex numbers with finite support (i.e. c_j = 0 except for finitely many values of j).
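As a concrete illustration (the Gaussian example and the code below are a sketch, not part of the article), one can verify the criterion numerically for the moments of the standard Gaussian measure, m_{2k} = (2k − 1)!! and m_{2k+1} = 0, by checking that every truncated Hankel matrix is positive semidefinite:

import numpy as np

def gaussian_moment(n):
    # n-th moment of the standard normal distribution: (n - 1)!! for even n, 0 for odd n
    if n % 2 == 1:
        return 0.0
    return float(np.prod(np.arange(n - 1, 0, -2))) if n > 0 else 1.0

moments = [gaussian_moment(n) for n in range(11)]            # m_0, ..., m_10

for size in range(1, 6):                                     # leading (size x size) truncations of A
    hankel = np.array([[moments[j + k] for k in range(size)] for j in range(size)])
    print(size, np.linalg.eigvalsh(hankel).min() >= -1e-12)  # True: positive semidefinite up to round-off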

The "only if" part of the claims can be verified by a direct calculation.

We sketch an argument for the converse. Let Z+ denote the nonnegative integers and F0(Z+) the family of complex-valued sequences on Z+ with finite support. The positive Hankel kernel A induces a (possibly degenerate) sesquilinear product on F0(Z+). This in turn gives a Hilbert space

(\mathcal{H}, \langle \cdot , \cdot \rangle)

whose typical element is an equivalence class denoted by [f].

Let e_n be the element of F0(Z+) defined by e_n(m) = δ_{nm}. One notices that

\langle [e_{n+1}], [e_m] \rangle = A_{m,n+1} = m_{m+n+1} = \langle [e_n], [e_{m+1}] \rangle .

Therefore the "shift" operator T on {\mathcal  {H}}, with T[en] = [en + 1], is symmetric.

On the other hand, the desired expression

m_n = \int_{-\infty}^{\infty} x^n \, d\mu(x)

suggests that μ is the spectral measure of a self-adjoint operator. If we can find a "function model" such that the symmetric operator T is multiplication by x, then the spectral resolution of a self-adjoint extension of T proves the claim.

A function model is given by the natural isomorphism from F0(Z+) to the family of polynomials in a single real variable with complex coefficients: for n ≥ 0, identify e_n with x^n. In this model, the operator T is multiplication by x, a densely defined symmetric operator. It can be shown that T always has self-adjoint extensions. Let

\bar{T}

be one of them and μ be its spectral measure. So

\langle \bar{T}^n [1], [1] \rangle = \int x^n \, d\mu(x).

On the other hand,

\langle \bar{T}^n [1], [1] \rangle = \langle T^n [e_0], [e_0] \rangle = m_n.

Uniqueness of solutions

The solutions form a convex set, so the problem has either infinitely many solutions or a unique solution.

Consider the (n + 1)×(n + 1) Hankel matrix

\Delta_n = \left[ \begin{matrix} m_0 & m_1 & m_2 & \cdots & m_n \\ m_1 & m_2 & m_3 & \cdots & m_{n+1} \\ m_2 & m_3 & m_4 & \cdots & m_{n+2} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ m_n & m_{n+1} & m_{n+2} & \cdots & m_{2n} \end{matrix} \right].

Positivity of A means that for each n, det(Δ_n) ≥ 0. If det(Δ_n) = 0 for some n, then

(\mathcal{H}, \langle \cdot , \cdot \rangle)

is finite dimensional and T is self-adjoint. So in this case the solution to the Hamburger moment problem is unique and μ, being the spectral measure of T, has finite support.
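For a concrete check (the two-point measure below is an illustrative choice, not from the article), take μ = (δ_{−1} + δ_{1})/2, whose moments are m_n = 1 for even n and 0 for odd n; det(Δ_n) vanishes as soon as n reaches the number of support points:

import numpy as np

moments = [1 if n % 2 == 0 else 0 for n in range(9)]        # moments of (delta_{-1} + delta_{1}) / 2

for n in range(1, 5):
    delta_n = np.array([[moments[j + k] for k in range(n + 1)] for j in range(n + 1)])
    print(n, round(np.linalg.det(delta_n), 12))             # 1.0 for n = 1, then 0.0 for n >= 2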

More generally, the solution is unique if there are constants C and D such that for all n, |m_n| ≤ C D^n n! (Reed & Simon 1975, p. 205). This follows from the more general Carleman's condition.
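For instance (an example not spelled out in the article), the standard Gaussian measure has moments m_{2k} = (2k − 1)!! and m_{2k+1} = 0, so

|m_n| \leq n! \quad (n \geq 0),

and the bound holds with C = D = 1; the Gaussian moment problem is therefore determinate.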

There are examples where the solution is not unique; a classical example is the log-normal distribution, which shares all of its moments with an infinite family of other distributions.

Further results

One can see that the Hamburger moment problem is intimately related to orthogonal polynomials on the real line. The Gram–Schmidt procedure gives a basis of orthogonal polynomials in which the operator

\bar{T}

has a tridiagonal Jacobi matrix representation. This in turn leads to a tridiagonal model of positive Hankel kernels.
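To make this correspondence concrete, the following sketch (an illustration using the Gaussian moments as input; the helper names are mine, not from the article) performs Gram–Schmidt on the monomials 1, x, x^2, ... in the inner product <x^j, x^k> = m_{j+k} and reads off the tridiagonal Jacobi coefficients of multiplication by x. For the Gaussian moments this recovers the Hermite recurrence, with zero diagonal and off-diagonal entries sqrt(n + 1):

import numpy as np

# Moments of the standard Gaussian: m_{2k} = (2k - 1)!!, odd moments vanish.
moments = [1, 0, 1, 0, 3, 0, 15, 0, 105, 0, 945, 0]
N = 5                                                        # number of orthonormal polynomials

def inner(p, q):
    # inner product induced by the Hankel kernel: <x^j, x^k> = m_{j+k}
    return sum(p[j] * q[k] * moments[j + k] for j in range(len(p)) for k in range(len(q)))

def shift(p):
    # multiply the polynomial with coefficient vector p by x
    return np.concatenate(([0.0], p))

basis = []                                                   # orthonormal polynomials p_0, ..., p_{N-1}
for n in range(N):
    p = np.zeros(n + 1)
    p[n] = 1.0                                               # start from the monomial x^n
    for q in basis:
        p[:len(q)] -= inner(p, q) * q                        # Gram-Schmidt projection step
    basis.append(p / np.sqrt(inner(p, p)))

a = [inner(shift(p), p) for p in basis]                              # Jacobi diagonal <x p_n, p_n>
b = [inner(shift(basis[n]), basis[n + 1]) for n in range(N - 1)]     # off-diagonal <x p_n, p_{n+1}>
print(np.round(a, 6))                                        # [0. 0. 0. 0. 0.]
print(np.round(b, 6))                                        # [1.  1.414214  1.732051  2.]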

An explicit calculation of the Cayley transform of T shows the connection with what is called the Nevanlinna class of analytic functions on the left half plane. Passing to the non-commutative setting, this motivates Krein's formula which parametrizes the extensions of partial isometries.

The cumulative distribution function and the probability density function can often be found by applying the inverse Laplace transform to the moment generating function

m(t) = \sum_{n=0}^{\infty} m_n \frac{t^n}{n!},

provided that this series converges.
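As a hedged sketch of this recipe (the exponential example and the use of SymPy are illustrative choices, not from the article): the moments m_n = n! give m(t) = 1/(1 − t), and for a density f supported on [0, ∞) the quantity m(−s) equals the Laplace transform of f, so inverting m(−s) = 1/(1 + s) recovers the exponential density e^{−x}:

from sympy import symbols, inverse_laplace_transform

s, x = symbols('s x', positive=True)
mgf_at_minus_s = 1 / (1 + s)                     # m(-s) for the moments m_n = n!
density = inverse_laplace_transform(mgf_at_minus_s, s, x)
print(density)                                   # exp(-x) (SymPy may include a unit-step factor)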

References

  • Reed, Michael; Simon, Barry (1975), Methods of Modern Mathematical Physics, vol. 2: Fourier Analysis, Self-Adjointness, Academic Press, pp. 145, 205, ISBN 0-12-585002-6.
  • Shohat, J. A.; Tamarkin, J. D. (1943), The Problem of Moments, New York: American Mathematical Society, ISBN 0-8218-1501-6.