Hausdorff moment problem
In mathematics, the Hausdorff moment problem, named after Felix Hausdorff, asks for necessary and sufficient conditions that a given sequence { μ_n : n = 0, 1, 2, ... } be the sequence of moments

μ_n = ∫_0^1 x^n dF(x)

of some probability distribution, with cumulative distribution function F constant outside the closed unit interval [0, 1]. (Equivalently, F is the distribution function of a random variable X taking values in [0, 1] almost surely, and μ_n = E[X^n].)
The essential difference between this and other well-known moment problems is that the Hausdorff problem is posed on a bounded interval, whereas in the Stieltjes moment problem one considers a half-line [0, ∞), and in the Hamburger moment problem one considers the whole line (−∞, ∞).
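To make the objects concrete, here is a minimal sketch (not part of the original article) that numerically computes the first few moments μ_n = ∫_0^1 x^n dF(x) for a distribution supported on [0, 1]; the choice of the Beta(2, 3) distribution is purely illustrative.

```python
# Illustrative sketch: compute mu_n = integral of x^n dF(x) over [0, 1]
# by numerical integration. Beta(2, 3) is an arbitrary example distribution.
from scipy import stats
from scipy.integrate import quad

dist = stats.beta(2, 3)  # example distribution supported on [0, 1]

def moment(n: int) -> float:
    """n-th moment mu_n = E[X^n], computed by integrating x^n against the density."""
    value, _ = quad(lambda x: x ** n * dist.pdf(x), 0.0, 1.0)
    return value

moments = [moment(n) for n in range(8)]
print(moments)  # mu_0 = 1.0, mu_1 = E[X] = 0.4, mu_2 = 0.2, ...
```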
In 1921, Hausdorff showed that { μ_n : n = 0, 1, 2, ... } is such a moment sequence if and only if all of the differences

Δ^k μ_n,   n, k = 0, 1, 2, ...,

are non-negative (such a sequence is said to be completely monotone), where Δ is the difference operator given by

Δμ_n = μ_n − μ_{n+1}.
For example, it is necessary to have

Δ^4 μ_6 = μ_6 − 4μ_7 + 6μ_8 − 4μ_9 + μ_{10} ≥ 0.

When one considers that this is the same as

E[X^6 (1 − X)^4] ≥ 0,

or, generally,

Δ^k μ_n = E[X^n (1 − X)^k] ≥ 0,

then the necessity of these conditions becomes obvious, since the random variable X^n (1 − X)^k is non-negative whenever X takes values in [0, 1].
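As a rough numerical check (an illustration, not part of the article), one can compute the finite differences Δ^k μ_n from a prefix of a candidate sequence and verify that they are all non-negative. The moments μ_n = 1/(n+1) of the uniform distribution on [0, 1] pass the test, while a sequence growing like 2^n fails it.

```python
# Sketch: test whether a finite prefix of a sequence satisfies Hausdorff's
# non-negativity conditions, using the convention Δμ_n = μ_n − μ_{n+1}.
from math import comb

def is_completely_monotone(mu, tol=1e-12):
    """Check Δ^k μ_n ≥ 0 for every n, k that the prefix mu[0..N-1] allows;
    Δ^k μ_n only uses the entries mu[n], ..., mu[n+k]."""
    N = len(mu)
    for k in range(N):
        for n in range(N - k):
            # Δ^k μ_n = sum_{j=0}^{k} (−1)^j C(k, j) μ_{n+j}
            diff = sum((-1) ** j * comb(k, j) * mu[n + j] for j in range(k + 1))
            if diff < -tol:
                return False
    return True

uniform_moments = [1.0 / (n + 1) for n in range(12)]           # moments of U[0, 1]
print(is_completely_monotone(uniform_moments))                  # True
print(is_completely_monotone([2.0 ** n for n in range(12)]))    # False
```

Of course, a finite prefix can only certify failure; passing the check for all computable differences is consistent with, but does not prove, that the full infinite sequence is a Hausdorff moment sequence.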
References
- Hausdorff, F. "Summationsmethoden und Momentfolgen. I." Mathematische Zeitschrift 9, 74–109, 1921.
- Hausdorff, F. "Summationsmethoden und Momentfolgen. II." Mathematische Zeitschrift 9, 280–299, 1921.
- Shohat, J. A.; Tamarkin, J. D. The Problem of Moments, American Mathematical Society, New York, 1943.