# Moment problem

In short, the moment problem asks whether a measure can be recovered from the sequence of its moments. For example, given the mean and variance $\displaystyle{ \sigma^2 }$ (with all higher cumulants equal to zero), the normal distribution is the distribution solving the moment problem.

In mathematics, a moment problem arises as the result of trying to invert the mapping that takes a measure μ to the sequence of moments

$\displaystyle{ m_n = \int_{-\infty}^\infty x^n \,d\mu(x)\,. }$

More generally, one may consider

$\displaystyle{ m_n = \int_{-\infty}^\infty M_n(x) \,d\mu(x) }$

for an arbitrary sequence of functions $M_n$.
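As a concrete illustration of the classical case $M_n(x) = x^n$, the sketch below (pure Python, with a hypothetical `moment` helper) approximates the first few moments of the standard normal distribution by a midpoint Riemann sum; the odd moments vanish and the even ones are the double factorials $1, 3, 15, \dots$

```python
import math

def normal_density(x):
    """Density of the standard normal distribution."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def moment(n, density, lo=-10.0, hi=10.0, steps=100_000):
    """Approximate m_n = ∫ x^n dμ(x) by a midpoint Riemann sum;
    [lo, hi] and steps are illustrative accuracy knobs."""
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        x = lo + (i + 0.5) * h
        total += (x ** n) * density(x) * h
    return total

moments = [moment(n, normal_density) for n in range(5)]
# Odd moments vanish; the even moments m_0, m_2, m_4 are 1, 1, 3.
```

A quadrature routine would be used in practice; the crude sum is only meant to make the defining integral tangible.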

## Introduction

In the classical setting, μ is a measure on the real line, and $M$ is the sequence $\{ x^n : n = 0, 1, 2, \ldots \}$. In this form the question appears in probability theory, asking whether there is a probability measure having specified mean, variance and so on, and whether it is unique.

There are three named classical moment problems: the Hamburger moment problem in which the support of μ is allowed to be the whole real line; the Stieltjes moment problem, for [0, +∞); and the Hausdorff moment problem for a bounded interval, which without loss of generality may be taken as [0, 1].

## Existence

A sequence of numbers $m_n$ is the sequence of moments of a measure μ if and only if a certain positivity condition is fulfilled; namely, the Hankel matrices $H_n$,

$\displaystyle{ (H_n)_{ij} = m_{i+j}\,, }$

should be positive semi-definite. This is because a positive semi-definite Hankel matrix corresponds to a linear functional $\displaystyle{ \Lambda }$ on $\displaystyle{ \mathbb{R}[x] }$ such that $\displaystyle{ \Lambda(x^n) = m_n }$ and $\displaystyle{ \Lambda(f^2) \geq 0 }$ (that is, $\displaystyle{ \Lambda }$ is non-negative on sums of squares of polynomials). In the univariate case, a non-negative polynomial can always be written as a sum of squares, so $\displaystyle{ \Lambda }$ is non-negative on all non-negative polynomials. By Haviland's theorem, such a functional is represented by a measure, that is $\displaystyle{ \Lambda(x^n) = \int_{-\infty}^{\infty} x^n d \mu }$. A condition of similar form is necessary and sufficient for the existence of a measure $\displaystyle{ \mu }$ supported on a given interval [a, b].
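The positivity condition can be tested numerically. The sketch below (pure Python; the `hankel` and `is_positive_definite` helpers are hypothetical) builds $(H_n)_{ij} = m_{i+j}$ from the moments of the standard normal distribution and checks positive definiteness by attempting a Cholesky factorization.

```python
import math

def hankel(moments, n):
    """Hankel matrix (H_n)_{ij} = m_{i+j} for i, j = 0, ..., n."""
    return [[moments[i + j] for j in range(n + 1)] for i in range(n + 1)]

def is_positive_definite(a, tol=1e-12):
    """Try a Cholesky factorization; success means the matrix is
    positive definite (a sufficient check for the PSD condition)."""
    n = len(a)
    l = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(l[i][k] * l[j][k] for k in range(j))
            if i == j:
                d = a[i][i] - s
                if d <= tol:
                    return False
                l[i][i] = math.sqrt(d)
            else:
                l[i][j] = (a[i][j] - s) / l[j][j]
    return True

# Moments of the standard normal: 0 for odd n, (n - 1)!! for even n.
normal_moments = [1, 0, 1, 0, 3, 0, 15]
```

Here `is_positive_definite(hankel(normal_moments, 3))` returns True, while a sequence such as `[1, 0, -1]` (an impossible, negative second moment) fails the test.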

One way to prove these results is to consider the linear functional $\displaystyle{ \varphi }$ that sends a polynomial

$\displaystyle{ P(x) = \sum_k a_k x^k }$

to

$\displaystyle{ \sum_k a_k m_k. }$

If $m_k$ are the moments of some measure μ supported on [a, b], then evidently

$\displaystyle{ \varphi(P) \ge 0 }$ for any polynomial P that is non-negative on [a, b].

(1)

Conversely, if (1) holds, one can apply the M. Riesz extension theorem and extend $\displaystyle{ \varphi }$ to a functional on the space of continuous functions with compact support $C_0([a, b])$, so that

$\displaystyle{ \varphi(f) \ge 0 }$ for any $\displaystyle{ f \in C_0([a,b]),\;f\ge 0. }$

(2)

By the Riesz representation theorem, (2) holds iff there exists a measure μ supported on [a, b], such that

$\displaystyle{ \varphi(f) = \int f \, d\mu }$

for every $f \in C_0([a, b])$.

Thus the existence of the measure $\displaystyle{ \mu }$ is equivalent to (1). Using a representation theorem for positive polynomials on [a, b], one can reformulate (1) as a condition on Hankel matrices.
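Condition (1) can be checked directly on examples. The sketch below (pure Python with exact rational arithmetic; the names are illustrative) evaluates the functional $\displaystyle{ \varphi(P) = \sum_k a_k m_k }$ for Lebesgue measure on [0, 1], whose moments are $m_k = 1/(k+1)$, on the polynomial $P(x) = (x - 1/2)^2 \geq 0$.

```python
from fractions import Fraction

# Moments of Lebesgue measure on [0, 1]: m_k = ∫_0^1 x^k dx = 1/(k + 1).
m = [Fraction(1, k + 1) for k in range(5)]

def phi(coeffs, moments):
    """phi(P) = sum_k a_k m_k for P(x) = sum_k a_k x^k."""
    return sum(a * mk for a, mk in zip(coeffs, moments))

# P(x) = (x - 1/2)^2 = 1/4 - x + x^2 is non-negative on [0, 1],
# so condition (1) requires phi(P) >= 0; here phi(P) = 1/12,
# which is exactly ∫_0^1 (x - 1/2)^2 dx, the variance of the
# uniform distribution on [0, 1].
p = [Fraction(1, 4), Fraction(-1), Fraction(1)]
```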

See Shohat & Tamarkin 1943 and Krein & Nudelman 1977 for more details.

## Uniqueness (or determinacy)

The uniqueness of μ in the Hausdorff moment problem follows from the Weierstrass approximation theorem, which states that polynomials are dense under the uniform norm in the space of continuous functions on [0, 1]. For the problem on an infinite interval, uniqueness is a more delicate question; see Carleman's condition, Krein's condition and (Akhiezer 1965). There are distributions, such as the log-normal distribution, whose moments are all finite but which share those moments with other, distinct distributions.
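Carleman's condition makes the contrast concrete: the Hamburger problem for μ is determinate if $\displaystyle{ \sum_n m_{2n}^{-1/(2n)} }$ diverges. The sketch below (pure Python, illustrative helper names) compares the terms for the standard normal, where $m_{2n} = (2n-1)!!$ and the terms decay like $n^{-1/2}$, with those for a standard log-normal, where $m_{2n} = e^{2n^2}$ and the terms are $e^{-n}$.

```python
import math

def log_double_factorial(n):
    """log((2n - 1)!!) = log of the 2n-th moment of the standard normal."""
    return sum(math.log(k) for k in range(1, 2 * n, 2))

def carleman_terms(log_moment_2n, count):
    """Terms m_{2n}^{-1/(2n)} of Carleman's series, computed from
    log m_{2n} to avoid overflow for rapidly growing moments."""
    return [math.exp(-log_moment_2n(n) / (2 * n)) for n in range(1, count + 1)]

# Standard normal: terms decay like n^{-1/2}, the series diverges,
# and the distribution is determined by its moments.
normal_terms = carleman_terms(log_double_factorial, 200)

# Standard log-normal: m_{2n} = exp(2 n^2), so the terms are exp(-n);
# the series converges, so Carleman's criterion gives no conclusion --
# consistent with the log-normal being moment-indeterminate.
lognormal_terms = carleman_terms(lambda n: 2.0 * n * n, 200)
```

Note that divergence is only sufficient for determinacy; the convergent log-normal series does not by itself prove indeterminacy, which requires exhibiting a second distribution with the same moments.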

## Variations

An important variation is the truncated moment problem, which studies the properties of measures with fixed first k moments (for a finite k). Results on the truncated moment problem have numerous applications to extremal problems, optimisation and limit theorems in probability theory. See also: Chebyshev–Markov–Stieltjes inequalities and Krein & Nudelman 1977.
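For the smallest truncated problem, with $m_0, m_1, m_2$ given, existence of a representing measure on the real line reduces to positive semi-definiteness of the 2 × 2 Hankel matrix. A minimal sketch (hypothetical helper name):

```python
def truncated_feasible(m0, m1, m2):
    """Existence test for a measure with prescribed moments m0, m1, m2:
    the Hankel matrix [[m0, m1], [m1, m2]] must be positive semi-definite,
    i.e. m0 >= 0, m2 >= 0 and m0*m2 - m1**2 >= 0."""
    return m0 >= 0 and m2 >= 0 and m0 * m2 - m1 * m1 >= 0

# For a probability measure (m0 = 1) the determinant condition is just
# m2 - m1**2 >= 0, i.e. a non-negative variance.
```

For instance, `truncated_feasible(1, 0, 1)` holds (mean 0, variance 1), while `truncated_feasible(1, 2, 1)` fails, since no measure can have mean 2 and second moment 1.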