Method of moments (probability theory)

In probability theory, the method of moments is a way of proving convergence in distribution by proving convergence of moments.[1] Suppose X is a random variable, that X_1, X_2, X_3, … is a sequence of random variables, and that all of the moments

[math]\displaystyle{ \operatorname{E}(X^k)\, }[/math]

exist. Further suppose the probability distribution of X is completely determined by its moments, i.e., there is no other probability distribution with the same sequence of moments (cf. the problem of moments). For example, the normal distribution is determined by its moments, while the lognormal distribution is a classical example of a distribution that is not. If

[math]\displaystyle{ \lim_{n\to\infty}\operatorname{E}(X_n^k) = \operatorname{E}(X^k)\, }[/math]

for every positive integer k, then the sequence {X_n} converges to X in distribution.
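
As a numerical illustration of this statement (a minimal sketch; the Uniform(0,1) summands, the value of n, and the sample size are illustrative choices, not taken from the cited sources), one can estimate the moments of a standardized sum of i.i.d. uniform random variables by Monte Carlo and compare them with the moments of the standard normal distribution, which is determined by its moments:

  import numpy as np

  rng = np.random.default_rng(0)

  def standardized_sum(n, n_samples):
      # Draw n_samples realizations of X_n = (S_n - n/2) / sqrt(n/12),
      # where S_n is the sum of n i.i.d. Uniform(0,1) random variables,
      # so that X_n has mean 0 and variance 1.
      s = rng.uniform(0.0, 1.0, size=(n_samples, n)).sum(axis=1)
      return (s - n / 2.0) / np.sqrt(n / 12.0)

  def normal_moment(k):
      # k-th moment of the standard normal: 0 for odd k, (k-1)!! for even k.
      return 0.0 if k % 2 else float(np.prod(np.arange(k - 1, 0, -2)))

  x = standardized_sum(n=100, n_samples=100_000)
  for k in range(1, 7):
      print(k, np.mean(x**k), normal_moment(k))

For large n the sample moments approach 0, 1, 0, 3, 0, 15, the moments of the standard normal; since that distribution is determined by its moments, the method of moments gives convergence in distribution here, in line with the central limit theorem.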

The method of moments was introduced by Pafnuty Chebyshev for proving the central limit theorem; Chebyshev cited earlier contributions by Irénée-Jules Bienaymé.[2] More recently, it has been applied by Eugene Wigner to prove Wigner's semicircle law, and has since found numerous applications in the theory of random matrices.[3]
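
In that setting the method is typically applied to the empirical spectral distribution of a random matrix: the odd moments of Wigner's semicircle law vanish and its 2k-th moment equals the k-th Catalan number, so one shows that the spectral moments converge to these values. A minimal numerical sketch (the matrix size, Gaussian entries, and GOE-like normalization are illustrative choices, not taken from the sources cited here):

  import numpy as np
  from math import comb

  rng = np.random.default_rng(1)

  n = 2000
  a = rng.normal(size=(n, n))
  w = (a + a.T) / np.sqrt(2 * n)  # symmetric; off-diagonal entries have variance 1/n
  eigs = np.linalg.eigvalsh(w)

  def catalan(k):
      # k-th Catalan number = 2k-th moment of the semicircle law on [-2, 2]
      return comb(2 * k, k) // (k + 1)

  for k in range(1, 5):
      print(2 * k, np.mean(eigs ** (2 * k)), catalan(k))

As n grows, the empirical moments (1/n) Σ λ_i^{2k} approach 1, 2, 5, 14, …, the Catalan numbers, identifying the limiting spectral distribution as the semicircle law.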

Notes

  1. Prokhorov, A.V. "Moments, method of (in probability theory)". In M. Hazewinkel. Encyclopaedia of Mathematics (online). ISBN 1-4020-0609-8. http://encyclopediaofmath.org/index.php?title=Moments,_method_of_(in_probability_theory)&oldid=47882.
  2. Fischer, H. (2011). "4. Chebyshev's and Markov's Contributions". A History of the Central Limit Theorem: From Classical to Modern Probability Theory. Sources and Studies in the History of Mathematics and Physical Sciences. New York: Springer. ISBN 978-0-387-87856-0.
  3. Anderson, G.W.; Guionnet, A.; Zeitouni, O. (2010). "2.1". An Introduction to Random Matrices. Cambridge: Cambridge University Press. ISBN 978-0-521-19452-5.