Marcinkiewicz–Zygmund inequality
In mathematics, the Marcinkiewicz–Zygmund inequality, named after Józef Marcinkiewicz and Antoni Zygmund, relates moments of sums of independent random variables to moments of the corresponding sums of squares. It generalizes the rule for the sum of variances of independent random variables to moments of arbitrary order, and it is the special case of the Burkholder–Davis–Gundy inequality for discrete-time martingales.
Statement of the inequality
Theorem [1][2] If [math]\displaystyle{ \textstyle X_{i} }[/math], [math]\displaystyle{ \textstyle i=1,\ldots,n }[/math], are independent random variables such that [math]\displaystyle{ \textstyle E\left( X_{i}\right) =0 }[/math] and [math]\displaystyle{ \textstyle E\left( \left\vert X_{i}\right\vert ^{p}\right) \lt +\infty }[/math], [math]\displaystyle{ \textstyle 1\leq p\lt +\infty }[/math], then
- [math]\displaystyle{ A_{p}E\left( \left( \sum_{i=1}^{n}\left\vert X_{i}\right\vert ^{2}\right)^{p/2}\right) \leq E\left( \left\vert \sum_{i=1}^{n}X_{i}\right\vert ^{p}\right) \leq B_{p}E\left( \left( \sum_{i=1}^{n}\left\vert X_{i}\right\vert ^{2}\right)^{p/2}\right) }[/math]
where [math]\displaystyle{ \textstyle A_{p} }[/math] and [math]\displaystyle{ \textstyle B_{p} }[/math] are positive constants, which depend only on [math]\displaystyle{ \textstyle p }[/math] and not on the underlying distribution of the random variables involved.
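Because the constants are distribution-free, the inequality is easy to probe numerically. The following is a minimal Monte Carlo sketch (assuming Python with NumPy; the choice [math]\displaystyle{ \textstyle p=4 }[/math], the Rademacher variables, and all parameter values are illustrative and not from the source) that estimates both sides of the inequality and their ratio, which must lie between [math]\displaystyle{ \textstyle A_{p} }[/math] and [math]\displaystyle{ \textstyle B_{p} }[/math]:

```python
import numpy as np

# Monte Carlo sketch of the Marcinkiewicz–Zygmund inequality for p = 4,
# using i.i.d. Rademacher variables X_i with P(X_i = ±1) = 1/2, so that
# E(X_i) = 0 and E|X_i|^p < +infinity, as the theorem requires.
rng = np.random.default_rng(0)
p, n, trials = 4, 50, 100_000

X = rng.choice([-1.0, 1.0], size=(trials, n))

lhs = np.mean(np.abs(X.sum(axis=1)) ** p)         # estimates E|sum_i X_i|^p
rhs = np.mean((X ** 2).sum(axis=1) ** (p / 2))    # estimates E(sum_i X_i^2)^(p/2)

print(f"E|S_n|^p           ~ {lhs:.1f}")
print(f"E(sum X_i^2)^(p/2) ~ {rhs:.1f}")
print(f"ratio              ~ {lhs / rhs:.3f}")    # empirical ratio, in [A_p, B_p]
```

For Rademacher variables [math]\displaystyle{ \textstyle \sum_{i=1}^{n}X_{i}^{2}=n }[/math] exactly, so the right-hand expectation equals [math]\displaystyle{ \textstyle n^{p/2} }[/math], while [math]\displaystyle{ \textstyle E\left( S_{n}^{4}\right) =3n^{2}-2n }[/math]; the printed ratio therefore approaches 3 for large [math]\displaystyle{ \textstyle n }[/math], and by the theorem it is guaranteed to stay between [math]\displaystyle{ \textstyle A_{4} }[/math] and [math]\displaystyle{ \textstyle B_{4} }[/math].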
The second-order case
In the case [math]\displaystyle{ \textstyle p=2 }[/math], the inequality holds with [math]\displaystyle{ \textstyle A_{2}=B_{2}=1 }[/math], and it reduces to the rule for the sum of variances of independent random variables with zero mean, known from elementary statistics: If [math]\displaystyle{ \textstyle E\left( X_{i}\right) =0 }[/math] and [math]\displaystyle{ \textstyle E\left( \left\vert X_{i}\right\vert ^{2}\right) \lt +\infty }[/math], then
- [math]\displaystyle{ \mathrm{Var}\left(\sum_{i=1}^{n}X_{i}\right)=E\left( \left\vert \sum_{i=1}^{n}X_{i}\right\vert ^{2}\right) =\sum_{i=1}^{n}\sum_{j=1}^{n}E\left( X_{i}\overline{X}_{j}\right) =\sum_{i=1}^{n}E\left( \left\vert X_{i}\right\vert ^{2}\right) =\sum_{i=1}^{n}\mathrm{Var}\left(X_{i}\right). }[/math]
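The third equality holds because independence and the zero-mean assumption make the cross terms vanish: for [math]\displaystyle{ \textstyle i\neq j }[/math],

- [math]\displaystyle{ E\left( X_{i}\overline{X}_{j}\right) =E\left( X_{i}\right) E\left( \overline{X}_{j}\right) =0, }[/math]

so only the diagonal terms [math]\displaystyle{ \textstyle i=j }[/math] survive in the double sum.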
See also
Closely related moment inequalities include the Khintchine inequality and the Rosenthal inequalities, and there are also extensions to more general symmetric statistics of independent random variables.[3]
Notes
- [1] J. Marcinkiewicz and A. Zygmund. "Sur les fonctions indépendantes" [On independent functions]. Fundamenta Mathematicae, 28:60–90, 1937. Reprinted in Józef Marcinkiewicz, Collected Papers, edited by Antoni Zygmund, Państwowe Wydawnictwo Naukowe, Warsaw, 1964, pp. 233–259.
- [2] Yuan Shih Chow and Henry Teicher. Probability Theory: Independence, Interchangeability, Martingales. Second edition, Springer-Verlag, New York, 1988.
- [3] R. Ibragimov and Sh. Sharakhmetov. "Analogues of Khintchine, Marcinkiewicz–Zygmund and Rosenthal inequalities for symmetric statistics". Scandinavian Journal of Statistics, 26(4):621–633, 1999.
Original source: https://en.wikipedia.org/wiki/Marcinkiewicz–Zygmund_inequality