Khintchine inequality
In mathematics, the Khintchine inequality, named after Aleksandr Khinchin (whose name has been transliterated into the Latin alphabet in several ways), is a theorem from probability that is also frequently used in analysis. Heuristically, it says that if we pick [math]\displaystyle{ N }[/math] complex numbers [math]\displaystyle{ x_1,\dots,x_N \in\mathbb{C} }[/math] and add them together, each multiplied by an independent random sign [math]\displaystyle{ \pm 1 }[/math], then the modulus of the sum will, on average, be comparable to [math]\displaystyle{ \sqrt{|x_1|^{2}+\cdots + |x_N|^{2}} }[/math].
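This heuristic is easy to check numerically. The sketch below (plain Python; the test vector, sample size, and seed are arbitrary illustrative choices, not from the article) averages [math]\displaystyle{ \left|\sum_n \varepsilon_n x_n\right| }[/math] over many random sign patterns and compares the result with the [math]\displaystyle{ \ell^2 }[/math] norm of the coefficients.

```python
import math
import random

def avg_signed_sum_modulus(x, trials=20000, seed=0):
    """Monte Carlo estimate of E|sum_n eps_n * x_n| for Rademacher signs eps_n."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s = sum(xi if rng.random() < 0.5 else -xi for xi in x)
        total += abs(s)
    return total / trials

x = [3.0, 1.0, 2.0, 0.5]                      # arbitrary example vector
l2 = math.sqrt(sum(xi * xi for xi in x))      # sqrt(|x_1|^2 + ... + |x_N|^2)
est = avg_signed_sum_modulus(x)
# The estimate is comparable to the l2 norm; by Cauchy-Schwarz it cannot
# exceed l2, and Khintchine's inequality bounds it from below as well.
print(est, l2)
```

For this vector the exact expected modulus is 3.125, somewhat below the [math]\displaystyle{ \ell^2 }[/math] norm [math]\displaystyle{ \sqrt{14.25}\approx 3.775 }[/math], consistent with the two-sided comparison the inequality makes precise.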
Statement
Let [math]\displaystyle{ \{\varepsilon_n\}_{n=1}^N }[/math] be i.i.d. random variables with [math]\displaystyle{ P(\varepsilon_n=\pm1)=\frac12 }[/math] for [math]\displaystyle{ n=1,\ldots, N }[/math], i.e., a sequence with Rademacher distribution. Let [math]\displaystyle{ 0\lt p\lt \infty }[/math] and let [math]\displaystyle{ x_1,\ldots,x_N\in \mathbb{C} }[/math]. Then
- [math]\displaystyle{ A_p \left( \sum_{n=1}^N |x_n|^2 \right)^{1/2} \leq \left(\operatorname{E} \left|\sum_{n=1}^N \varepsilon_n x_n\right|^p \right)^{1/p} \leq B_p \left(\sum_{n=1}^N |x_n|^2\right)^{1/2} }[/math]
for some constants [math]\displaystyle{ A_p,B_p\gt 0 }[/math] depending only on [math]\displaystyle{ p }[/math] (see Expected value for notation). The sharp values of the constants [math]\displaystyle{ A_p,B_p }[/math] were found by Haagerup (Haagerup 1981; see Nazarov & Podkorytov 2000 for a simpler proof). It is a simple matter to see that [math]\displaystyle{ A_p = 1 }[/math] when [math]\displaystyle{ p \ge 2 }[/math], and [math]\displaystyle{ B_p = 1 }[/math] when [math]\displaystyle{ 0 \lt p \le 2 }[/math].
Haagerup found that
- [math]\displaystyle{ \begin{align} A_p &= \begin{cases} 2^{1/2-1/p} & 0\lt p\le p_0, \\ 2^{1/2}(\Gamma((p+1)/2)/\sqrt{\pi})^{1/p} & p_0 \lt p \lt 2\\ 1 & 2 \le p \lt \infty \end{cases} \\ &\text{and} \\ B_p &= \begin{cases} 1 & 0 \lt p \le 2 \\ 2^{1/2}(\Gamma((p+1)/2)/\sqrt\pi)^{1/p} & 2 \lt p \lt \infty \end{cases}, \end{align} }[/math]
where [math]\displaystyle{ p_0\approx 1.847 }[/math] and [math]\displaystyle{ \Gamma }[/math] is the Gamma function. One may note in particular that for [math]\displaystyle{ p \gt 2 }[/math], the constant [math]\displaystyle{ B_p }[/math] equals the [math]\displaystyle{ L^p }[/math] norm [math]\displaystyle{ (\operatorname{E}|g|^p)^{1/p} }[/math] of a standard Gaussian variable [math]\displaystyle{ g }[/math], so the upper bound matches exactly the moments of a normal distribution.
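For small [math]\displaystyle{ N }[/math] the moments can be computed exactly by enumerating all [math]\displaystyle{ 2^N }[/math] sign patterns, which makes the two-sided bound with Haagerup's constants directly checkable. A sketch (the test vector and the exponents are arbitrary choices; real coefficients are used for simplicity):

```python
import math
from itertools import product

def haagerup_constants(p, p0=1.847):
    """Sharp constants (A_p, B_p) from Haagerup's theorem; p0 ~ 1.847."""
    g = 2 ** 0.5 * (math.gamma((p + 1) / 2) / math.sqrt(math.pi)) ** (1 / p)
    if p >= 2:
        return 1.0, g                         # A_p = 1 for p >= 2
    A = 2 ** (0.5 - 1 / p) if p <= p0 else g  # two regimes below p = 2
    return A, 1.0                             # B_p = 1 for 0 < p <= 2

def moment(x, p):
    """Exact (E|sum_n eps_n x_n|^p)^(1/p), enumerating all sign patterns."""
    N = len(x)
    total = sum(abs(sum(e * xi for e, xi in zip(eps, x))) ** p
                for eps in product((1, -1), repeat=N))
    return (total / 2 ** N) ** (1 / p)

x = [1.0, 2.0, 0.5, 1.5]                      # arbitrary example vector
l2 = math.sqrt(sum(xi * xi for xi in x))
for p in (1.0, 3.0, 4.0):
    A, B = haagerup_constants(p)
    m = moment(x, p)
    assert A * l2 <= m <= B * l2              # Khintchine with sharp constants
```

For [math]\displaystyle{ p = 2 }[/math] the moment equals the [math]\displaystyle{ \ell^2 }[/math] norm exactly, since the cross terms [math]\displaystyle{ \operatorname{E}[\varepsilon_m\varepsilon_n] }[/math] vanish for [math]\displaystyle{ m \neq n }[/math].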
Uses in analysis
The uses of this inequality are not limited to applications in probability theory. One example of its use in analysis is the following: if we let [math]\displaystyle{ T }[/math] be a linear operator between two Lp spaces [math]\displaystyle{ L^p(X,\mu) }[/math] and [math]\displaystyle{ L^p(Y,\nu) }[/math], [math]\displaystyle{ 1 \lt p \lt \infty }[/math], with bounded norm [math]\displaystyle{ \|T\|\lt \infty }[/math], then one can use Khintchine's inequality to show that
- [math]\displaystyle{ \left\|\left(\sum_{n=1}^N |Tf_n|^2 \right)^{1/2} \right\|_{L^p(Y,\nu)}\leq C_p \left\|\left(\sum_{n=1}^N |f_n|^2\right)^{1/2} \right\|_{L^p(X,\mu)} }[/math]
for some constant [math]\displaystyle{ C_p\gt 0 }[/math] depending only on [math]\displaystyle{ p }[/math] and [math]\displaystyle{ \|T\| }[/math].[citation needed]
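The article does not give the form of [math]\displaystyle{ C_p }[/math]; in the trivial special case of a diagonal (multiplier) operator on a finite measure space the inequality holds pointwise with [math]\displaystyle{ C_p = \|T\| }[/math], which the sketch below checks. (The five-point space, the multiplier, and the functions are illustrative choices, not from the text.)

```python
import math

def lp_norm(v, p):
    """L^p norm of a function on a finite set with counting measure."""
    return sum(abs(t) ** p for t in v) ** (1 / p)

def square_function(fs):
    """Pointwise square function (sum_n |f_n(x)|^2)^(1/2) of a family fs."""
    return [math.sqrt(sum(f[i] ** 2 for f in fs)) for i in range(len(fs[0]))]

# Diagonal operator T f = d * f on a 5-point space; its L^p -> L^p
# operator norm equals max|d_i| for every p.
d = [0.5, -2.0, 1.0, 0.25, -1.5]
T = lambda f: [di * fi for di, fi in zip(d, f)]
op_norm = max(abs(di) for di in d)

fs = [[1.0, 0.0, 2.0, 1.0, -1.0],
      [0.5, 3.0, -1.0, 0.0, 2.0]]
p = 3.0
lhs = lp_norm(square_function([T(f) for f in fs]), p)
rhs = lp_norm(square_function(fs), p)
assert lhs <= op_norm * rhs   # square-function bound with C_p = ||T||
```

For a general bounded operator the bound is not pointwise, and the standard proof instead applies [math]\displaystyle{ T }[/math] to the randomized sum [math]\displaystyle{ \sum_n \varepsilon_n f_n }[/math], uses Khintchine's inequality in both directions, and averages over the signs via Fubini's theorem.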
Generalizations
For the case of Rademacher random variables, Paweł Hitczenko showed[1] that the sharpest version is:
- [math]\displaystyle{ A \left(\sqrt{p}\left(\sum_{n=b+1}^N x_n^2\right)^{1/2} + \sum_{n=1}^b x_n\right) \leq \left(\operatorname{E} \left|\sum_{n=1}^N \varepsilon_n x_n\right|^p \right)^{1/p} \leq B \left(\sqrt{p}\left(\sum_{n=b+1}^N x_n^2\right)^{1/2} + \sum_{n=1}^b x_n\right) }[/math]
where [math]\displaystyle{ b = \lfloor p\rfloor }[/math], and [math]\displaystyle{ A }[/math] and [math]\displaystyle{ B }[/math] are universal constants independent of [math]\displaystyle{ p }[/math].
Here the [math]\displaystyle{ x_n }[/math] are assumed to be non-negative and non-increasing, [math]\displaystyle{ x_1 \ge x_2 \ge \cdots \ge x_N \ge 0 }[/math].
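As an illustrative check (the vector and the exponent below are arbitrary, and since the text gives no numerical values for the universal constants [math]\displaystyle{ A }[/math] and [math]\displaystyle{ B }[/math], the sketch only computes the two quantities being compared):

```python
import math
from itertools import product

def moment(x, p):
    """Exact (E|sum_n eps_n x_n|^p)^(1/p) over all Rademacher sign patterns."""
    total = sum(abs(sum(e * xi for e, xi in zip(eps, x))) ** p
                for eps in product((1, -1), repeat=len(x)))
    return (total / 2 ** len(x)) ** (1 / p)

def hitczenko_bracket(x, p):
    """sum_{n<=b} x_n + sqrt(p) * (sum_{n>b} x_n^2)^(1/2) with b = floor(p);
    x must be non-negative and non-increasing."""
    b = math.floor(p)
    head = sum(x[:b])
    tail = math.sqrt(sum(xi ** 2 for xi in x[b:]))
    return head + math.sqrt(p) * tail

x = [3.0, 2.0, 1.0, 0.5]   # non-negative and non-increasing, as required
p = 3.0
m = moment(x, p)
br = hitczenko_bracket(x, p)
# Hitczenko's theorem: A * br <= m <= B * br for universal constants A, B.
print(m, br, m / br)
```

The point of the refinement is that the bracket captures the correct dependence on [math]\displaystyle{ p }[/math]: the largest [math]\displaystyle{ \lfloor p\rfloor }[/math] coefficients enter through their sum, and only the tail contributes the Gaussian-type [math]\displaystyle{ \sqrt{p}\,\ell^2 }[/math] term.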
See also
- Marcinkiewicz–Zygmund inequality
- Burkholder–Davis–Gundy inequality
References
- Thomas H. Wolff, "Lectures on Harmonic Analysis". American Mathematical Society, University Lecture Series vol. 29, 2003. ISBN:0-8218-3449-5
- Uffe Haagerup, "The best constants in the Khintchine inequality", Studia Math. 70 (1981), no. 3, 231–283 (1982).
- Fedor Nazarov and Anatoliy Podkorytov, "Ball, Haagerup, and distribution functions", Complex analysis, operators, and related topics, 247–267, Oper. Theory Adv. Appl., 113, Birkhäuser, Basel, 2000.
Original source: https://en.wikipedia.org/wiki/Khintchine inequality.