Eaton's inequality
In probability theory, Eaton's inequality is a bound on the tail probabilities of a linear combination of bounded random variables. It was described in 1974 by Morris L. Eaton.[1]
Statement of the inequality
Let {Xi} be a set of n independent real random variables, each with an expected value of zero and bounded in absolute value by 1 ( |Xi| ≤ 1, for 1 ≤ i ≤ n). The variates do not have to be identically or symmetrically distributed. Let {ai} be a set of n fixed real numbers with
- [math]\displaystyle{ \sum_{ i = 1 }^n a_i^2 = 1 . }[/math]
Eaton showed that
- [math]\displaystyle{ P\left( \left| \sum_{ i = 1 }^n a_i X_i \right| \ge k \right) \le 2 \inf_{ 0 \le c \le k } \int_c^\infty \left( \frac{ z - c }{ k - c } \right)^3 \phi( z ) \, dz = 2 B_E( k ) , }[/math]
where φ(x) is the probability density function of the standard normal distribution.
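The bound B_E(k) can be evaluated numerically. The following Python sketch (illustrative only; the helper name eaton_bound and the choice of SciPy routines are assumptions of this example, not part of the source) approximates the infimum over c by quadrature and one-dimensional minimisation:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def eaton_bound(k: float) -> float:
    """Approximate B_E(k); 2*B_E(k) bounds P(|sum a_i X_i| >= k)."""
    def inner(c: float) -> float:
        integrand = lambda z: ((z - c) / (k - c)) ** 3 * norm.pdf(z)
        value, _ = quad(integrand, c, np.inf)
        return value
    # Minimise over c in [0, k); stay strictly below k to avoid dividing by zero.
    result = minimize_scalar(inner, bounds=(0.0, k - 1e-9), method="bounded")
    return result.fun

if __name__ == "__main__":
    for k in (1.5, 2.0, 3.0):
        print(f"k = {k}: tail bound 2*B_E(k) ≈ {2 * eaton_bound(k):.5f}")
```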
A related bound is Edelman's[citation needed]
- [math]\displaystyle{ P\left( \left| \sum_{ i = 1 }^n a_i X_i \right| \ge k \right) \le 2 \left( 1 - \Phi\left[ k - \frac{ 1.5 }{ k } \right] \right) = 2 B_{ Ed }( k ) , }[/math]
where Φ(x) is the cumulative distribution function of the standard normal distribution.
Pinelis has shown that Eaton's bound can be sharpened:[2]
- [math]\displaystyle{ B_{ EP }( k ) = \min\{ 1, k^{ -2 }, 2 B_E( k ) \} }[/math]
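For comparison, Edelman's bound and the sharpened Pinelis bound can be computed in the same way. The sketch below is again illustrative and assumes the eaton_bound helper from the previous example is in scope:

```python
from scipy.stats import norm

def edelman_bound(k: float) -> float:
    """Edelman's tail bound 2*(1 - Phi(k - 1.5/k))."""
    return 2.0 * (1.0 - norm.cdf(k - 1.5 / k))

def pinelis_bound(k: float) -> float:
    """Pinelis's sharpened bound B_EP(k) = min{1, k^-2, 2*B_E(k)}."""
    return min(1.0, k ** -2.0, 2.0 * eaton_bound(k))   # eaton_bound as defined above

for k in (1.5, 2.0, 3.0):
    print(f"k = {k}: Edelman ≈ {edelman_bound(k):.5f}, Pinelis ≈ {pinelis_bound(k):.5f}")
```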
A set of critical values for Eaton's bound has been determined.[3]
Related inequalities
Let {ai} be a set of n independent Rademacher random variables, that is, P( ai = 1 ) = P( ai = −1 ) = 1/2. Let Z be a standard normal variate, with mean 0 and variance 1. Let {bi} be a set of n fixed real numbers such that
- [math]\displaystyle{ \sum_{ i = 1 }^n b_i^2 = 1 . }[/math]
This last condition is required by the Riesz–Fischer theorem, which states that
- [math]\displaystyle{ a_1 b_1 + \cdots + a_n b_n }[/math]
will converge if and only if
- [math]\displaystyle{ \sum_{ i = 1 }^n b_i^2 }[/math]
is finite.
Then
- [math]\displaystyle{ E f( a_1 b_1 + \cdots + a_n b_n ) \le E f( Z ) }[/math]
for f(x) = | x |p. The case p ≥ 3 was proved by Whittle[4] and the case p ≥ 2 by Haagerup.[5]
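A simple Monte Carlo experiment can illustrate this moment comparison; the weights, exponent, sample size and seed below are arbitrary choices made for the illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, trials = 10, 3.0, 200_000

b = rng.normal(size=n)
b /= np.linalg.norm(b)                               # enforce sum(b_i^2) = 1

signs = rng.choice([-1.0, 1.0], size=(trials, n))    # Rademacher variables a_i
s = signs @ b                                        # weighted sums a_1 b_1 + ... + a_n b_n
z = rng.standard_normal(trials)                      # sample from Z

print("E|S_n|^p ≈", np.mean(np.abs(s) ** p))
print("E|Z|^p   ≈", np.mean(np.abs(z) ** p))         # should be the larger of the two
```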
This comparison also holds for f(x) = eλx with λ ≥ 0; combined with Markov's inequality it gives, for x ≥ 0,
- [math]\displaystyle{ P\left( a_1 b_1 + \cdots + a_n b_n \ge x \right) \le \inf_{ \lambda \ge 0 } \left[ \frac{ E ( e^{ \lambda Z } ) }{ e^{ \lambda x } } \right] = e^{ -x^2 / 2 } , }[/math]
where the infimum is attained at λ = x.[6]
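Since E(e^{λZ}) = e^{λ²/2} and the moment generating function of a weighted Rademacher sum is a product of hyperbolic cosines, both the exponential-moment comparison and the value of the infimum can be checked directly; the weights in the sketch below are an arbitrary illustrative choice:

```python
import numpy as np

b = np.array([0.6, 0.8])                    # two weights with sum of squares 1
for lam in (0.5, 1.0, 2.0):
    mgf_sum = np.prod(np.cosh(lam * b))     # E[e^{lam * (a_1 b_1 + a_2 b_2)}] for Rademacher signs
    mgf_normal = np.exp(lam ** 2 / 2)       # E[e^{lam * Z}] for standard normal Z
    assert mgf_sum <= mgf_normal            # the exponential-moment comparison

x = 2.0
lams = np.linspace(0.0, 10.0, 100_001)
chernoff = np.min(np.exp(lams ** 2 / 2 - lams * x))   # inf over lam of E[e^{lam Z}]/e^{lam x}
print(chernoff, np.exp(-x ** 2 / 2))                  # both ≈ e^{-2} ≈ 0.1353
```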
Let
- [math]\displaystyle{ S_n = a_1 b_1 + \cdots + a_n b_n }[/math]
Then[7]
- [math]\displaystyle{ P( S_n \ge x ) \le \frac{ 2e^3 }{ 9 } P( Z \ge x ) }[/math]
The constant in the last inequality is approximately 4.4634.
An alternative bound is also known:[8]
- [math]\displaystyle{ P( S_n \ge x ) \le e^{ -x^2 / 2 } }[/math]
This last bound is related to Hoeffding's inequality.
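Both tail bounds for Sn can be compared with a Monte Carlo estimate; the values of n, x and the number of trials below are arbitrary illustrative choices:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n, trials, x = 20, 500_000, 1.5

b = np.full(n, n ** -0.5)                              # uniform unit-norm weights
s = rng.choice([-1.0, 1.0], size=(trials, n)) @ b      # Monte Carlo sample of S_n

print("2e^3/9             ≈", 2 * np.e ** 3 / 9)       # ≈ 4.4634
print("P(S_n >= x)        ≈", np.mean(s >= x))
print("(2e^3/9) P(Z >= x) ≈", 2 * np.e ** 3 / 9 * norm.sf(x))
print("exp(-x^2/2)        =", np.exp(-x ** 2 / 2))
```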
In the uniform case where all the bi = n−1/2, the maximum value of Sn is n1/2. In this case van Zuijlen has shown that[9]
- [math]\displaystyle{ P\left( \left| S_n \right| \le 1 \right) \ge \frac{ 1 }{ 2 } , }[/math]
that is, Sn lies within one standard deviation of its mean (here μ = 0 and σ = 1) with probability at least one half.
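In the uniform case the distribution of Sn is a rescaled symmetric binomial, so this probability can be computed exactly; the sketch below checks the bound for a few values of n:

```python
import numpy as np
from scipy.stats import binom

for n in (1, 2, 5, 10, 25, 100):
    k = np.arange(n + 1)                      # number of +1 signs
    s = (2 * k - n) / np.sqrt(n)              # corresponding values of S_n
    prob = binom.pmf(k, n, 0.5)[np.abs(s) <= 1].sum()
    print(n, prob)                            # every printed probability is >= 0.5
```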
References
- ↑ Eaton, Morris L. (1974) "A probability inequality for linear combinations of bounded random variables", Annals of Statistics, 2(3), 609–614
- ↑ Pinelis, I. (1994) "Extremal probabilistic problems and Hotelling's T2 test under a symmetry condition", Annals of Statistics, 22(1), 357–368
- ↑ Dufour, J.-M.; Hallin, M. (1993) "Improved Eaton bounds for linear combinations of bounded random variables, with statistical applications", Journal of the American Statistical Association, 88(423), 1026–1033
- ↑ Whittle, P. (1960) "Bounds for the moments of linear and quadratic forms in independent variables", Teor. Verojatnost. i Primenen., 5, 331–335 MR0133849
- ↑ Haagerup, U. (1982) "The best constants in the Khintchine inequality", Studia Mathematica, 70, 231–283 MR0654838
- ↑ Hoeffding, W. (1963) "Probability inequalities for sums of bounded random variables", Journal of the American Statistical Association, 58(301), 13–30 MR0144363
- ↑ Pinelis, I. (1994) "Optimum bounds for the distributions of martingales in Banach spaces", Annals of Probability, 22(4), 1679–1706
- ↑ de la Peña, V. H.; Lai, T. L.; Shao, Q.-M. (2009) Self-Normalized Processes, Springer-Verlag, New York
- ↑ van Zuijlen, Martien C. A. (2011) "On a conjecture concerning the sum of independent Rademacher random variables", https://arxiv.org/abs/1112.4988
Original source: https://en.wikipedia.org/wiki/Eaton's inequality.