BRS-inequality

BRS-inequality is the short name for the Bruss-Robertson-Steele inequality. This inequality gives a convenient upper bound for the expected maximum number of non-negative random variables one can sum up without exceeding a given upper bound [math]\displaystyle{ s \gt 0 }[/math].

For example, suppose 100 random variables [math]\displaystyle{ X_1, X_2,..., X_{100} }[/math] are all uniformly distributed on [math]\displaystyle{ [0, 1] }[/math], not necessarily independent, and let [math]\displaystyle{ s= 10 }[/math], say. Let [math]\displaystyle{ N[n, s] := N[100, 10] }[/math] be the maximum number of [math]\displaystyle{ X_j }[/math] one can select in [math]\displaystyle{ \{X_1, X_2,..., X_{100}\} }[/math] such that their sum does not exceed [math]\displaystyle{ s= 10 }[/math]. [math]\displaystyle{ N[100, 10] }[/math] is a random variable, so what can one say about bounds for its expectation? How would an upper bound for [math]\displaystyle{ E(N[n, s]) }[/math] behave, if one changes the size [math]\displaystyle{ n }[/math] of the sample and keeps [math]\displaystyle{ s }[/math] fixed, or alternatively, if one keeps [math]\displaystyle{ n }[/math] fixed but varies [math]\displaystyle{ s }[/math]? What can one say about [math]\displaystyle{ E(N[n, s]) }[/math], if the uniform distribution is replaced by another continuous distribution? In all generality, what can one say if each [math]\displaystyle{ X_k }[/math] may have its own continuous distribution function [math]\displaystyle{ F_k }[/math]?

General problem

Let [math]\displaystyle{ X_1, X_2,... }[/math] be a sequence of non-negative random variables (possibly dependent) that are jointly continuously distributed. For [math]\displaystyle{ n \in \{1, 2,... \} }[/math] and [math]\displaystyle{ s\in \mathbb{R}^+ }[/math] let [math]\displaystyle{ N[n, s] }[/math] be the maximum number of observations among [math]\displaystyle{ X_1, X_2, ..., X_n }[/math] that one can sum up without exceeding [math]\displaystyle{ s }[/math].

Now, to obtain [math]\displaystyle{ N[n, s] }[/math] one may look at the list of all observations, first select the smallest one, then add the second smallest, then the third, and so on, as long as the accumulated sum does not exceed [math]\displaystyle{ s }[/math]. Hence [math]\displaystyle{ N[n, s] }[/math] can be defined in terms of the increasing order statistics of [math]\displaystyle{ X_1, X_2,\cdots, X_n }[/math], denoted by [math]\displaystyle{ X_{1,n} \le X_{2,n} \le \cdots \le X_{n,n} }[/math], namely by

[math]\displaystyle{ N[n, s] = \begin{cases} 0 & {\rm ~if~}~ X_{1,n} \gt s,\\ \max\{k \in \mathbb{N} :~ X_{1,n} + X_{2,n}+ \cdots + X_{k,n} \le s\} & {\rm ~otherwise}. \end{cases} }[/math] (1)
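
The greedy construction behind (1) is easy to make concrete. Here is a minimal Python sketch (the function name brs_count and all parameter choices are ours, for illustration only) that computes [math]\displaystyle{ N[n, s] }[/math] for a given sample by exactly this sorting argument:

```python
import numpy as np

def brs_count(x, s):
    """Greedy computation of N[n, s] as in (1): sort the observations
    increasingly and add them up while the running sum stays <= s."""
    total, count = 0.0, 0
    for v in np.sort(np.asarray(x, dtype=float)):
        if total + v > s:
            break
        total += v
        count += 1
    return count

# Example from the introduction: 100 uniform observations, budget s = 10.
rng = np.random.default_rng(0)
print(brs_count(rng.uniform(size=100), 10.0))
```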

What is the best possible general upper bound for [math]\displaystyle{ E(N[n, s]) }[/math] if one requires only the continuity of the joint distribution of all variables? And then, how to compute this bound?

Identically distributed random variables

Theorem 1. Let [math]\displaystyle{ X_1, X_2, \cdots, X_n }[/math] be identically distributed non-negative random variables with absolutely continuous distribution function [math]\displaystyle{ F }[/math]. Then

[math]\displaystyle{ E(N[n, s])\le n F(t), }[/math] (2)

where [math]\displaystyle{ t := t(n, s) }[/math] solves the so-called BRS-equation

[math]\displaystyle{ n \int_0^t x \,dF(x)\, =\, s }[/math]. (3)

As an example, here are the answers to the questions posed at the beginning. Let all [math]\displaystyle{ X_1, X_2,\cdots, X_n }[/math] be uniformly distributed on [math]\displaystyle{ [0, 1] }[/math]. Then [math]\displaystyle{ F(t) = t }[/math] on [math]\displaystyle{ [0, 1] }[/math], and hence [math]\displaystyle{ dF(x)/dx = 1 }[/math] on [math]\displaystyle{ [0, 1] }[/math]. The BRS-equation becomes

[math]\displaystyle{ n \int_0^t x dx = n t^2/2 = s. }[/math]

The solution is [math]\displaystyle{ t =\sqrt{2s/n} }[/math], and thus from the inequality (2)

[math]\displaystyle{ E(N[n, s]) \le n\,F(t) = n \sqrt{2s/n }= \sqrt{2sn} }[/math]. (4)

Since one always has [math]\displaystyle{ N[n, s] \le n }[/math], this bound becomes trivial for [math]\displaystyle{ s \ge nE(X) = n/2 }[/math].

For the example questions with [math]\displaystyle{ n=100, s=10 }[/math] this yields [math]\displaystyle{ E(N[100, 10]) \le \sqrt{2000} \approx 44.7 }[/math]. As one sees from (4), doubling the sample size [math]\displaystyle{ n }[/math] while keeping [math]\displaystyle{ s }[/math] fixed has, in the non-trivial case, the same effect on the upper bound for the uniform distribution as doubling [math]\displaystyle{ s }[/math] while keeping [math]\displaystyle{ n }[/math] fixed: both multiply the bound [math]\displaystyle{ \sqrt{2sn} }[/math] by [math]\displaystyle{ \sqrt{2} }[/math].
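
To see how close the bound (4) is for independent uniform variables, one can estimate [math]\displaystyle{ E(N[100, 10]) }[/math] by Monte Carlo simulation. The following Python sketch (the sample size, budget and number of trials are our illustrative choices) compares the empirical mean with [math]\displaystyle{ \sqrt{2000} }[/math]:

```python
import numpy as np

# Monte Carlo estimate of E(N[100, 10]) for i.i.d. uniform variables,
# compared with the BRS bound sqrt(2*s*n) = sqrt(2000) from (4).
rng = np.random.default_rng(1)
n, s, trials = 100, 10.0, 10_000

counts = []
for _ in range(trials):
    c = np.cumsum(np.sort(rng.uniform(size=n)))
    counts.append(np.searchsorted(c, s, side="right"))  # N[n, s]

print(f"estimated E(N[{n}, {s:g}]) ~ {np.mean(counts):.2f}")
print(f"BRS bound sqrt(2*s*n) = {np.sqrt(2 * s * n):.2f}")
```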

Generalised BRS-inequality

Theorem 2. Let [math]\displaystyle{ X_1, X_2,\cdots, X_n }[/math] be positive random variables that are jointly distributed such that [math]\displaystyle{ X_k }[/math] has an absolutely continuous distribution function [math]\displaystyle{ F_k, ~k=1, 2, \cdots, n }[/math]. If [math]\displaystyle{ N[n, s] }[/math] is defined as before, then

[math]\displaystyle{ E (N[n, s])\le \sum_{k=1}^n F_k(t) }[/math], (5)

where [math]\displaystyle{ t := t(n, s) }[/math] is the unique solution of the BRS-equation

[math]\displaystyle{ \sum_{k=1}^n \int_0^t \,x\,dF_k(x) = s. }[/math] (6)

Clearly, if all random variables [math]\displaystyle{ X_i, i=1, 2, \cdots, n }[/math] have the same marginal distribution [math]\displaystyle{ F }[/math], then (6) recaptures (3), and (5) recaptures (2). Again it should be pointed out that no independence hypothesis whatsoever is needed.
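
When the marginals [math]\displaystyle{ F_k }[/math] are not all equal, the BRS-equation (6) typically has no closed-form solution, but its left-hand side is continuous and increasing in [math]\displaystyle{ t }[/math], so it can be solved by standard numerical root finding. The following Python sketch (the function brs_bound and the bracket t_max are our assumptions, not part of the literature) uses SciPy's quad and brentq, and checks itself against the uniform example above:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

def brs_bound(cdfs, pdfs, s, t_max):
    """Solve the BRS-equation (6) numerically and return t(n, s) together
    with the bound (5). The bracket [0, t_max] must contain the root."""
    def lhs(t):
        return sum(quad(lambda x, f=f: x * f(x), 0.0, t)[0] for f in pdfs) - s

    t = brentq(lhs, 0.0, t_max)
    return t, sum(F(t) for F in cdfs)

# Sanity check against the uniform example: expect t = sqrt(2s/n) ~ 0.447
# and the bound sqrt(2*s*n) ~ 44.72.
n, s = 100, 10.0
F = lambda t: min(t, 1.0)               # uniform [0, 1] cdf
f = lambda x: 1.0 if x <= 1.0 else 0.0  # uniform [0, 1] pdf
t, bound = brs_bound([F] * n, [f] * n, s, t_max=1.0)
print(t, bound)
```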

Refinements of the BRS-inequality

Depending on the type of the distributions [math]\displaystyle{ F_k }[/math], refinements of Theorem 2 can be of genuine interest. We mention one of them here.

Let [math]\displaystyle{ A[n, s] }[/math] be the random set of those variables one can sum up to yield the maximum random number [math]\displaystyle{ N[n, s] }[/math], that is,

[math]\displaystyle{ \#A[n, s] = N[n, s] }[/math],

and let [math]\displaystyle{ S_{A[n,s]} }[/math] denote the sum of these variables. The so-called residual [math]\displaystyle{ s-S_{A[n,s]} }[/math] is by definition always non-negative, and one has:

Theorem 3. Let [math]\displaystyle{ X_1, X_2,\cdots, X_n }[/math] be jointly continuously distributed with marginal distribution functions [math]\displaystyle{ F_k, k=1, 2, \cdots,n }[/math], and let [math]\displaystyle{ t := t(n, s) }[/math] be the solution of (6). Then

[math]\displaystyle{ E(N[n, s])\le \left( \sum_{k=1}^n F_k(t(n, s))\right)-\frac{s-E(S_{A[n,s]})}{t(n,s)} }[/math]. (7)

The improvement in (7) compared with (5) therefore consists of

[math]\displaystyle{ \frac{s-E(S_{A[n,s]})}{t(n,s)} }[/math].

The expected residual in the numerator is typically difficult to compute or estimate, but there exist nice exceptions. For example, if all [math]\displaystyle{ X_k }[/math] are independent exponential random variables, then the memoryless property implies (if [math]\displaystyle{ s }[/math] is exceeded) that the residual below [math]\displaystyle{ s }[/math] and the overshoot over [math]\displaystyle{ s }[/math] are equal in distribution. For fixed [math]\displaystyle{ s }[/math] one can then show that

[math]\displaystyle{ \frac{s-E(S_{A[n,s]})}{t(n,s)} \to \frac{1}{2} {\rm~as~} n \to \infty. }[/math]

The improvement term fluctuates around [math]\displaystyle{ 1/2 }[/math], and simulations indicate that the convergence to [math]\displaystyle{ 1/2 }[/math] is quick.
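
As a hedged illustration of this limit, one can solve the BRS-equation for i.i.d. Exp(1) variables, for which (6) reads [math]\displaystyle{ n(1 - e^{-t}(1+t)) = s }[/math], and estimate the improvement term by simulation. A minimal Python sketch (sample size, budget and number of trials are our illustrative choices):

```python
import numpy as np
from scipy.optimize import brentq

# Simulation of the improvement term (s - E(S_A))/t in (7) for n i.i.d.
# Exp(1) variables; by the discussion above it should be close to 1/2.
rng = np.random.default_rng(2)
n, s, trials = 200, 20.0, 20_000

# BRS-equation (6) for Exp(1): n * (1 - exp(-t) * (1 + t)) = s.
t = brentq(lambda u: n * (1.0 - np.exp(-u) * (1.0 + u)) - s, 1e-9, 50.0)

residuals = []
for _ in range(trials):
    c = np.cumsum(np.sort(rng.exponential(size=n)))
    k = np.searchsorted(c, s, side="right")  # k = N[n, s]
    residuals.append(s - (c[k - 1] if k > 0 else 0.0))

print(f"(s - E(S_A))/t ~ {np.mean(residuals) / t:.3f}")
```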

Source

The first version of the BRS-inequality (Theorem 1) was proved in Lemma 4.1 of F. Thomas Bruss and James B. Robertson (1991). This paper moreover proves that the upper bound is asymptotically tight if the random variables are independent of each other. The generalisation to arbitrary continuous distributions (Theorem 2) is due to J. Michael Steele (2016). Theorem 3 and other refinements of the BRS-inequality are more recent and were proved in Bruss (2021).

Applications

The BRS-inequality is a versatile tool since it applies to many types of problems, and since solving the BRS-equation is often not very involved. In particular, one notes that the maximum number [math]\displaystyle{ N[n, s] }[/math] always dominates the maximum number of selections under any additional constraint, such as, e.g., online selections without recall. Examples studied in Steele (2016) and Bruss (2021) touch many applications, including comparisons between i.i.d. sequences and non-i.i.d. sequences, problems of condensing point processes, “awkward” processes, selection algorithms, knapsack problems, Borel-Cantelli-type problems, the Bruss-Duerinckx theorem, and online tiling strategies.

References

Bruss F. T. and Robertson J. B. (1991), 'Wald's Lemma' for Sums of Order Statistics of i.i.d. Random Variables, Adv. Appl. Probab., Vol. 23, 612-623.

Bruss F. T. and Duerinckx M. (2015), Resource dependent branching processes and the envelope of societies, Ann. Appl. Probab., Vol. 25 (1), 324-372.

Steele J. M. (2016), The Bruss-Robertson Inequality: Elaborations, Extensions, and Applications, Math. Applicanda, Vol. 44, No. 1, 3-16.

Bruss F. T. (2021), The BRS-inequality and its applications, Probab. Surveys, Vol. 18, 44-76.