Averaging argument


In computational complexity theory and cryptography, an averaging argument is a standard argument for proving theorems. It typically allows one to convert probabilistic polynomial-time algorithms into non-uniform polynomial-size circuits.


Example: If every person likes at least 1/3 of the books in a library, then the library has a book that at least 1/3 of the people like.

Proof: Suppose there are [math]\displaystyle{ N }[/math] people and [math]\displaystyle{ B }[/math] books. Each person likes at least [math]\displaystyle{ B/3 }[/math] of the books. Let each person leave a mark on every book they like. Then there will be at least [math]\displaystyle{ M=(NB)/3 }[/math] marks. The averaging argument claims that there exists a book with at least [math]\displaystyle{ N/3 }[/math] marks on it. Assume, for contradiction, that no such book exists. Then every book has fewer than [math]\displaystyle{ N/3 }[/math] marks. However, since there are [math]\displaystyle{ B }[/math] books, the total number of marks will be fewer than [math]\displaystyle{ (NB)/3 }[/math], contradicting the fact that there are at least [math]\displaystyle{ M }[/math] marks. [math]\displaystyle{ \scriptstyle\blacksquare }[/math]
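The counting in this proof can be checked numerically. The sketch below generates a random "likes" relation in which every person likes at least a third of the books, counts the marks per book, and verifies that the most-marked book carries at least [math]\displaystyle{ N/3 }[/math] marks (the parameters and random instance are illustrative, not from the article):

```python
import random

def check_library_claim(n_people=30, n_books=12, seed=0):
    """Generate 'likes' so that each person likes at least a third
    of the books, then verify some book is liked by at least a
    third of the people."""
    rng = random.Random(seed)
    min_liked = -(-n_books // 3)  # ceil(n_books / 3)
    marks = [0] * n_books         # marks[b] = people who like book b
    for _ in range(n_people):
        liked = rng.sample(range(n_books), rng.randint(min_liked, n_books))
        for b in liked:
            marks[b] += 1
    # Total marks >= n_people * n_books / 3, so the maximum over
    # n_books books is >= n_people / 3 -- the averaging argument.
    return max(marks) >= n_people / 3

print(check_library_claim())  # True for every instance, by the argument
```

No matter which random instance is drawn, the function returns True, because the conclusion is forced by the hypothesis rather than by chance.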

Formal definition of the averaging argument

Let X and Y be sets, let p be a predicate on X × Y, and let f be a real number in the interval [0, 1]. If, for each x in X, at least f |Y| of the elements y in Y satisfy p(x, y), then there exists a y in Y such that at least f |X| of the elements x in X satisfy p(x, y).
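On small finite sets this statement can be verified exhaustively. A minimal sketch, where the sets, the predicate, and the fraction f are illustrative choices rather than anything from the article:

```python
from fractions import Fraction

def averaging_holds(X, Y, p, f):
    """Check the averaging claim on finite X, Y for predicate p
    and fraction f: if every x satisfies p(x, y) for at least
    f*|Y| values of y, then some y is satisfied by at least
    f*|X| values of x."""
    X, Y = list(X), list(Y)
    hypothesis = all(sum(p(x, y) for y in Y) >= f * len(Y) for x in X)
    if not hypothesis:
        return True  # implication is vacuously true
    return any(sum(p(x, y) for x in X) >= f * len(X) for y in Y)

# Illustrative instance: p(x, y) = "x + y is even" with f = 1/2.
X, Y = range(6), range(6)
print(averaging_holds(X, Y, lambda x, y: (x + y) % 2 == 0, Fraction(1, 2)))
```

Here each x satisfies the predicate for exactly half of the y values, so the hypothesis holds with f = 1/2, and the check confirms that some y (in fact every y) is satisfied by half of the x values.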

An alternative formulation uses the terminology of probability theory.[1]

Let [math]\displaystyle{ f }[/math] be some function. The averaging argument is the following claim: if we have a circuit [math]\displaystyle{ C }[/math] such that [math]\displaystyle{ C(x, y) = f(x) }[/math] with probability at least [math]\displaystyle{ \rho }[/math], where [math]\displaystyle{ x }[/math] is chosen at random and [math]\displaystyle{ y }[/math] is chosen independently from some distribution [math]\displaystyle{ Y }[/math] over [math]\displaystyle{ \{0, 1\}^m }[/math] (which might not even be efficiently sampleable) then there exists a single string [math]\displaystyle{ y_0 \in \{0, 1\}^m }[/math] such that [math]\displaystyle{ \Pr_x[C(x, y_0) = f(x)] \ge \rho }[/math].

Indeed, for every [math]\displaystyle{ y }[/math] define [math]\displaystyle{ p_y }[/math] to be [math]\displaystyle{ \Pr_x[C(x, y) = f(x)] }[/math]. Then

[math]\displaystyle{ \Pr_{x,y}[C(x, y) = f(x)] = E_y[p_y] \, }[/math]

and the claim reduces to the fact that for every random variable [math]\displaystyle{ Z }[/math], if [math]\displaystyle{ E[Z] \ge \rho }[/math] then [math]\displaystyle{ \Pr[Z \ge \rho] \gt 0 }[/math] (this holds because [math]\displaystyle{ E[Z] }[/math] is a weighted average of the values of [math]\displaystyle{ Z }[/math], and if the average of some values is at least [math]\displaystyle{ \rho }[/math], then at least one of the values must be at least [math]\displaystyle{ \rho }[/math]).
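The derivation above can be traced on a toy instance: compute [math]\displaystyle{ p_y }[/math] for every advice string, average to get [math]\displaystyle{ \rho }[/math], and extract the best fixed [math]\displaystyle{ y_0 }[/math]. In this sketch the "circuit" C and the target f are made up for illustration (C succeeds when its advice string happens to equal the input, and guesses 0 otherwise):

```python
import itertools

def f(x):
    # Target function: parity of a 3-bit string.
    return sum(x) % 2

def C(x, y):
    # Toy "circuit": correct when the advice y equals the input x,
    # otherwise it outputs a fixed guess of 0.
    return f(x) if y == x else 0

xs = list(itertools.product([0, 1], repeat=3))  # uniform inputs x
ys = xs                                         # uniform advice strings y

# p_y = Pr_x[C(x, y) = f(x)] for each fixed y.
p = {y: sum(C(x, y) == f(x) for x in xs) / len(xs) for y in ys}

# rho = Pr_{x,y}[C(x, y) = f(x)] = E_y[p_y].
rho = sum(p.values()) / len(ys)

# The fixed string promised by the averaging argument: any maximizer
# of p_y works, since the maximum is at least the average.
y0 = max(p, key=p.get)
print(rho, p[y0])  # p[y0] >= rho, as the argument guarantees
```

The maximizing [math]\displaystyle{ y_0 }[/math] always satisfies [math]\displaystyle{ p_{y_0} \ge \rho }[/math], which is exactly the conclusion of the argument.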


This argument has wide use in complexity theory (e.g. proving [math]\displaystyle{ \mathsf{BPP}\subseteq\mathsf{P/poly} }[/math]) and cryptography (e.g. proving that indistinguishability of encryptions implies semantic security). A plethora of such applications can be found in Goldreich's books.[2][3][4]


  1. Barak, Boaz (March 2006). "Note on the averaging and hybrid arguments and prediction vs. distinguishing". Princeton University. http://www.cs.princeton.edu/courses/archive/spr06/cos522/averaging.pdf. 
  2. Oded Goldreich, Foundations of Cryptography, Volume 1: Basic Tools, Cambridge University Press, 2001, ISBN 0-521-79172-3
  3. Oded Goldreich, Foundations of Cryptography, Volume 2: Basic Applications, Cambridge University Press, 2004, ISBN 0-521-83084-2
  4. Oded Goldreich, Computational Complexity: A Conceptual Perspective, Cambridge University Press, 2008, ISBN 0-521-88473-X