| Rademacher distribution | |
| --- | --- |
| Support | $\displaystyle{ k \in \{-1,1\}\, }$ |
| pmf | $\displaystyle{ f(k) = \left\{\begin{matrix} 1/2 & \mbox {if }k=-1, \\ 1/2 & \mbox {if }k=+1, \\ 0 & \mbox {otherwise.}\end{matrix}\right. }$ |
| CDF | $\displaystyle{ F(k) = \begin{cases} 0, & k \lt -1 \\ 1/2, & -1 \leq k \lt 1 \\ 1, & k \geq 1 \end{cases} }$ |
| Mean | $\displaystyle{ 0\, }$ |
| Median | $\displaystyle{ 0\, }$ |
| Mode | N/A |
| Variance | $\displaystyle{ 1\, }$ |
| Skewness | $\displaystyle{ 0\, }$ |
| Excess kurtosis | $\displaystyle{ -2\, }$ |
| Entropy | $\displaystyle{ \ln(2)\, }$ |
| MGF | $\displaystyle{ \cosh(t)\, }$ |
| CF | $\displaystyle{ \cos(t)\, }$ |

In probability theory and statistics, the Rademacher distribution (which is named after Hans Rademacher) is a discrete probability distribution where a random variate X has a 50% chance of being +1 and a 50% chance of being -1.

A series (that is, a sum) of Rademacher distributed variables can be regarded as a simple symmetrical random walk where the step size is 1.
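A minimal NumPy sketch of this (illustrative, not from the source): drawing i.i.d. Rademacher steps and accumulating them yields a simple symmetric random walk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Rademacher steps: +1 or -1, each with probability 1/2.
steps = rng.choice([-1, 1], size=1000)

# Partial sums of the steps give a simple symmetric random walk started at 0.
walk = np.cumsum(steps)
print(walk[-1])
```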

## Mathematical formulation

The probability mass function of this distribution is

$\displaystyle{ f(k) = \left\{\begin{matrix} 1/2 & \mbox {if }k=-1, \\ 1/2 & \mbox {if }k=+1, \\ 0 & \mbox {otherwise.}\end{matrix}\right. }$

In terms of the Dirac delta function, this can be written as


$\displaystyle{ f( k ) = \frac{ 1 }{ 2 } \left( \delta \left( k - 1 \right) + \delta \left( k + 1 \right) \right). }$
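A common way to sample from this distribution (an illustrative NumPy sketch, not part of the source) is to map Bernoulli(1/2) outcomes {0, 1} to {−1, +1}; the empirical frequencies should then match the pmf values f(−1) = f(+1) = 1/2.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Map Bernoulli(1/2) outcomes {0, 1} to {-1, +1}.
x = 2 * rng.integers(0, 2, size=n) - 1

# Empirical frequencies should each be close to f(-1) = f(+1) = 1/2.
p_minus = np.mean(x == -1)
p_plus = np.mean(x == 1)
print(p_minus, p_plus)
```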

## Bounds on sums of independent Rademacher variables

There are various results in probability theory around analyzing the sum of i.i.d. Rademacher variables, including concentration inequalities such as Bernstein inequalities as well as anti-concentration inequalities like Tomaszewski's conjecture.

### Concentration inequalities

Let {xi} be a set of independent random variables, each with the Rademacher distribution, and let {ai} be a sequence of real numbers. Then

$\displaystyle{ \Pr\left( \sum_i x_i a_i \gt t || a ||_2 \right) \le e^{ - \frac{ t^2 }{ 2 } } }$

where $\displaystyle{ \| a \|_2 }$ is the Euclidean norm of the sequence {ai}, t > 0 is a real number, and Pr(Z) is the probability of event Z.
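A quick Monte Carlo check of this tail bound (an illustrative NumPy sketch; the coefficient vector is an arbitrary choice, not from the source): the empirical tail frequency should stay below $e^{-t^2/2}$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 50, 100_000

a = rng.normal(size=n)                      # arbitrary fixed coefficients
norm_a = np.linalg.norm(a)

x = rng.choice([-1, 1], size=(trials, n))   # i.i.d. Rademacher signs
sums = x @ a

t = 2.0
empirical = np.mean(sums > t * norm_a)      # empirical tail frequency
bound = np.exp(-t ** 2 / 2)                 # e^{-t^2/2} from the inequality
print(empirical, bound)
```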

Let $\displaystyle{ Y = \sum_i a_i x_i }$ be an almost surely convergent series in a Banach space. Then for t > 0 and s ≥ 1 we have

$\displaystyle{ \Pr\left( || Y || \gt st \right) \le \left[ \frac{ 1 }{ c } \Pr( || Y || \gt t ) \right]^{ cs^2 } }$

for some constant c.

Let p be a positive real number. Then the Khintchine inequality says that

$\displaystyle{ c_1 \left[ \sum{ \left| a_i \right|^2 } \right]^\frac{ 1 }{ 2 } \le \left( E\left[ \left| \sum{ a_i x_i } \right|^p \right] \right)^{ \frac{ 1 }{ p } } \le c_2 \left[ \sum{ \left| a_i \right|^2 } \right]^\frac{ 1 }{ 2 } }$

where c1 and c2 are constants dependent only on p.

For p ≥ 1, $\displaystyle{ c_2 \le c_1 \sqrt{ p }. }$
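The two-sided bound can be probed numerically (an illustrative NumPy sketch, not from the source): for p = 4 the ratio of the p-th moment norm to the Euclidean norm of the coefficients should stay within constants that depend only on p (for p = 4 it lies between 1 and $3^{1/4} \approx 1.316$).

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 20, 100_000

a = rng.normal(size=n)                      # arbitrary coefficients
l2 = np.linalg.norm(a)

x = rng.choice([-1, 1], size=(trials, n))   # i.i.d. Rademacher signs
s = x @ a

p = 4
lhs = np.mean(np.abs(s) ** p) ** (1 / p)    # (E|sum a_i x_i|^p)^(1/p)
ratio = lhs / l2
print(ratio)
```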

### Tomaszewski’s conjecture

In 1986, Bogusław Tomaszewski posed a question about the distribution of the sum of independent Rademacher variables. A series of works on this question culminated in a 2020 proof by Nathan Keller and Ohad Klein of the following conjecture.

Conjecture. Let $\displaystyle{ X = \sum_{i=1}^n a_i X_i }$, where $\displaystyle{ a_1^2 + \cdots + a_n^2 = 1 }$ and the $\displaystyle{ X_i }$'s are independent Rademacher variables. Then

$\displaystyle{ \Pr[|X| \leq 1] \geq 1/2. }$

For example, when $\displaystyle{ a_1 = a_2 = \cdots = a_n = 1/\sqrt{n} }$, one gets the following bound, first shown by Van Zuijlen.

$\displaystyle{ \Pr \left( \left| \frac{ \sum_{ i = 1 }^n X_i } { \sqrt n } \right| \le 1 \right) \ge 0.5. }$

The bound is sharp, and it is better than the one that can be derived from the normal approximation (approximately Pr > 0.31).
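Van Zuijlen's bound is easy to check by simulation (an illustrative NumPy sketch, not from the source); with n = 101 the empirical probability comes out well above 0.5.

```python
import numpy as np

rng = np.random.default_rng(7)
n, trials = 101, 50_000

# Sum n i.i.d. Rademacher signs per trial, scaled by 1/sqrt(n).
x = rng.choice([-1, 1], size=(trials, n))
scaled = x.sum(axis=1) / np.sqrt(n)

p = np.mean(np.abs(scaled) <= 1)
print(p)
```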

## Applications

The Rademacher distribution has been used in bootstrapping.

The Rademacher distribution can be used to show that being normally distributed and uncorrelated does not imply being independent.

Random vectors with components sampled independently from the Rademacher distribution are useful for various stochastic approximations, for example:

• The Hutchinson trace estimator, which can be used to efficiently approximate the trace of a matrix of which the elements are not directly accessible, but rather implicitly defined via matrix-vector products.
• SPSA, a computationally cheap, derivative-free, stochastic gradient approximation, useful for numerical optimization.
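The Hutchinson estimator mentioned above can be sketched in a few lines (an illustrative NumPy version under simplifying assumptions; the example matrix and function name are hypothetical). It relies on the identity $E[z^\top A z] = \operatorname{tr}(A)$ when the entries of z are i.i.d. Rademacher.

```python
import numpy as np

def hutchinson_trace(matvec, dim, num_probes, rng):
    """Estimate tr(A) using only the matrix-vector product v -> A v.

    For a Rademacher probe z, E[z^T A z] = tr(A), since E[z_i z_j] = delta_ij.
    """
    total = 0.0
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=dim)   # Rademacher probe vector
        total += z @ matvec(z)
    return total / num_probes

# Example: pretend A is only accessible through products with it.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])                      # true trace = 5
est = hutchinson_trace(lambda v: A @ v, dim=2,
                       num_probes=2000, rng=np.random.default_rng(0))
print(est)
```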

Rademacher random variables are used in the Symmetrization Inequality.

## Related distributions

• Bernoulli distribution: If X has a Rademacher distribution, then $\displaystyle{ \frac{X+1}{2} }$ has a Bernoulli(1/2) distribution.
• Laplace distribution: If X has a Rademacher distribution and Y ~ Exp(λ) is independent from X, then XY ~ Laplace(0, 1/λ).
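Both relationships can be verified by simulation (an illustrative NumPy sketch, not from the source): the transformed Rademacher samples should have the Bernoulli(1/2) support and frequencies, and X·Y should match the mean and variance of Laplace(0, 1/λ), which are 0 and 2/λ².

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200_000

x = rng.choice([-1, 1], size=n)               # Rademacher
b = (x + 1) // 2                              # (X+1)/2: should be Bernoulli(1/2)

lam = 2.0
y = rng.exponential(scale=1 / lam, size=n)    # Y ~ Exp(lambda), independent of X
z = x * y                                     # X*Y: should be Laplace(0, 1/lambda)

# Laplace(0, b) has mean 0 and variance 2*b^2 = 2/lambda^2 = 0.5 here.
print(np.mean(b), np.mean(z), np.var(z))
```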