Chernoff bounds

Herman Chernoff described a general method for deriving exponential bounds on the tail probabilities of sums of random variables. Bounds derived using his method are now known as Chernoff bounds.
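
For context, the generic method (a standard derivation, sketched here as background rather than taken from this article) applies Markov's inequality to the moment generating function: for any t > 0,

\Pr\left[ X \geq a \right] = \Pr\left[ e^{tX} \geq e^{ta} \right] \leq e^{-ta} E\left[ e^{tX} \right],

and the resulting bound is then minimized over t.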

The following theorem is due to Wassily Hoeffding.

Theorem

Let X_1, X_2, \ldots, X_n be i.i.d. random variables taking values in [0,1], let p = E\left[ X_i \right], and let \varepsilon > 0. Then

\Pr\left[ \frac 1 n \sum X_i \geq p + \varepsilon \right] \leq \left ( {\left (\frac{p}{p + \varepsilon}\right )}^{p+\varepsilon} {\left (\frac{1 - p}{1 -p - \varepsilon}\right )}^{1 - p- \varepsilon}\right ) ^n =  e^{ - D(p+\varepsilon\|p) n}

and

\Pr\left[ \frac 1 n \sum X_i \leq p - \varepsilon \right] \leq \left ( {\left (\frac{p}{p - \varepsilon}\right )}^{p-\varepsilon} {\left (\frac{1 - p}{1 -p + \varepsilon}\right )}^{1 - p+ \varepsilon}\right ) ^n = e^{ - D(p-\varepsilon\|p) n},

where

D(x\|y) = x \log \frac{x}{y} + (1-x) \log \frac{1-x}{1-y}

is the Kullback–Leibler divergence between Bernoulli random variables with parameters x and y, respectively.
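
As a concrete numerical check (a minimal Python sketch; the helper names kl_bernoulli and chernoff_upper_tail and all parameter values are illustrative, not from the article), the first inequality can be evaluated and compared against a Monte Carlo estimate:

  import math
  import random

  def kl_bernoulli(x, y):
      # D(x || y) between Bernoulli(x) and Bernoulli(y), as defined above.
      return x * math.log(x / y) + (1 - x) * math.log((1 - x) / (1 - y))

  def chernoff_upper_tail(n, p, eps):
      # Bound on Pr[(1/n) * sum X_i >= p + eps] from the theorem above.
      return math.exp(-n * kl_bernoulli(p + eps, p))

  # Illustrative parameters.
  n, p, eps = 500, 0.3, 0.05
  print("bound:", chernoff_upper_tail(n, p, eps))

  # Monte Carlo estimate of the actual tail probability for Bernoulli(p) samples.
  trials = 2000
  hits = sum(
      sum(random.random() < p for _ in range(n)) / n >= p + eps
      for _ in range(trials)
  )
  print("empirical:", hits / trials)

For these parameters the bound evaluates to roughly 0.06, while the empirical frequency comes out smaller, as the theorem guarantees.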

A simpler bound follows by relaxing the theorem using D( p + x \| p) \geq 2 x^2, which gives \Pr\left[ \frac 1 n \sum X_i \geq p + \varepsilon \right] \leq e^{-2 \varepsilon^2 n}, a special case of Hoeffding's inequality. Sometimes the bound D( (1+x) p \| p) \geq x^2 p/4 for -1/2 \leq x \leq 1/2, which is stronger for p < 1/8, is also used.
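
To see how much the relaxations give up (a small sketch with illustrative values; kl_bernoulli is the same hypothetical helper as above), one can compare the three exponents directly:

  import math

  def kl_bernoulli(x, y):
      return x * math.log(x / y) + (1 - x) * math.log((1 - x) / (1 - y))

  p, eps = 0.05, 0.02              # p < 1/8; relative deviation x = eps / p = 0.4
  print(kl_bernoulli(p + eps, p))  # exact exponent, about 3.8e-3
  print(2 * eps ** 2)              # additive relaxation, 8.0e-4
  print((eps / p) ** 2 * p / 4)    # multiplicative relaxation, 2.0e-3

Here the multiplicative relaxation retains more of the exact exponent than the additive one, illustrating why it is preferred for small p.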

Rudolf Ahlswede and Andreas Winter introduced a Chernoff bound for matrix-valued random variables.

References

  • Herman Chernoff, "A measure of asymptotic efficiency for tests of a hypothesis based on the sum of observations", Annals of Mathematical Statistics, vol. 23, pp. 493–507, 1952.
  • Wassily Hoeffding, "Probability inequalities for sums of bounded random variables", Journal of the American Statistical Association, vol. 58, no. 301, pp. 13–30, March 1963.
  • Rudolf Ahlswede and Andreas Winter, "Strong Converse for Identification via Quantum Channels", http://www.arxiv.org/abs/quant-ph/0012127