Chernoff bounds
Herman Chernoff described a method for deriving exponential bounds on the tail probabilities of sums of independent random variables. All bounds derived using this method are now known as Chernoff bounds.
The following theorem is due to Wassily Hoeffding.
Theorem
Assume random variables $X_1, X_2, \dots, X_n$ are i.i.d. Let $p = \mathbb{E}[X_i]$, $X_i \in \{0, 1\}$, and $\varepsilon > 0$. Then

$$\Pr\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i \ge p + \varepsilon\right) \le \left(\left(\frac{p}{p+\varepsilon}\right)^{p+\varepsilon}\left(\frac{1-p}{1-p-\varepsilon}\right)^{1-p-\varepsilon}\right)^{n} = e^{-n\, D(p+\varepsilon \,\|\, p)}$$

and

$$\Pr\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i \le p - \varepsilon\right) \le \left(\left(\frac{p}{p-\varepsilon}\right)^{p-\varepsilon}\left(\frac{1-p}{1-p+\varepsilon}\right)^{1-p+\varepsilon}\right)^{n} = e^{-n\, D(p-\varepsilon \,\|\, p)}$$

where

$$D(x \,\|\, y) = x \ln\frac{x}{y} + (1-x)\ln\frac{1-x}{1-y}$$

is the Kullback–Leibler divergence between Bernoulli random variables with parameters $x$ and $y$ respectively.
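As a quick sanity check, here is a minimal Python sketch that evaluates the upper-tail bound and compares it with a Monte Carlo estimate for i.i.d. Bernoulli(p) variables. The helper names kl_bernoulli and chernoff_upper_tail are illustrative, not from any standard library.

import math
import random

def kl_bernoulli(x, y):
    # Kullback-Leibler divergence D(x || y) between Bernoulli(x) and
    # Bernoulli(y); assumes 0 < x < 1 and 0 < y < 1.
    return x * math.log(x / y) + (1 - x) * math.log((1 - x) / (1 - y))

def chernoff_upper_tail(n, p, eps):
    # Bound on Pr((1/n) * sum(X_i) >= p + eps) from the theorem above.
    return math.exp(-n * kl_bernoulli(p + eps, p))

# Compare the bound with a Monte Carlo estimate for fair coin flips.
n, p, eps = 100, 0.5, 0.1
trials = 100_000
hits = sum(
    sum(random.random() < p for _ in range(n)) >= n * (p + eps)
    for _ in range(trials)
)
print("bound:    ", chernoff_upper_tail(n, p, eps))  # roughly 0.134
print("empirical:", hits / trials)                   # roughly 0.028

The bound is loose at this scale, but it decays exponentially in n, which is the content of the theorem.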
A simpler bound follows by relaxing the theorem using $D(p+\varepsilon \,\|\, p) \ge 2\varepsilon^2$. This results in a special case of Hoeffding's inequality. Sometimes, the bound $D((1+x)p \,\|\, p) \ge x^2 p/4$ for $-\tfrac{1}{2} \le x \le \tfrac{1}{2}$, which is stronger for $p < 1/8$, is also used.
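To see the comparison numerically, the short sketch below (reusing the illustrative kl_bernoulli helper from above) evaluates the exact exponent and both relaxations at a small p; a larger exponent means a tighter bound.

import math

def kl_bernoulli(x, y):
    return x * math.log(x / y) + (1 - x) * math.log((1 - x) / (1 - y))

p = 0.05      # p < 1/8, the regime where the multiplicative relaxation wins
x = 0.5       # relative deviation, so eps = x * p
eps = x * p

exact = kl_bernoulli(p + eps, p)  # exponent from the theorem (~0.00574)
additive = 2 * eps ** 2           # relaxation giving Hoeffding (~0.00125)
multiplicative = x ** 2 * p / 4   # multiplicative relaxation (~0.00313)

print(exact, additive, multiplicative)

Both relaxed exponents lower-bound the exact one, and for this value of p the multiplicative form is indeed larger, hence tighter, than the additive one.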
Rudolf Ahlswede and Andreas Winter introduced a Chernoff bound for matrix-valued random variables.
See also
- Chernoff's inequality, Chernoff bound: special cases of the theorem above
- Hoeffding's inequality
- Markov's inequality
- Chebyshev's inequality
References
- Herman Chernoff, "A measure of asymptotic efficiency for tests of a hypothesis based on the sum of observations", Annals of Mathematical Statistics, vol. 23, pp. 493–507, 1952.
- Wassily Hoeffding, "Probability inequalities for sums of bounded random variables", Journal of the American Statistical Association, vol. 58, no. 301, pp. 13–30, March 1963. (JSTOR)
- Rudolf Ahlswede and Andreas Winter, "Strong Converse for Identification via Quantum Channels", http://www.arxiv.org/abs/quant-ph/0012127