Khintchine inequality

In mathematics, the Khintchine inequality, named after Aleksandr Khinchin (whose name is transliterated from the Cyrillic in several ways), is a theorem of probability theory that is also frequently used in analysis. Heuristically, it says that if we pick N complex numbers x_1,\ldots,x_N \in\mathbb{C} and add them together, each multiplied by an independent random sign \pm 1, then the typical size of the sum is comparable to \sqrt{|x_1|^{2}+\cdots + |x_N|^{2}}.

Statement of the theorem

Let \{\epsilon_{n}\}_{n=1}^{N} be i.i.d. random variables with P(\epsilon_n=\pm1)=\frac12 for every n=1,\ldots,N, i.e., a Rademacher sequence. Let 0<p<\infty and let x_1,\ldots,x_N\in \mathbb{C}. Then

 A_p \left( \sum_{n=1}^{N}|x_{n}|^{2} \right)^{\frac{1}{2}} \leq \left(\mathbb{E}\Big|\sum_{n=1}^{N}\epsilon_{n}x_{n}\Big|^{p} \right)^{1/p}  \leq B_p \left(\sum_{n=1}^{N}|x_{n}|^{2}\right)^{\frac{1}{2}}

for some constants A_p, B_p > 0 depending only on p.
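At p = 2 the middle quantity equals the ℓ² norm exactly, since the cross terms \mathbb{E}[\epsilon_n\epsilon_m] vanish for n ≠ m. The sketch below (not part of the original article; the function name is my own) illustrates this by computing the exact p-th moment via brute-force enumeration of all 2^N sign patterns:

```python
import itertools
import math

def khintchine_moment(xs, p):
    """Exact (E|sum_n eps_n x_n|^p)^(1/p), enumerating all 2^N sign patterns.

    Feasible only for small N, but gives the expectation exactly
    (no Monte Carlo error).
    """
    n = len(xs)
    total = 0.0
    for signs in itertools.product((-1, 1), repeat=n):
        s = sum(e * x for e, x in zip(signs, xs))
        total += abs(s) ** p
    return (total / 2 ** n) ** (1.0 / p)

xs = [3 + 4j, 1, 2j, -1.5]
l2 = math.sqrt(sum(abs(x) ** 2 for x in xs))

# For p = 2 the inequality is an identity: E|sum eps_n x_n|^2 = sum |x_n|^2
print(khintchine_moment(xs, 2), l2)
# By Jensen's inequality the moments are nondecreasing in p,
# so the p = 1 moment sits below l2 and the p = 4 moment above it.
print(khintchine_moment(xs, 1), khintchine_moment(xs, 4))
```

For small p the lower constant A_p is what matters; for large p, the upper constant B_p.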

Uses in analysis

The uses of this inequality are not limited to applications in probability theory. One example of its use in analysis is the following: if we let T be a linear operator between two L^p spaces L^p(X,\mu) and L^p(Y,\nu), 1\leq p<\infty, with bounded operator norm \|T\|<\infty, then one can use Khintchine's inequality to show that

 \left\|\left(\sum_{n=1}^{N}|Tf_n|^{2} \right)^{\frac{1}{2}}\right\|_{L^p(Y,\nu)}\leq C_p\left\|\left(\sum_{n=1}^{N}|f_{n}|^{2}\right)^{\frac{1}{2}}\right\|_{L^p(X,\mu)}

for some constant C_p > 0 depending only on p and \|T\|.
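The standard argument, sketched below, applies the lower Khintchine bound pointwise in y, swaps the expectation and the integral by Fubini, uses the linearity and boundedness of T, and finishes with the upper Khintchine bound pointwise in x (implicit constants depend only on p):

\begin{aligned}
\left\| \Big(\sum_{n=1}^{N} |Tf_n|^2\Big)^{1/2} \right\|_{L^p(Y,\nu)}^{p}
&\leq A_p^{-p} \int_Y \mathbb{E}\Big|\sum_{n=1}^{N} \epsilon_n (Tf_n)(y)\Big|^{p} \, d\nu(y)
&& \text{(lower Khintchine bound)} \\
&= A_p^{-p}\, \mathbb{E}\, \Big\| T\Big(\sum_{n=1}^{N} \epsilon_n f_n\Big) \Big\|_{L^p(Y,\nu)}^{p}
&& \text{(Fubini, linearity of } T\text{)} \\
&\leq A_p^{-p}\, \|T\|^{p}\, \mathbb{E}\, \Big\| \sum_{n=1}^{N} \epsilon_n f_n \Big\|_{L^p(X,\mu)}^{p}
&& \text{(boundedness of } T\text{)} \\
&\leq A_p^{-p}\, \|T\|^{p}\, B_p^{p} \left\| \Big(\sum_{n=1}^{N} |f_n|^2\Big)^{1/2} \right\|_{L^p(X,\mu)}^{p}
&& \text{(upper Khintchine bound),}
\end{aligned}

which gives the claim with C_p = B_p \|T\| / A_p.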
