Khintchine inequality

From Wikipedia, the free encyclopedia

In mathematics, the Khintchine inequality, named after Aleksandr Khinchin (whose name is transliterated from the Cyrillic in several ways), is a theorem from probability theory that is also frequently used in analysis. Heuristically, it says that if we pick N complex numbers x_1,\dots,x_N \in\mathbb{C} and add them together, each multiplied by an independent random sign \pm 1, then the typical size of the sum (its expected modulus) is comparable to \sqrt{|x_1|^{2}+\cdots + |x_N|^{2}}.

Statement of theorem

Let \{w_{n}\}_{n=1}^{N} be independent random variables with \mathbb{P}(w_n = 1) = \mathbb{P}(w_n = -1) = \tfrac{1}{2}, let 0<p<\infty, and let x_1,\dots,x_N\in \mathbb{C}. Then

A_p \left( \sum_{n=1}^{N}|x_{n}|^{2}\right)^{1/2} \leq \left(\mathbb{E}\left|\sum_{n=1}^{N}w_{n}x_{n}\right|^{p}\right)^{1/p} \leq B_p \left(\sum_{n=1}^{N}|x_{n}|^{2}\right)^{1/2}

for some constants A_p, B_p > 0 depending only on p (see expected value for notation).
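For p = 2 the inequality is in fact an identity with A_2 = B_2 = 1, since expanding \mathbb{E}|\sum_n w_n x_n|^2 kills the cross terms (\mathbb{E}[w_m w_n] = 0 for m \neq n). The theorem can be checked numerically by enumerating all 2^N sign patterns; the following sketch (not from the source, with made-up sample coefficients) computes the moment (\mathbb{E}|\sum_n w_n x_n|^p)^{1/p} exactly:

```python
import itertools
import math

def khintchine_moment(x, p):
    """Exact value of (E|sum_n w_n x_n|^p)^(1/p), where the w_n are
    independent uniform random signs, computed by enumerating all
    2^N sign patterns (feasible only for small N)."""
    N = len(x)
    total = 0.0
    for signs in itertools.product((1, -1), repeat=N):
        s = sum(w * xn for w, xn in zip(signs, x))
        total += abs(s) ** p
    return (total / 2 ** N) ** (1.0 / p)

# Hypothetical coefficients, including a complex one.
x = [3.0, 4.0, 1.0 + 2.0j]
l2 = math.sqrt(sum(abs(xn) ** 2 for xn in x))

print(khintchine_moment(x, 2))  # p = 2: coincides with the l2 norm
print(l2)
print(khintchine_moment(x, 1))  # p = 1: smaller, but comparable to l2
```

By Jensen's inequality the moments increase with p, so for each fixed x the ratio to the \ell^2 norm is pinned between A_p and B_p, as the theorem asserts.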

Uses in analysis

The uses of this inequality are not limited to probability theory. One example of its use in analysis is the following: if T is a bounded linear operator between two L^p spaces L^p(X,\mu) and L^p(Y,\nu), 1\leq p<\infty, with operator norm \|T\|<\infty, then Khinchine's inequality can be used to show that

\left\|\left(\sum_{n=1}^{N}|Tf_n|^{2}\right)^{1/2}\right\|_{L^p(Y,\nu)}\leq C_p\left\|\left(\sum_{n=1}^{N}|f_{n}|^{2}\right)^{1/2}\right\|_{L^p(X,\mu)}

for some constant C_p > 0 depending only on p and \|T\|.
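One standard way to derive this (a sketch, along the lines of the argument in Wolff's notes): apply the lower Khintchine bound pointwise at each y \in Y, integrate, exchange the expectation and the integral by Tonelli's theorem, and use the linearity and boundedness of T:

```latex
\int_Y \Bigl(\sum_{n=1}^{N}|Tf_n|^{2}\Bigr)^{p/2}\,d\nu
  \le A_p^{-p}\,\mathbb{E}\,\Bigl\|T\Bigl(\sum_{n=1}^{N} w_n f_n\Bigr)\Bigr\|_{L^p(Y,\nu)}^{p}
  \le A_p^{-p}\,\|T\|^{p}\,\mathbb{E}\int_X \Bigl|\sum_{n=1}^{N} w_n f_n\Bigr|^{p}\,d\mu .
```

Applying the upper Khintchine bound pointwise in X and taking p-th roots then gives the displayed estimate with C_p = B_p \|T\| / A_p.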

References

  • Thomas H. Wolff, "Lectures on Harmonic Analysis". American Mathematical Society, University Lecture Series vol. 29, 2003. ISBN 0-8218-3449-5