Dominated convergence theorem

From Wikipedia, the free encyclopedia

In mathematics, Lebesgue's dominated convergence theorem states that if a sequence { fn : n = 1, 2, 3, ... } of real-valued measurable functions on a measure space S converges almost everywhere, and is "dominated" (explained below) by some nonnegative function g in L1, then

\int_S\lim_{n\rightarrow\infty} f_n=\lim_{n\rightarrow\infty}\int_S f_n.

It is proven using Fatou's lemma.

To say that the sequence is "dominated" by g means that

|f_n(x)| \leq g(x)

for every n and almost every x (i.e., the measure of the set of exceptional values of x is zero). By g in L1, we mean

\int_S\left|g\right|<\infty.

So the theorem provides a sufficient condition under which integration and passing to the pointwise limit commute. The theorem also applies to measurable functions with values in a Banach space, with the dominating function still being nonnegative and integrable as above.
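As a numerical sketch of the theorem (the example and function names here are illustrative, not part of the statement), take fn(x) = x^n on [0, 1]: the sequence is dominated by the integrable constant g = 1 and converges to 0 almost everywhere, so the theorem predicts that the integrals also tend to 0.

```python
# Illustrative sketch: f_n(x) = x**n on [0, 1] is dominated by g(x) = 1,
# which is integrable on [0, 1], and converges pointwise to 0 for x in
# [0, 1), i.e. almost everywhere.

def integral_fn(n, steps=100_000):
    """Midpoint-rule approximation of the integral of x**n over [0, 1]."""
    dx = 1.0 / steps
    return sum(((i + 0.5) * dx) ** n for i in range(steps)) * dx

# The exact value is 1/(n + 1), which tends to 0 as the theorem predicts.
for n in (1, 10, 100):
    assert abs(integral_fn(n) - 1.0 / (n + 1)) < 1e-3
```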

That the assumption that the sequence is dominated by some g in L1 cannot be dispensed with may be seen as follows: let fn(x) = n if 0 < x < 1/n and fn(x) = 0 otherwise. Any g which dominates the sequence must also dominate the pointwise supremum h given by h(x) = sup_n fn(x) for x > 0 (and 0 otherwise). Since

\int_0^1 h(x)\,dx = \infty,

the monotonicity of the Lebesgue integral tells us that there exists no function in L1 which dominates the sequence. A direct calculation shows that integration and the pointwise limit do not commute for this sequence:

\int_0^1\lim_{n\rightarrow\infty} f_n(x)\,dx=0\neq 1=\lim_{n\rightarrow\infty}\int_0^1 f_n(x)\,dx.
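This failure can be checked directly; a minimal numerical sketch (the function names are illustrative):

```python
def f(n, x):
    """The counterexample f_n: equal to n on (0, 1/n), and 0 elsewhere."""
    return n if 0 < x < 1.0 / n else 0

def integral_f(n):
    """Exact integral of f_n over [0, 1]: height n times width 1/n."""
    return n * (1.0 / n)

# The integrals are identically 1 ...
assert all(integral_f(n) == 1.0 for n in (1, 10, 1000))

# ... but at any fixed x > 0, f_n(x) = 0 as soon as n >= 1/x, so the
# pointwise limit is 0 everywhere on (0, 1].
assert [f(n, 0.25) for n in (10, 100, 1000)] == [0, 0, 0]
```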

In contrast, Lebesgue's monotone convergence theorem does not require the sequence to be dominated by an integrable g; instead, it assumes that the sequence is monotone. It states:

\lim_{k\to\infty} \int f_k(x)\,d\mu(x) = \int\lim_{k\to\infty} f_k(x)\,d\mu(x)

whenever {fk} is a monotonically increasing sequence of non-negative measurable functions.
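A sketch of the monotone case (an assumed example, not part of the statement): on [0, 1], fk(x) = x^(1/k) is non-negative and increases pointwise to the constant 1 on (0, 1), and the integrals k/(k+1) indeed increase to 1.

```python
# Illustrative sketch: for x in (0, 1), x**(1/k) increases with k toward 1,
# so f_k is a monotonically increasing sequence of non-negative functions
# whose pointwise limit is 1 almost everywhere.

def integral_fk(k, steps=100_000):
    """Midpoint-rule approximation of the integral of x**(1/k) over [0, 1]."""
    dx = 1.0 / steps
    return sum(((i + 0.5) * dx) ** (1.0 / k) for i in range(steps)) * dx

# The exact value is k/(k + 1), which increases to 1, the integral of the limit.
for k in (1, 10, 100):
    assert abs(integral_fk(k) - k / (k + 1)) < 1e-3
```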