Dominated convergence theorem
From Wikipedia, the free encyclopedia
In mathematics, Lebesgue's dominated convergence theorem states that if a sequence { fn : n = 1, 2, 3, ... } of real-valued measurable functions on a measure space S converges almost everywhere to a function f, and is "dominated" (explained below) by some nonnegative function g in L1, then f is integrable and

\lim_{n\to\infty} \int_S |f_n - f|\,d\mu = 0,

which in turn implies

\lim_{n\to\infty} \int_S f_n\,d\mu = \int_S f\,d\mu.
It is proven using Fatou's lemma.
To say that the sequence is "dominated" by g means that

|f_n(x)| \le g(x)

for every n and almost every x (i.e., the measure of the set of exceptional values of x is zero). By g in L1, we mean

\int_S g\,d\mu < \infty.
So the theorem provides a sufficient condition under which integration and passing to the pointwise limit commute. The theorem also applies to measurable functions with values in a Banach space, with the dominating function still being nonnegative and integrable as above.
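The exchange of limit and integral can be checked numerically in a simple case. The following sketch is an illustrative choice, not part of the article: it takes f_n(x) = x^n on S = [0, 1], a sequence that converges to 0 almost everywhere and is dominated by the integrable function g(x) = 1.

```python
# Midpoint Riemann sums approximating the integrals of f_n(x) = x**n over [0, 1].
# The sequence is dominated by g(x) = 1, which is integrable on [0, 1], so the
# dominated convergence theorem predicts that the integrals tend to the integral
# of the pointwise limit, namely 0.

def integral(f, steps=100_000):
    """Approximate the integral of f over [0, 1] by a midpoint sum."""
    h = 1.0 / steps
    return sum(f((i + 0.5) * h) for i in range(steps)) * h

for n in (1, 10, 100, 1000):
    print(n, integral(lambda x, n=n: x ** n))  # exact value is 1/(n+1)
```

The printed values shrink toward 0 as n grows, matching the exact integrals 1/(n+1).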
That the assumption that the sequence is dominated by some g in L1 cannot be dispensed with may be seen as follows: let fn(x) = n if 0 < x < 1/n and fn(x) = 0 otherwise. Any g which dominates the sequence must also dominate the pointwise supremum h given by h(x) = supn fn(x) for x in (0, 1) (and 0 otherwise). Since h(x) ≥ n on [1/(n+1), 1/n),

\int_0^1 h(x)\,dx \;\ge\; \int_{1/m}^1 h(x)\,dx \;=\; \sum_{n=1}^{m-1} \int_{[1/(n+1),\,1/n)} h(x)\,dx \;\ge\; \sum_{n=1}^{m-1} \int_{[1/(n+1),\,1/n)} n\,dx \;=\; \sum_{n=1}^{m-1} \frac{1}{n+1} \;\to\; \infty \quad \text{as } m \to \infty.

The order property of the Lebesgue integral then tells us that there exists no function in L1 which dominates the sequence. A direct calculation shows that integration and pointwise limit do not commute for this sequence:

\lim_{n\to\infty} \int_0^1 f_n(x)\,dx = \lim_{n\to\infty} n \cdot \frac{1}{n} = 1 \ne 0 = \int_0^1 \lim_{n\to\infty} f_n(x)\,dx.
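This failure can also be observed numerically. The sketch below (illustrative only) approximates the integrals of the fn defined above: each integral is 1, while the pointwise limit is the zero function.

```python
# The counterexample sequence: f_n(x) = n on (0, 1/n) and 0 elsewhere.
# Every f_n integrates to n * (1/n) = 1 over (0, 1), but f_n(x) -> 0 for
# every fixed x > 0, so the limit of the integrals is 1 while the integral
# of the pointwise limit is 0 -- the conclusion of the theorem fails.

def f(n, x):
    return float(n) if 0 < x < 1.0 / n else 0.0

def integral(n, steps=100_000):
    """Midpoint Riemann sum of f(n, .) over (0, 1)."""
    h = 1.0 / steps
    return sum(f(n, (i + 0.5) * h) for i in range(steps)) * h

for n in (1, 2, 10, 100):
    print(n, integral(n))  # each approximation is essentially 1
```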
In contrast, Lebesgue's monotone convergence theorem does not require the sequence to be dominated by an integrable g; instead it assumes that the given sequence is monotone. It states that

\lim_{n\to\infty} \int_S f_n\,d\mu = \int_S \lim_{n\to\infty} f_n\,d\mu

whenever {fn} is a monotonically increasing sequence of non-negative measurable functions (both sides may be infinite).
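The monotone case can likewise be checked numerically. The sketch below is an illustrative choice, not from the article: it uses the increasing truncations f_n(x) = min(n, 1/√x) on (0, 1), which rise pointwise to 1/√x, whose integral is 2.

```python
import math

# Increasing truncations f_n(x) = min(n, 1/sqrt(x)) on (0, 1).  The sequence
# is non-negative, measurable, and monotonically increasing in n, so the
# monotone convergence theorem gives lim ∫ f_n = ∫ 1/sqrt(x) dx = 2.
# (Exactly, ∫ f_n = 2 - 1/n, which increases to 2.)

def integral(f, steps=200_000):
    """Midpoint Riemann sum over (0, 1)."""
    h = 1.0 / steps
    return sum(f((i + 0.5) * h) for i in range(steps)) * h

for n in (1, 10, 100):
    print(n, integral(lambda x, n=n: min(n, 1.0 / math.sqrt(x))))
```

Note that no integrable dominating function needs to be exhibited here; monotonicity alone justifies exchanging limit and integral, even though the limit 1/√x is unbounded.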