Talk:Dominated convergence theorem


Let fk be the difference of a fixed non-negative function g of integral +\infty and a non-negative function gk of integral -n. Thus

fk = g - gk

Then fk is monotone, and each integral

\int f_k

is defined (its value is +\infty), but Lebesgue's monotone convergence theorem fails. In fact, \lim_k \int f_k = +\infty but \int \lim_k f_k can be chosen to be any value. --CSTAR 01:30, 22 July 2006 (UTC)

ok, "...a non-negative fuunction gk of integral -n." ---that's interesting. Mct mht 04:32, 22 July 2006 (UTC)
OK a negative gk of integral -k. Do you need an example? --CSTAR 04:34, 22 July 2006 (UTC)
Hell, let me just give it to you: Consider R with Lebesgue measure and let g_k be -1 on [0,k]. Take g to be some non-negative non-integrable function on (-∞, 0]. For example g(x)=1/|x| for x in [-1,0] and 0 elsewhere.--CSTAR 04:41, 22 July 2006 (UTC)


my apologies to User:Loisel for the revert. he's right, monotone increasing and positive is needed. i believe the positive assumption can be weakened to real valued if the elements of the sequence all have support in some set of finite measure and the first element f1 is integrable. monotone decreasing is taken care of by DCT. as for the above e.g., the point seems to be (?) that one needs the elements of the sequence to be integrable. well, f1 is not integrable, so it's a sequence of non-integrable functions. so the convergence theorems either don't apply (DCT) or the LHS and RHS both diverge (monotone conv. thm.), trivially. (what is meant by "...\int \lim_k f_k can be chosen to be any value"? that's not true for the sequence given.) Mct mht 05:52, 22 July 2006 (UTC)
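(An aside: a minimal sketch of why integrability of f1 already suffices for a real-valued, monotone increasing sequence, under the weakened assumptions mentioned above. Since f_n - f_1 \ge 0 is non-decreasing, the standard monotone convergence theorem applies to it, and the finite quantity \int f_1 can be added back:
\int f_n - \int f_1 = \int (f_n - f_1) \uparrow \int \left( \lim_n f_n - f_1 \right) = \int \lim_n f_n - \int f_1,
hence \lim_n \int f_n = \int \lim_n f_n, possibly with value +\infty.)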

What is meant is: given an arbitrary finite value C, by appropriately choosing gk the resulting sequence can be arranged so that
\int \lim_k f_k
is defined and has the value C. No, it isn't true for the example I gave, since the pointwise limit \lim_k f_k doesn't have a well-defined integral.
However, define
g(x) = 1/x \quad \mbox{ if } x \in [0, 1], \quad g(x) = 0 \mbox{ otherwise }
and
g_k(x) = -1/x +C \quad \mbox{ if } x \in [1/k, 1], \quad g_k(x) = 0 \mbox{ otherwise }
Then
fk(x) = g(x) + gk(x)
converges a.e. to C on (0, 1] and to 0 elsewhere, whereas
\forall k, \quad \int f_k(x) \, dx = +\infty
--CSTAR 06:14, 22 July 2006 (UTC)
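
For concreteness, the computation behind this example, spelled out as a sketch: for x \in [1/k, 1] one has f_k(x) = 1/x + (-1/x + C) = C, for x \in (0, 1/k) one has f_k(x) = 1/x, and f_k(x) = 0 elsewhere. Therefore
\int f_k(x) \, dx = \int_0^{1/k} \frac{dx}{x} + C \left( 1 - \frac{1}{k} \right) = +\infty,
while for each fixed x \in (0, 1] one has f_k(x) = C as soon as 1/k \le x, so \lim_k f_k = C on (0, 1] (and 0 elsewhere), giving
\int \lim_k f_k(x) \, dx = C.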


There seems to be an inconsistency in the last statement of this article, namely that either theorem (the dominated convergence theorem or the monotone convergence theorem) can be shown to be a corollary of Fatou's lemma. In the Wikipedia article on Fatou's lemma one reads:

Fatou's lemma is proved using the monotone convergence theorem, and can be used to prove the dominated convergence theorem.

If indeed Fatou's lemma is proved on the basis of the monotone convergence theorem, then the latter theorem cannot in turn have been proved on the basis of Fatou's lemma. It is important to clarify the matter; it may be that the word or is meant to be significant here, but if so, the intended message does not come out very well.

--BF 14:19, 8 December 2006 (UTC)

either monotone convergence OR fatou lemma must first be proved by itself. then it is possible to prove the other easily using the first one (it goes both ways). personally i think it is more natural to prove monotone convergence first (the proof is more intuitive), then use fatou lemma to prove dominated convergence. --itaj 19:16, 30 March 2007 (UTC)
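
For what it's worth, a sketch of one of the two directions mentioned above, namely that Fatou's lemma gives the monotone convergence theorem (assuming Fatou's lemma has been proved independently): if 0 \le f_n \uparrow f pointwise, then
\int f = \int \liminf_n f_n \le \liminf_n \int f_n \le \limsup_n \int f_n \le \int f,
where the first inequality is Fatou's lemma and the last follows from f_n \le f and monotonicity of the integral; hence \lim_n \int f_n = \int f.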