Talk:Noisy-channel coding theorem


This article should be moved to Noisy-channel coding theorem. The hyphen becomes necessary when "noisy channel" is used as an adjective. Otherwise we describe the theorem itself as noisy, which is certainly not the intent here. -- 130.94.162.61 16:15, 19 January 2006 (UTC)


Copyediting error in page title

I agree with the comment above. "Noisy channel" should be hyphenated when used as an adjective (as it is, inconsistently, in the body text). 38.113.17.3 19:35, 12 October 2006 (UTC)VP

Error in noisy-channel coding theorem

Before the change I just made, the statement of the theorem implied that when rate equaled capacity, error could never be made arbitrarily small. This is clearly wrong; a lossless channel can achieve its capacity rate and, in spite of its being somewhat degenerate, does fall within this framework. The 1 December 2005 "fix" was wrong (though what it attempted to correct was also wrong). I've fixed this so that it's clear that the R=C case is not addressed in the noisy-channel coding theorem, but someone might want to double-check my wording on this (which is also in Shannon–Hartley theorem) and redraw the math PNG. Calbaer 23:25, 25 April 2006 (UTC)

Attribution to 1948 Shannon

Theorem 10 of Shannon's 1948 paper corresponds to the noisy channel coding theorem, but this only has part 1 of the theorem as presented here. Could someone double-check this and re-attribute accordingly? Calbaer 22:36, 31 August 2006 (UTC)

"Theorem 11: Let a discrete channel have the capacity C and a discrete source the entropy per second H. If H ≤ C there exists a coding system such that the output of the source can be transmitted over the channel with an arbitrarily small frequency of errors (or an arbitrarily small equivocation). If H > C it is possible to encode the source so that the equivocation is less than H − C + ε where ε is arbitrarily small. There is no method of encoding which gives an equivocation less than H − C".
Shannon's H becomes our R, Shannon's equivocation becomes our H<sub>R</sub>, and thus Shannon's statement is equivalent to our parts 1, 2 and 3. Jheald 23:36, 31 August 2006 (UTC).
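To make the quoted statement concrete, here is a small numeric sketch (my own illustration, not from the article) using the binary symmetric channel, whose capacity is C = 1 − H<sub>2</sub>(p) with H<sub>2</sub> the binary entropy function. The crossover probability p = 0.11 is just a conventional textbook value:

```python
from math import log2

def binary_entropy(p):
    """H2(p) in bits; H2(0) = H2(1) = 0 by the usual convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel, bits per channel use."""
    return 1.0 - binary_entropy(p)

# Theorem 11 says rates below capacity are achievable with arbitrarily
# small error frequency, and rates above it are not.
print(bsc_capacity(0.0))                 # noiseless: 1 bit/use
print(bsc_capacity(0.5))                 # useless channel: 0 bits/use
print(round(bsc_capacity(0.11), 2))      # roughly 0.5 bits/use
```

Note the p = 0 case is the degenerate lossless channel Calbaer mentions above, which can be run at exactly R = C.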

Note that Shannon did not provide a formal proof. That was finally done by Feinstein for the coding part, and Fano for the converse part. I added the references. Please check. Wullj 27 December 2006.

Merge with Shannon–Hartley theorem article?

There is a lot of overlap between this article and the Shannon–Hartley theorem. Should these pages be merged? Please contribute your thoughts on the Talk:Shannon–Hartley theorem discussion page. -- technopilgrim 20:29, 30 October 2006 (UTC)

What about non-discrete channels?

I'm a civil engineering student, and in our information coding class we saw a variation of the Shannon theorem. It states that, on a non-discrete memoryless channel, the Shannon capacity is <math>C_\text{Shannon} = B \log_2(1 + P_u / P_n)</math>. Can this be added to the article?

Take a look at Shannon–Hartley theorem, and at the merge discussion mentioned above. --LutzL 15:53, 28 November 2006 (UTC)