Talk:Information theory


This article is within the scope of WikiProject Telecommunications, an attempt to build a comprehensive and detailed guide to telecommunications on Wikipedia. If you would like to participate, you can edit the article attached to this page, or visit the project page, where you can join the project as a "full time member" and/or contribute to the discussion.
This article is within the scope of WikiProject Computer science.
This article has not yet received a rating on the assessment scale.
This article has not yet received an importance rating on the assessment scale.
To-do list for Information theory:
  • Restructure according to guidelines in Wikipedia:WikiProject Science, i.e.:
    • start with 2 or 3 paragraphs for the general public, using daily life examples
    • explain the applications of information theory in a separate section
    • create a separate section on the theory, based on the existing material
    • present the history in the final section


Pending tasks (what do we want?)

Here is how I interpret the pending tasks:

  • Make sure intro and overview are understandable with daily life examples. Making sure the reader can subjectively tell what a "bit" is, for example, is a good start. Discussions of elephant trumpets, whale songs, and black holes do not contribute in this manner, although their mere mention (with links, say, to the no hair theorem) may be useful to the curious.
  • Move applications up before mathematics (just after Overview) and make it independent. I'm not sure this is for the best, as Overview states some applications and, for the rest, it's useful to know some theory. Also, we need to be careful not to conflate security and information theory, since knowledge of information is necessary, yet not sufficient, for breaking ciphers. (For example, a Diffie-Hellman key exchange assumes that an adversary will have full knowledge, but the form of that knowledge ensures that the adversary will not be able to use it within a reasonable amount of time. Thus, although secure in practice, Diffie-Hellman key exchange is information-theoretically insecure; a toy sketch of this distinction follows this list.)
  • Move history down to end (just before References). Again, I'm not sure why this is best, and some "good" articles, like Thermodynamics, list history first. If moved, we need to check that this is just as readable, but the reader doesn't need to know scientific history to understand scientific phenomena. Clearly more post-1948 history is needed. Most applications of information theory, even elementary ones, are post-1948 (Huffman codes, arithmetic codes, LDPC codes, Turbo codes), and much of the theory is too, e.g., information divergence, information inequalities, Fisher sufficient statistics, Kolmogorov complexity, network information theory, etc. However, it might be best to move the history to its own article. The current section is far too in-depth.
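To illustrate the computational-versus-information-theoretic distinction mentioned in the second item above, here is a minimal toy sketch of a Diffie-Hellman exchange in Python (the prime, generator, and exponents are made-up toy values, far too small for real use):

    # Toy Diffie-Hellman exchange (illustrative parameters only; not secure).
    p, g = 23, 5                 # public prime modulus and generator
    a, b = 6, 15                 # private exponents chosen by Alice and Bob
    A = pow(g, a, p)             # Alice publishes A = g^a mod p
    B = pow(g, b, p)             # Bob publishes B = g^b mod p
    key_alice = pow(B, a, p)     # Alice computes the shared key
    key_bob = pow(A, b, p)       # Bob computes the same key
    assert key_alice == key_bob
    # An eavesdropper sees p, g, A, and B, which determine the key uniquely,
    # so information-theoretically nothing is hidden; security rests entirely
    # on the computational difficulty of recovering a or b (discrete log).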

There's still a lot to do and a lot to add, and we might reasonably ask whether certain topics should be shortened or omitted, e.g., Kullback–Leibler divergence (which, although important, can generally be omitted from an elementary explanation), differential entropy (which matters mostly for transinformation, itself interpretable as a limit of discrete transinformation), and gambling (which most information theorists I know love, but which is a somewhat fringe topic). Again, the thermodynamics article is shorter and far less mathematical. Do we want this article to be long with full mathematical explanations or medium-sized with general explanations and links to the math? I don't know the right answer, but it might be nice to get some ideas of what people would like out of this.
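For reference in weighing that, the Kullback–Leibler divergence in question is, in its standard discrete form,

$$D(p\|q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}.$$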

A few final things to keep in mind:

  • Diversions or examples without explanation confuse rather than clarify.
  • Sentences should be as concise as possible, but no more so, for easy reading.
  • Proofread before (or at least immediately after) changes.
  • Others might not read the article the same way you do.

Due to these factors, it's good to discuss significant changes on the talk page. I find my "peer edited" sections are a lot better than my first attempts. And too much unilateral change can result in an unstable, unreadable, and flat-out wrong article. Calbaer 01:44, 15 June 2006 (UTC)

I say keep the K–L divergence. Not only is it important, but it is really useful in understanding the mutual information. The treatment of other topics like gambling, intelligence, and measure theory could be cut. The measure theory section shows a useful mnemonic device, but the analogy is incomplete due to the failure of the transinformation to remain non-negative in the case of three or more random variables. It could probably be cut out entirely. History of information theory could indeed have its own article, but keep a short summary of the history in this article. Also, some mention of Gibbs' inequality would be nice, as it shows the K–L divergence (and thus the bivariate mutual information) to be non-negative. The AEP could be mentioned in the source theory section, since it is a property of certain sources. Don't hesitate to add what you think should be added. --130.94.162.64 17:17, 20 June 2006 (UTC)
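For what it's worth, the Gibbs' inequality statement referred to above is the standard one: for probability distributions $p$ and $q$ on the same alphabet,

$$D(p\|q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)} \ge 0,$$

with equality if and only if $p = q$; taking $q(x,y) = p(x)\,p(y)$ gives $I(X;Y) = D\bigl(p(x,y)\,\|\,p(x)\,p(y)\bigr) \ge 0$ for the bivariate mutual information.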
K-L divergence is certainly fundamental, so it should stay. Measure theory is also fundamental, but it is not necessary for a good understanding of information theory, so it could go. History could be summarized and moved. I'll get to this when I can, though if someone else wants to do so first, go ahead. Calbaer 18:36, 20 June 2006 (UTC)
The section that mentions measure theory should be kept, or moved somewhere else, rather than deleted. Also, the following sentence in that section is misleading/incorrect: "it justifies, in a certain formal sense, the practice of calling Shannon's entropy a "measure" of information." Entropy is not a measure: given a random variable f, one integrates f ln f against a suitable measure (the Lebesgue measure, the counting measure, etc.) to get the entropy. Mct mht 18:52, 20 June 2006 (UTC)
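Concretely, the point being made above is that entropy is a functional evaluated against a reference measure $\mu$, not itself a measure: for a density $f$ with respect to $\mu$,

$$H = -\int f \log f \, d\mu,$$

which reduces to $-\sum_x p(x) \log p(x)$ when $\mu$ is counting measure and to the differential entropy $-\int f(x) \log f(x)\,dx$ when $\mu$ is Lebesgue measure.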
I think that the section confuses more than it helps; most readers will not have a working knowledge of measure theory, and, as you say, the section needs at least a little reworking. I'll delete it and add a link to Information theory and measure theory. Calbaer 19:47, 20 June 2006 (UTC)

Seems to me a big priority should be to break out all the details into other articles. Information Theory is very much a science. If you look at the "Mathematics" Wikipedia article, you'll see that there are few actual mathematical concepts discussed. It's more a history and categorization of the sub-sciences of mathematics, with links. I picture "Information Theory" as a fairly small article (only because the science is young) giving only a bit more than a dictionary definition of "information theory", and then a ton of links with brief descriptions to categories, related theories, practical applications, algorithms, coding methods, etc. I think this article should be more of a starting point. Details should be elsewhere. Also, until this can be done, at least a set of links might be nice. I guess I'm saying here I'm surprised that a search of the "Information Theory" article in Wikipedia doesn't find the phrase "forward error correction"! qz27, 22 June 2006

Good point, and one that argues for reducing the following to little more than links: Self-information, Kullback–Leibler divergence, differential entropy. We shouldn't be scared to have a little math in the article, but regular, joint, and conditional entropy can be defined together, and mutual information in terms of them. That's enough for the idealizations of source and channel coding. If people want to know why the math is as it is, there could be a "definitions in information theory" article or something like that. Two things, though: 1) "error correction" is mentioned three times in the article (and I don't think it's that important to specifically state FEC in a "starting point" article), and 2) are you volunteering to do this? Calbaer 06:13, 23 July 2006 (UTC)
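Presumably the identities meant here are the standard ones, defining joint and conditional entropy together and then mutual information in terms of them:

$$H(X,Y) = H(X) + H(Y|X), \qquad I(X;Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X|Y).$$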
I went through much of the article and tried to make it more approachable, with links in place of unnecessary math. Even now it may have too much math, and someone should probably reorganize it accordingly. Calbaer 22:45, 24 July 2006 (UTC)

Game theory as a template?

Game theory was recently a featured article. Perhaps information theory should look more like that? Calbaer 03:52, 2 August 2006 (UTC)

Removed tag, added poached images, reworked header

The article has been largely dormant, and those who have tagged it — not to mention made many contributions — are no longer in the Wikipedia community under the same aliases. (One was kicked off in September; the other stopped contributing and/or changed IP address in June.) It's not a perfect article, but it's better, so I removed the tag. Any advice for how to improve it — by novice or by expert — would be appreciated, especially on this talk page. I feel some of the article is a bit too redundant, but it's better to be redundant than to omit or underplay basic facts. I added some images from other articles to illustrate the concepts visually, which will hopefully get people more interested in the subject and make it clearer more quickly what it's all about. Finally, I made the disambiguation header shorter, updated it (to reflect the library/information science split), and added informatics. By the way, my apologies for misstating that I thought "K-L divergence should stay." I meant that mutual information should stay. K-L divergence may be a bit too confusing for those who don't want to explore it as a separate topic. Calbaer 21:59, 25 October 2006 (UTC)

Entropy

In the entropy section the article says:

If $\mathbb{M}$ is the set of all messages $m$ that $M$ could be, and $p(m) = \Pr(M = m)$, then $M$ has

$$H(M) = \mathbb{E}_{M} [-\log p(m)] = -\sum_{m \in \mathbb{M}} p(m) \log p(m)$$

bits of entropy. An important property of entropy is that it is maximized when all the messages in the message space are equiprobable — i.e., most unpredictable — in which case $H(M) = \log |M|$.

but if all states are equiprobable, then $p(m) = \frac{1}{|\mathbb{M}|}$, so

$$H(M) = -\sum_{m \in \mathbb{M}} p(m) \log p(m) = -\sum_{m \in \mathbb{M}} \frac{1}{|\mathbb{M}|} \log \frac{1}{|\mathbb{M}|} = \log |\mathbb{M}|$$

not $\log |M|$. Thesm 08:52, 6 December 2006 (UTC)
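A quick numerical check of the equiprobable case, as a throwaway Python sketch (the alphabet size of 8 and the skewed distribution are arbitrary examples):

    import math

    def entropy(p):
        """Shannon entropy, in bits, of a discrete distribution given as a list."""
        return -sum(x * math.log2(x) for x in p if x > 0)

    M = 8                                    # size of the message space
    uniform = [1 / M] * M
    print(entropy(uniform), math.log2(M))    # both are 3.0: H = log2 |M| bits
    print(entropy([0.5, 0.5]))               # 1.0: a fair coin flip is one bit
    print(entropy([0.9, 0.05, 0.03, 0.02]))  # about 0.62: skew lowers entropy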

Changed; it was a slight abuse of notation. Though for something like this, you can Wikipedia:Be bold. Worst thing that can happen is that you'd be reverted. Calbaer 17:26, 6 December 2006 (UTC)