Information diagram

An information diagram is a type of Venn diagram used in information theory to illustrate relationships among Shannon's basic measures of information: entropy, joint entropy, conditional entropy and mutual information.[1][2] Information diagrams are a useful pedagogical tool for teaching and learning about these basic measures, but using such a diagram carries some non-trivial implications. For example, Shannon's information measures, viewed as a measure over the regions of the diagram, must be taken as a signed measure: with three or more variables, the central region, corresponding to the mutual information among all the variables, can be negative. (See the article Information theory and measure theory for more information.)
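For two random variables X and Y, each region of the diagram corresponds to one of these quantities: the two circles are H(X) and H(Y), their overlap is the mutual information I(X;Y), the crescents are the conditional entropies H(X|Y) and H(Y|X), and the whole figure is the joint entropy H(X,Y). The resulting identities (a standard summary of the definitions, not drawn from either reference) read:

\[
\begin{aligned}
H(X)   &= H(X\mid Y) + I(X;Y),\\
H(X,Y) &= H(X\mid Y) + I(X;Y) + H(Y\mid X),\\
I(X;Y) &= H(X) + H(Y) - H(X,Y).
\end{aligned}
\]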

Random variables X, Y, and Z are said to form a Markov chain X → Y → Z if X and Z are conditionally independent given Y; equivalently, the conditional mutual information I(X;Z|Y) is zero. In that case Y contains all the information about X that is relevant to Z, so knowing X adds nothing to one's knowledge of Z once Y is given.
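In the three-variable diagram, the Markov condition forces the region shared by H(X) and H(Z) to lie entirely inside H(Y), which yields the data processing inequality. A short derivation (standard material, written out here as a sketch rather than taken from the references):

\[
\begin{aligned}
I(X;Z\mid Y) &= 0 &&\text{(Markov condition)}\\
I(X;Y,Z) &= I(X;Y) + I(X;Z\mid Y) = I(X;Y) &&\text{(chain rule)}\\
I(X;Y,Z) &= I(X;Z) + I(X;Y\mid Z) &&\text{(chain rule)}\\
I(X;Z) &= I(X;Y) - I(X;Y\mid Z) \le I(X;Y). &&\text{(data processing inequality)}
\end{aligned}
\]

The same condition shows that for a Markov chain the central region I(X;Y;Z) equals I(X;Z) and is therefore non-negative, even though it can be negative in general.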

References

  1. ^ Fazlollah Reza. An Introduction to Information Theory. New York: McGraw-Hill, 1961. Reprinted New York: Dover, 1994. ISBN 0-486-68210-2.
  2. ^ R. W. Yeung. A First Course in Information Theory. Norwell, MA/New York: Kluwer/Plenum, 2002.