Entropic vector
The entropic vector is a concept arising in information theory. Identities and inequalities (both constrained and unconstrained) satisfied by Shannon's entropy measures have been studied ever since Shannon introduced the concept of information entropy, and many of them are collected in standard information theory texts. More recent research has focused on finding and characterizing all possible identities and inequalities (both constrained and unconstrained) on such entropies. The entropic vector lays down the basic framework for such a study.
Definition
Consider $n$ jointly distributed random variables $X_1, X_2, \dots, X_n$ with joint probability density function $p(x_1, x_2, \dots, x_n)$. Let $\alpha$ be a non-empty subset of $N = \{1, 2, \dots, n\}$, and define $X_\alpha = (X_i : i \in \alpha)$. Clearly there are $2^n - 1$ non-empty subsets of $N$. Corresponding to each $\alpha$, we have the joint entropy $H(X_\alpha)$. A vector in $\mathbb{R}^{2^n - 1}$ consisting of $H(X_\alpha)$ as its elements, for all non-empty subsets $\alpha$ of $N$, is called an entropic vector.
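As an illustration, here is a minimal Python sketch that computes the entropic vector of a finite joint distribution by marginalizing over every non-empty subset of the variables. The function name `entropic_vector` and the dictionary representation of the pmf are illustrative assumptions, not a standard API.

```python
import itertools
import math

def entropic_vector(pmf, n):
    """Compute the entropic vector of n jointly distributed discrete
    random variables.

    `pmf` maps outcome tuples (x_1, ..., x_n) to probabilities.
    Returns a dict mapping each non-empty subset alpha of {0, ..., n-1}
    (as a frozenset) to the joint entropy H(X_alpha) in bits.
    """
    vector = {}
    # Enumerate all 2^n - 1 non-empty subsets of the index set.
    for size in range(1, n + 1):
        for alpha in itertools.combinations(range(n), size):
            # Marginalize the joint pmf onto the coordinates in alpha.
            marginal = {}
            for outcome, p in pmf.items():
                key = tuple(outcome[i] for i in alpha)
                marginal[key] = marginal.get(key, 0.0) + p
            # Shannon entropy of the marginal distribution, in bits.
            h = -sum(p * math.log2(p) for p in marginal.values() if p > 0)
            vector[frozenset(alpha)] = h
    return vector
```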
Example
Let $X, Y$ be two independent binary random variables, each taking the values 0 and 1 with probability one-half. Then

$H(X) = H(Y) = 1, \qquad H(X, Y) = 2.$

Note that the mutual information is then given by

$I(X; Y) = H(X) + H(Y) - H(X, Y) = 0.$

This is because $X$ and $Y$ are independent. The entropic vector is thus

$v = (H(X), H(Y), H(X, Y)) = (1, 1, 2).$

We note that $v$ is entropic, i.e. it lies in the set $\Gamma_2^*$ defined below, since there exist random variables whose joint entropies are the entries of the vector.
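This example can be checked numerically. The snippet below assumes the hypothetical `entropic_vector` helper sketched in the definition section.

```python
# Two independent fair binary random variables X and Y:
# each of the four outcomes (x, y) has probability 1/4.
pmf = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

v = entropic_vector(pmf, 2)
print(v[frozenset({0})])     # H(X)   = 1.0
print(v[frozenset({1})])     # H(Y)   = 1.0
print(v[frozenset({0, 1})])  # H(X,Y) = 2.0
```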
Open problem
Given a vector $v \in \mathbb{R}^{2^n - 1}$, is it possible to decide whether there exist $n$ random variables whose joint entropies are given by $v$? For $n = 2, 3$ the problem has been solved, but for $n \geq 4$ it remains open. Defining the set of all such vectors that can be constructed from a set of $n$ random variables as $\Gamma_n^*$, we see that a complete characterization of this set remains an open problem.
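A necessary condition for a vector to be entropic is that it satisfy the basic Shannon inequalities: monotonicity and submodularity of the entropy function. For $n = 2, 3$ these inequalities characterize the closure of $\Gamma_n^*$, while for $n \geq 4$ they are known to be insufficient. The following is a minimal sketch of such a check, reusing the assumed dictionary representation from above; the function name is hypothetical.

```python
import itertools

def satisfies_shannon_inequalities(v, n, tol=1e-9):
    """Check the basic Shannon (polymatroid) inequalities, a necessary
    condition for v to be an entropic vector.

    `v` maps non-empty frozensets of {0, ..., n-1} to candidate
    entropies; the empty set is assigned entropy 0.
    """
    h = lambda s: 0.0 if not s else v[frozenset(s)]
    subsets = [frozenset(c) for size in range(n + 1)
               for c in itertools.combinations(range(n), size)]
    for a in subsets:
        for b in subsets:
            # Monotonicity: H(A) <= H(B) whenever A is a subset of B.
            if a <= b and h(a) > h(b) + tol:
                return False
            # Submodularity: H(A) + H(B) >= H(A u B) + H(A n B).
            if h(a) + h(b) + tol < h(a | b) + h(a & b):
                return False
    return True
```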
References
- Thomas M. Cover, Joy A. Thomas. Elements of Information Theory. New York: Wiley, 1991. ISBN 0-471-06259-6.