Conditional entropy
In information theory, the conditional entropy (or equivocation) quantifies the remaining entropy of a random variable Y given that the value of a second random variable X is known. It is referred to as the entropy of Y conditional on X, and is written H(Y | X). Like other entropies, the conditional entropy is measured in bits, nats, or hartleys.
Given random variables X and Y with entropies H(X) and H(Y), and with a joint entropy H(X,Y), the conditional entropy of Y given X is defined as:
H(Y | X) = H(X,Y) − H(X).
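As a concrete illustration, a minimal Python sketch computing H(X), H(X,Y), and H(Y | X) via this identity is given below; the joint distribution used here is an arbitrary example, not taken from the article.

```python
import math

def entropy(pmf):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# Example joint distribution p(x, y); the numbers are arbitrary and sum to 1.
joint = {
    ('a', 0): 0.25, ('a', 1): 0.25,
    ('b', 0): 0.40, ('b', 1): 0.10,
}

# Marginal distribution of X, obtained by summing p(x, y) over y.
marginal_x = {}
for (x, y), p in joint.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p

# Conditional entropy via the identity H(Y | X) = H(X,Y) - H(X).
h_joint = entropy(joint)
h_x = entropy(marginal_x)
h_y_given_x = h_joint - h_x

print(f"H(X,Y) = {h_joint:.4f} bits")
print(f"H(X)   = {h_x:.4f} bits")
print(f"H(Y|X) = {h_y_given_x:.4f} bits")
```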
Intuitively, the combined system contains H(X,Y) bits of information. Learning the value of X supplies, on average, H(X) of those bits, leaving H(Y | X) bits still needed to specify Y. H(Y | X) = 0 if and only if the value of Y is completely determined by the value of X. At the other extreme, H(Y | X) = H(Y) if and only if Y and X are independent random variables.
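These two extreme cases can be checked numerically. In the following self-contained sketch, the distributions named determined and independent are hand-picked examples in which Y is a function of X and in which X and Y are independent fair bits, respectively.

```python
import math

def entropy(pmf):
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def conditional_entropy(joint):
    """H(Y | X) = H(X,Y) - H(X) for a joint pmf given as {(x, y): probability}."""
    marginal_x = {}
    for (x, _), p in joint.items():
        marginal_x[x] = marginal_x.get(x, 0.0) + p
    return entropy(joint) - entropy(marginal_x)

# Y is a deterministic function of X (here Y = X), so H(Y | X) = 0.
determined = {(0, 0): 0.5, (1, 1): 0.5}

# X and Y are independent fair bits, so H(Y | X) = H(Y) = 1 bit.
independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

print(conditional_entropy(determined))   # 0.0
print(conditional_entropy(independent))  # 1.0
```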
In quantum information theory, the conditional entropy is generalized to the conditional quantum entropy.