Pointwise mutual information
Pointwise mutual information (PMI), or specific mutual information, is a measure of association used in information theory and statistics.
The PMI of a pair of outcomes x and y belonging to discrete random variables X and Y quantifies the discrepancy between the probability of their coincidence given their joint distribution and the probability of their coincidence under an assumption of independence, given only their individual distributions. Mathematically,

\mathrm{pmi}(x;y) = \log\frac{p(x,y)}{p(x)\,p(y)} = \log\frac{p(x|y)}{p(x)} = \log\frac{p(y|x)}{p(y)}.
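For illustration, here is a minimal Python sketch (not part of the original article; the joint distribution and the pmi helper are hypothetical examples) that computes the PMI of individual outcome pairs:

    import math

    def pmi(p_xy, p_x, p_y):
        """Pointwise mutual information of a single outcome pair (x, y)."""
        return math.log(p_xy / (p_x * p_y))

    # Illustrative joint distribution p(x, y) of two binary variables.
    joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

    # Marginals p(x) and p(y), obtained by summing out the other variable.
    p_x = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
    p_y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

    print(pmi(joint[(0, 0)], p_x[0], p_y[0]))  # ~0.47: (0, 0) co-occurs more often than chance
    print(pmi(joint[(0, 1)], p_x[0], p_y[1]))  # ~-0.92: (0, 1) co-occurs less often than chance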
The mutual information of X and Y is the expected value of the pointwise mutual information over all possible outcome pairs.
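In symbols, writing this expectation over the joint distribution (a standard identity, stated here for concreteness):

I(X;Y) = \sum_{x}\sum_{y} p(x,y)\,\mathrm{pmi}(x;y) = \sum_{x}\sum_{y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)}.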
The measure is symmetric (pmi(x;y) = pmi(y;x)). It is zero if x and y are independent, and equal to -log p(x) if x and y are perfectly associated. Finally, pmi(x;y) increases when p(x|y) is held fixed while p(x) decreases.
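These properties can be checked numerically (continuing the hypothetical Python sketch above):

    import math

    def pmi(p_xy, p_x, p_y):
        return math.log(p_xy / (p_x * p_y))

    # Independence: p(x,y) = p(x) p(y), so the ratio inside the log is 1 and the PMI is 0.
    print(pmi(0.5 * 0.2, 0.5, 0.2))  # 0.0

    # Perfect association: x and y always occur together, so p(x,y) = p(x) = p(y)
    # and pmi(x;y) = log(1 / p(x)) = -log p(x).
    p = 0.25
    print(pmi(p, p, p), -math.log(p))  # both ~1.386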
External links
- Demo at Rensselaer MSR Server (PMI values normalized to lie between 0 and 1); as of 29 May 2008 the site appeared to be down.