Information flow (information theory)

Information flow, in an information-theoretic context, is the transfer of information from a variable h to a variable l in a given process. The measure of information flow is defined as the uncertainty about h before the process starts minus the uncertainty about h that remains after the process terminates. This can be quantified as


H(h \mid l) - H(h \mid l') \ \stackrel{\mathrm{def}}{=}\ (H(h,l) - H(l)) - (H(h,l') - H(l'))


where H(h | l) is the conditional entropy (equivocation) of the initial value of h given the initial value of l, and H(h | l') is the conditional entropy of the initial value of h given l', the value of l after the process has finished. The right-hand side follows from the chain rule of entropy, H(X | Y) = H(X,Y) - H(Y).
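
For example, suppose h is a uniformly distributed secret bit, l is initially independent of h, and the process simply copies h into l (so l' = h). Before the process, H(h | l) = 1 bit; afterwards, l' determines h exactly, so H(h | l') = 0, and the process leaks one full bit of information about h.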

H(X,Y) is the joint entropy, and can be calculated as follows:


H(X,Y) = -\sum_{x,y} p_{x,y} \log(p_{x,y})
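
A minimal sketch of these calculations in Python, assuming discrete joint distributions represented as dictionaries mapping (h, l) value pairs to probabilities; the function names are illustrative, not taken from any standard library:

import math

def entropy(dist):
    # Shannon entropy -sum p log2(p) of a dict {value: probability}.
    # Applied to a dict keyed by pairs (x, y), this is the joint
    # entropy H(X,Y) from the formula above.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, index):
    # Marginal distribution of one component of a joint distribution
    # over pairs (index 0 is the first variable, index 1 the second).
    m = {}
    for pair, p in joint.items():
        m[pair[index]] = m.get(pair[index], 0.0) + p
    return m

def conditional_entropy(joint):
    # H(X | Y) = H(X,Y) - H(Y), the chain-rule identity used above,
    # where X is component 0 and Y is component 1 of each key.
    return entropy(joint) - entropy(marginal(joint, 1))

def information_flow(joint_before, joint_after):
    # H(h | l) - H(h | l'): the reduction in uncertainty about h.
    return conditional_entropy(joint_before) - conditional_entropy(joint_after)

# Example: h is a uniform secret bit, l is initially fixed at 0,
# and the process copies h into l (l' := h).
before = {(0, 0): 0.5, (1, 0): 0.5}
after = {(0, 0): 0.5, (1, 1): 0.5}
print(information_flow(before, after))  # prints 1.0

Running the example reproduces the one-bit flow of the copy process described earlier: the process removes all uncertainty about the secret bit.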