Constraint (information theory)

Constraint in information theory refers to the degree of statistical dependence among two or more variables.

See mutual information, total correlation, and interaction information. Garner (1962) provides a thorough discussion of various forms of constraint (internal constraint, external constraint, total constraint) with applications to pattern recognition and psychology.
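
The measures mentioned above give a concrete way to quantify this dependence. As a sketch (the entropy notation H below is assumed here rather than taken from this article), the mutual information of two variables is

    I(X;Y) = H(X) + H(Y) - H(X,Y)

and the total correlation of n variables is

    C(X_1, \ldots, X_n) = \sum_{i=1}^{n} H(X_i) - H(X_1, \ldots, X_n).

Both quantities equal zero exactly when the variables are statistically independent, so either can be read as a measure of the constraint among the variables.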

References

  • Garner, W. R. (1962). Uncertainty and Structure as Psychological Concepts. New York: John Wiley & Sons.