Fisher information metric
From Wikipedia, the free encyclopedia
In information geometry, the Fisher information metric is a particular Riemannian metric which can be defined on a smooth statistical manifold, i.e., a smooth manifold whose points are probability measures defined on a common probability space.
It can be used to calculate the informational difference between measurements. It takes the form:

g_{jk}(\theta) = \int \frac{\partial \ln p(x,\theta)}{\partial \theta_j}\, \frac{\partial \ln p(x,\theta)}{\partial \theta_k}\, p(x,\theta)\, dx
Substituting i = − ln(p) from information theory, the formula becomes:

g_{jk}(\theta) = \int \frac{\partial i(x,\theta)}{\partial \theta_j}\, \frac{\partial i(x,\theta)}{\partial \theta_k}\, p(x,\theta)\, dx
This can be thought of intuitively as: "the distance between two nearby points on a statistical manifold is the informational difference between them."
An equivalent form of the above equation, valid under standard regularity conditions, is:

g_{jk}(\theta) = \int \frac{\partial^2 i(x,\theta)}{\partial \theta_j\, \partial \theta_k}\, p(x,\theta)\, dx = \mathrm{E}\!\left[\frac{\partial^2 i}{\partial \theta_j\, \partial \theta_k}\right]
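As a concrete check (not part of the article), the two forms of the metric can be computed for a one-parameter Bernoulli family, where the closed form is g(θ) = 1/(θ(1 − θ)); the function names below are illustrative, and the second form is estimated with a finite difference:

```python
import math

def fisher_metric_bernoulli(theta):
    # Bernoulli(theta) family: p(x=1) = theta, p(x=0) = 1 - theta.
    # Score d ln p / d theta is 1/theta for x = 1 and -1/(1 - theta) for x = 0.
    # g(theta) = E[(d ln p / d theta)^2], summing over the two outcomes.
    return theta * (1.0 / theta) ** 2 + (1 - theta) * (-1.0 / (1 - theta)) ** 2

def fisher_metric_second_derivative(theta, h=1e-5):
    # Equivalent form: g(theta) = E[d^2 i / d theta^2] with i = -ln(p),
    # estimated here by a central finite difference in theta.
    def i(x, t):
        p = t if x == 1 else 1 - t
        return -math.log(p)
    def d2i(x, t):
        return (i(x, t + h) - 2 * i(x, t) + i(x, t - h)) / h ** 2
    return theta * d2i(1, theta) + (1 - theta) * d2i(0, theta)

theta = 0.3
print(fisher_metric_bernoulli(theta))          # closed form 1/(theta*(1-theta))
print(fisher_metric_second_derivative(theta))  # numerical, agrees closely
```

Both computations recover 1/(θ(1 − θ)), illustrating that the score-product and second-derivative forms of the metric coincide.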
See also
- Cramér–Rao bound
- Fisher information
- Bures metric
References
- Shun'ichi Amari, Differential-Geometrical Methods in Statistics, Lecture Notes in Statistics, Springer-Verlag, Berlin, 1985.
- Shun'ichi Amari and Hiroshi Nagaoka, Methods of Information Geometry, Translations of Mathematical Monographs, vol. 191, American Mathematical Society, 2000.