Fisher information metric

From Wikipedia, the free encyclopedia

In mathematics, in information geometry, the Fisher information metric is a metric tensor on a statistical differentiable manifold, i.e. a smooth manifold whose points are probability distributions. It can be used to calculate the informational difference between measurements. It takes the form:

g_{jk} = \int \frac{\partial \log p(x,\theta)}{\partial \theta_j} \, \frac{\partial \log p(x,\theta)}{\partial \theta_k} \, p(x,\theta) \, dx.
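As a concrete check, the integral can be evaluated numerically for a one-parameter family. The sketch below (Python; the function name and the choice of a Gaussian family with unknown mean θ and fixed σ are illustrative assumptions, not part of the article) compares the result with the known closed form g = 1/σ² for that family:

```python
import numpy as np

def fisher_metric_1d(log_p, theta, x, dtheta=1e-3):
    """Approximate g = ∫ (∂_θ log p)² p dx by a Riemann sum on the grid x."""
    # Central finite difference of log p in the parameter theta.
    dlogp = (log_p(x, theta + dtheta) - log_p(x, theta - dtheta)) / (2 * dtheta)
    p = np.exp(log_p(x, theta))
    dx = x[1] - x[0]
    return np.sum(dlogp**2 * p) * dx

# Illustrative family: Gaussian with unknown mean theta and fixed sigma.
sigma = 2.0
def log_p(x, theta):
    return -0.5 * ((x - theta) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

x = np.linspace(-20.0, 20.0, 20001)  # wide grid so the tails are negligible
g = fisher_metric_1d(log_p, theta=0.0, x=x)
# Closed form for this family: g = 1/sigma**2 = 0.25
```

For this family the score is (x − θ)/σ², so the integral reduces to the variance divided by σ⁴, recovering 1/σ².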

Substituting the information content i(x,θ) = −ln p(x,θ) from information theory, the formula becomes:

g_{jk} = \int \frac{\partial i(x,\theta)}{\partial \theta_j} \, \frac{\partial i(x,\theta)}{\partial \theta_k} \, p(x,\theta) \, dx.

This can be thought of intuitively as: "the distance between two nearby points on a statistical differentiable manifold is the amount of information between them, i.e. the informational difference between them."

An equivalent form of the above equation is:

g_{jk} = \int \frac{\partial^2 i(x,\theta)}{\partial \theta_j \, \partial \theta_k} \, p(x,\theta) \, dx = \mathrm{E} \left[ \frac{\partial^2 i(x,\theta)}{\partial \theta_j \, \partial \theta_k} \right] = -\mathrm{E} \left[ \frac{\partial^2 \log p(x,\theta)}{\partial \theta_j \, \partial \theta_k} \right].
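With i = −ln p, the standard identity is g = E[(∂θ i)²] = E[∂²θ i] = −E[∂²θ ln p]. The agreement of the two expectation forms can be checked numerically; the sketch below (Python, reusing the illustrative Gaussian family with unknown mean θ and fixed σ, an assumption not taken from the article) evaluates both and each should come out to 1/σ²:

```python
import numpy as np

sigma = 2.0
def log_p(x, theta):
    return -0.5 * ((x - theta) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

def i(x, theta):
    # Information content i = -ln p.
    return -log_p(x, theta)

x = np.linspace(-20.0, 20.0, 20001)
dx = x[1] - x[0]
theta, d = 0.0, 1e-3
p = np.exp(log_p(x, theta))

# First form: E[(∂_θ i)²], via a central finite difference in theta.
di = (i(x, theta + d) - i(x, theta - d)) / (2 * d)
g_first = np.sum(di**2 * p) * dx

# Equivalent form: E[∂²_θ i], via a second central difference in theta.
d2i = (i(x, theta + d) - 2 * i(x, theta) + i(x, theta - d)) / d**2
g_second = np.sum(d2i * p) * dx
# Both approximate g = 1/sigma**2 = 0.25
```

Here ∂²θ i = 1/σ² is constant in x, so the second form needs no averaging at all; for general families the expectation is essential.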

References

  • Shun'ichi Amari, Differential-Geometrical Methods in Statistics, Lecture Notes in Statistics, Springer-Verlag, Berlin, 1985.
  • Shun'ichi Amari and Hiroshi Nagaoka, Methods of Information Geometry, Translations of Mathematical Monographs, vol. 191, American Mathematical Society, 2000.