Observed information
From Wikipedia, the free encyclopedia
In statistics, the observed information is the negative of the second derivative (the Hessian matrix) of the log-likelihood function.
Definition
Suppose we observe random variables X_1, \ldots, X_n, independent and identically distributed with density f(X; \theta), where \theta is a (possibly unknown) vector. Then the log-likelihood for the data is

    \ell(\theta \mid X_1, \ldots, X_n) = \sum_{i=1}^n \log f(X_i \mid \theta).
We define the observed information matrix at \theta^* as

    \mathcal{J}(\theta^*) = - \left. \frac{\partial^2}{\partial \theta_i \, \partial \theta_j} \, \ell(\theta) \right|_{\theta = \theta^*},

that is, the negative Hessian of the log-likelihood evaluated at \theta^*.
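As a concrete sketch (not part of the article), consider n i.i.d. Bernoulli(p) observations with k successes, so the parameter is scalar and the observed information is a single number. The log-likelihood is \ell(p) = k \log p + (n-k) \log(1-p), and differentiating twice gives \mathcal{J}(p) = k/p^2 + (n-k)/(1-p)^2. The code below checks the analytic second derivative against a central finite difference; all names are illustrative.

```python
import numpy as np

# Illustrative example (assumption, not from the article): observed
# information for n i.i.d. Bernoulli(p) observations with k successes.

def log_likelihood(p, k, n):
    """Bernoulli log-likelihood: l(p) = k*log(p) + (n-k)*log(1-p)."""
    return k * np.log(p) + (n - k) * np.log(1 - p)

def observed_information(p, k, n):
    """Analytic J(p) = -l''(p) = k/p^2 + (n-k)/(1-p)^2."""
    return k / p**2 + (n - k) / (1 - p)**2

# Check the analytic value against a central second finite difference.
k, n, p = 7, 10, 0.6
h = 1e-5
numeric = -(log_likelihood(p + h, k, n)
            - 2 * log_likelihood(p, k, n)
            + log_likelihood(p - h, k, n)) / h**2
assert abs(numeric - observed_information(p, k, n)) < 1e-2
```

For this sample (k = 7, n = 10, p = 0.6) the observed information is 7/0.36 + 3/0.16, roughly 38.19; note it depends on the realized data k, unlike the Fisher information discussed next.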
Fisher information
If \mathcal{I}(\theta) is the Fisher information, then

    \mathcal{I}(\theta) = \mathrm{E} \left[ \mathcal{J}(\theta) \right].

That is, the Fisher information is the expected value of the observed information.
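Continuing the Bernoulli sketch (an assumption for illustration, not from the article): since \mathcal{J}(p) = k/p^2 + (n-k)/(1-p)^2 is linear in k and \mathrm{E}[k] = np, taking expectations gives \mathcal{I}(p) = n/p + n/(1-p) = n / (p(1-p)). The code below verifies this identity by Monte Carlo averaging of the observed information over simulated samples.

```python
import numpy as np

# Illustrative check (assumption, not from the article): the Fisher
# information equals the expectation of the observed information.
# For n i.i.d. Bernoulli(p) draws: I(p) = n / (p * (1 - p)).

rng = np.random.default_rng(0)
n, p = 50, 0.3

def observed_information(k):
    """J(p) for a sample with k successes out of n."""
    return k / p**2 + (n - k) / (1 - p)**2

fisher = n / (p * (1 - p))  # analytic Fisher information I(p)

# Monte Carlo estimate of E[J(p)] over many replicated samples.
ks = rng.binomial(n, p, size=200_000)
mc = observed_information(ks).mean()
assert abs(mc - fisher) / fisher < 0.01
```

Because \mathcal{J}(p) here is linear in k, the identity holds exactly in expectation; the Monte Carlo average differs from n/(p(1-p)) only by sampling noise.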