Nat (information)
From Wikipedia, the free encyclopedia
A nat (sometimes also nit or even nepit) is a logarithmic unit of information or entropy, based on natural logarithms and powers of e, rather than the powers of 2 and base 2 logarithms which define the bit. The nat is the natural unit for information entropy, corresponding to Boltzmann's constant for thermodynamic entropy.
When the Shannon entropy is written using a natural logarithm, the resulting value is implicitly measured in nats. One nat corresponds to about 1.443 bits (1/ln 2), or about 0.434 hartleys (1/ln 10).
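The conversions above amount to dividing by ln 2 or ln 10. A minimal sketch in Python (the function names are illustrative, not standard):

```python
import math

def nats_to_bits(h_nats):
    """Convert an entropy value from nats to bits by dividing by ln 2."""
    return h_nats / math.log(2)

def nats_to_hartleys(h_nats):
    """Convert an entropy value from nats to hartleys by dividing by ln 10."""
    return h_nats / math.log(10)

# One nat expressed in the other units:
print(nats_to_bits(1.0))      # about 1.443 bits
print(nats_to_hartleys(1.0))  # about 0.434 hartleys
```

The same pattern applies in reverse: multiplying a value in bits by ln 2 gives its value in nats.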
History
Alan Turing used the natural ban (Hodges 1983, Alan Turing: The Enigma). Boulton and Wallace (1970) used the term nit in conjunction with minimum message length; the minimum description length community subsequently changed it to nat to avoid confusion with the nit, a unit of luminance (Comley and Dowe, 2005, sec. 11.4.1, p. 271).
References
- J.W. Comley and D.L. Dowe, "Minimum Message Length, MDL and Generalised Bayesian Networks with Asymmetric Languages", Chapter 11 (pp. 265-294) in P. Grünwald, I. J. Myung and M. A. Pitt (eds.), Advances in Minimum Description Length: Theory and Applications, MIT Press (ISBN 0-262-07262-9), April 2005.
- Fazlollah M. Reza, An Introduction to Information Theory, New York: Dover, 1994. ISBN 048668210