Hartley function
From Wikipedia, the free encyclopedia
The Hartley function is a measure of uncertainty, introduced by Ralph Hartley in 1928. If we pick a sample from a finite set A uniformly at random, the information revealed once the outcome is known is given by the Hartley function
- H0(A) := log(|A|),
where |A| denotes the cardinality of A.
If the base of the logarithm is 2, then the uncertainty is measured in bits. If it is the natural logarithm, then the unit is nats. It is also known as the Hartley entropy.
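As a concrete illustration, here is a minimal sketch in Python (the function name `hartley` is ours, not standard library API):

```python
import math

def hartley(n, base=2):
    """Hartley information of a uniform choice among n outcomes."""
    return math.log(n, base)

# A fair 8-sided die: log2(8) = 3 bits of uncertainty.
bits = hartley(8)
# The same quantity in nats uses the natural logarithm: 3 * ln(2) nats.
nats = hartley(8, math.e)
print(bits, nats)
```

Changing the base only rescales the result by a constant factor, ln(2) in the case of bits versus nats.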
The Hartley function as a special case of Shannon's entropy
The Hartley function is a special case of Shannon's entropy. Each element in the sample space A is associated with probability p = 1/|A|. For an element ω in A, the Hartley information of the event {ω} is -log(p) = log(|A|), which is constant over ω in A. The average information over the whole sample space is thus also equal to log(|A|).
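This equivalence is easy to check numerically; a minimal sketch, assuming a hand-rolled `shannon_entropy` helper:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: -sum of p * log2(p) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 16
uniform = [1.0 / n] * n
# For the uniform distribution on n outcomes, Shannon entropy
# coincides with the Hartley value log2(n).
print(shannon_entropy(uniform), math.log2(n))
```

Any non-uniform distribution on the same n outcomes would give a strictly smaller Shannon entropy, which is why log2(|A|) is sometimes called the max-entropy of A.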
Characterization of the Hartley function
The Hartley function only depends on the number of elements in a set, and hence can be viewed as a function on natural numbers. Rényi showed that the Hartley function in base 2 is the only function mapping natural numbers to real numbers that satisfies
- H(mn) = H(m) + H(n) (additivity)
- H(m) ≤ H(m+1) (monotonicity)
- H(2) = 1 (normalization)
Condition 1 says that the uncertainty of the Cartesian product of two finite sets A and B is the sum of the uncertainties of A and B. Condition 2 says that a larger set has larger uncertainty.
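A quick numerical sanity check that log2 satisfies all three conditions (a sketch using Python's standard math module):

```python
import math

def H(n):
    """The base-2 Hartley function on natural numbers."""
    return math.log2(n)

# Additivity: H(mn) = H(m) + H(n), up to floating-point rounding.
assert math.isclose(H(6 * 35), H(6) + H(35))
# Monotonicity: H(m) <= H(m+1) on a range of sample values.
assert all(H(m) <= H(m + 1) for m in range(1, 1000))
# Normalization: H(2) = 1.
assert H(2) == 1.0
print("all three conditions hold")
```

The content of Rényi's result is the converse: these three conditions, taken together, force a function on the natural numbers to be log2.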
Derivation of the Hartley function
We want to show that the Hartley function, log2(n), is the only function mapping natural numbers to real numbers that satisfies
- H(mn) = H(m) + H(n) (additivity)
- H(m) ≤ H(m+1) (monotonicity)
- H(2) = 1 (normalization)
Let f be a function on positive integers that satisfies the above three properties. From the additivity property, it follows that for any positive integers n and k,
- f(n^k) = k·f(n).   (1)
Let a, b, and t be any positive integers with b ≥ 2. There is a unique integer s determined by
- b^s ≤ a^t < b^(s+1).
Therefore,
- s·log2(b) ≤ t·log2(a) < (s+1)·log2(b)
and
- s/t ≤ log2(a)/log2(b) < (s+1)/t.
On the other hand, by monotonicity,
- f(b^s) ≤ f(a^t) ≤ f(b^(s+1)).
Using Equation (1), we get
- s·f(b) ≤ t·f(a) ≤ (s+1)·f(b)
and
- s/t ≤ f(a)/f(b) ≤ (s+1)/t.
Hence,
- |f(a)/f(b) − log2(a)/log2(b)| ≤ 1/t.
Since t can be arbitrarily large, the difference on the left-hand side of the above inequality must be zero,
- f(a)/f(b) = log2(a)/log2(b).
So,
- f(a) = μlog2(a)
for some constant μ, which must be equal to 1 by the normalization property.
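The squeeze argument above can be checked numerically; in the sketch below (the helper name `bounds` and the sample values a = 3, b = 2 are ours), s is computed via floating-point logarithms, which agrees with the defining inequality for moderate t:

```python
import math

def bounds(a, b, t):
    """Return s/t and (s+1)/t, where s is the unique integer
    satisfying b**s <= a**t < b**(s+1)."""
    s = math.floor(t * math.log(a, b))
    return s / t, (s + 1) / t

# As t grows, the bracket [s/t, (s+1)/t] of width 1/t squeezes
# onto log2(a)/log2(b), forcing f(a)/f(b) to equal that ratio.
for t in (10, 100, 1000):
    lo, hi = bounds(3, 2, t)
    print(t, lo, hi, math.log2(3))
```

With b = 2 the normalization f(2) = 1 then pins down the constant, giving f(a) = log2(a).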
See also
This article incorporates material from Hartley function on PlanetMath, which is licensed under the GFDL.
This article incorporates material from Derivation of Hartley function on PlanetMath, which is licensed under the GFDL.