Benford's law
From Wikipedia, the free encyclopedia
- For the tongue-in-cheek "law" about controversy, see Benford's law of controversy.
Benford's law, also called the first-digit law, states that in lists of numbers from many real-life sources of data, the leading digit is 1 almost one third of the time, and larger digits occur as the leading digit with decreasing frequency, to the point that 9 is the first digit less than one time in twenty.
This counter-intuitive result applies to a wide variety of figures, including electricity bills, street addresses, stock prices, population numbers, death rates, lengths of rivers, physical and mathematical constants, and processes described by power laws (which are very common in nature).
It is named after physicist Frank Benford, who stated it in 1938, although it had previously been stated by Simon Newcomb in 1881. The first rigorous formulation and proof appears to be due to Theodore P. Hill in 1995.
Mathematical statement
More precisely, Benford's law states that the leading digit d (d ∈ {1, ..., b − 1}) in base b (b ≥ 2) occurs with probability log_b(d + 1) − log_b(d) = log_b(1 + 1/d). This quantity is exactly the width of the interval from d to d + 1 on a logarithmic scale.
In base 10, the leading digits have the following distribution by Benford's law:
Leading digit | Probability |
---|---|
1 | 30.1% |
2 | 17.6% |
3 | 12.5% |
4 | 9.7% |
5 | 7.9% |
6 | 6.7% |
7 | 5.8% |
8 | 5.1% |
9 | 4.6% |
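As an illustrative sketch (not part of the original article), the table can be reproduced in a few lines of Python directly from the formula above; the function name benford_probabilities is arbitrary.

```python
# Reproduce the base-10 Benford probabilities
# P(d) = log10(d + 1) - log10(d) = log10(1 + 1/d) for d = 1..9.
import math

def benford_probabilities(base=10):
    """Return the Benford probability of each leading digit 1..base-1."""
    return {d: math.log(1 + 1 / d, base) for d in range(1, base)}

if __name__ == "__main__":
    for digit, p in benford_probabilities().items():
        print(f"{digit}: {p:.1%}")   # 1: 30.1%, 2: 17.6%, ..., 9: 4.6%
```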
One can also formulate a law for the first two digits: the probability that the leading two-digit block equals n (n = 10, ..., 99) is log_10(n + 1) − log_10(n), and similarly for three-digit and longer blocks without leading zeros. (In fact, Benford's law for the first p digits in base b follows immediately from the law for a single leading digit in base b^p.)
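The generalised law can be illustrated with a small sketch (the variable names are arbitrary): tabulating log_10(1 + 1/n) for the two-digit blocks and summing the blocks 10 through 19 telescopes back to log_10(2), the single-digit probability for a leading 1.

```python
# Leading two-digit blocks: P(block = n) = log10(1 + 1/n) for n = 10..99.
# Summing n = 10..19 telescopes to log10(20/10) = log10(2).
import math

two_digit = {n: math.log10(1 + 1 / n) for n in range(10, 100)}
print(sum(two_digit[n] for n in range(10, 20)))  # ~0.3010
print(math.log10(2))                             #  0.3010...
```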
Explanation
The law can be explained by the observation that if leading digits do follow a fixed distribution, that distribution must be independent of the measuring system used. Specifically, this means that converting the data from, say, feet to yards (multiplication by a constant) must leave the distribution unchanged: the distribution is scale invariant, and the only scale-invariant distribution of leading digits is the logarithmic one.
For example, the first (non-zero) digit of the lengths or distances of objects should have the same distribution whether the unit of measurement is feet, yards, or anything else. But there are three feet in a yard, so the probability that the first digit of a length in yards is 1 must be the same as the probability that the first digit of a length in feet is 3, 4, or 5. Applying this argument to all possible measurement scales gives a logarithmic distribution, which, combined with the fact that log_10(1) = 0 and log_10(10) = 1, gives Benford's law. That is, if there is a universal distribution of first digits, it must apply regardless of the measuring units used, and the only such distribution is the one given by Benford's law.
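The scale-invariance argument can be checked numerically. The sketch below is illustrative only (the sample size, seed, and helper first_digit are arbitrary choices): it draws numbers whose significands already follow the logarithmic distribution, converts them from yards to feet by multiplying by three, and shows that the leading-digit frequencies are essentially unchanged.

```python
# Multiplying Benford-distributed data by a constant (a change of units)
# leaves the leading-digit frequencies unchanged.
import collections
import random

def first_digit(x):
    """Return the first significant digit of a positive number."""
    while x < 1:
        x *= 10
    while x >= 10:
        x /= 10
    return int(x)

random.seed(0)
# 10**u with u uniform over a whole number of decades gives significands
# that follow the logarithmic (Benford) distribution.
yards = [10 ** random.uniform(0, 5) for _ in range(100_000)]
feet = [3 * y for y in yards]   # three feet to the yard: a pure change of units

for label, data in (("yards", yards), ("feet", feet)):
    counts = collections.Counter(first_digit(x) for x in data)
    print(label, {d: round(counts[d] / len(data), 3) for d in range(1, 10)})
```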
Somewhat more precisely, suppose X is a random variable whose probability of being equal to a positive integer x is proportional to x^(−s), where s > 1. The constant of proportionality must then be 1/ζ(s), where ζ is the Riemann zeta function (see zeta distribution). The probability that the first digit of X is n approaches log_10(n + 1) − log_10(n) as s approaches 1.
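This limit can be checked numerically. The sketch below is illustrative only; the helper zeta_first_digit_probs and the truncation at 120 decades are choices made here, not part of any standard routine. It groups the positive integers by leading digit and sums k^(−s) over each group with the Hurwitz zeta function.

```python
# Leading-digit probabilities of the zeta distribution for several s,
# compared with the Benford values.  The integers with first digit d are
# the union of the ranges [d*10**j, (d+1)*10**j) over j = 0, 1, 2, ...
import numpy as np
from scipy.special import zeta  # zeta(s, q) = sum over k >= 0 of (k + q)**(-s)

def zeta_first_digit_probs(s, decades=120):
    mass = np.zeros(9)
    for d in range(1, 10):
        for j in range(decades):
            lo, hi = d * 10.0 ** j, (d + 1) * 10.0 ** j
            mass[d - 1] += zeta(s, lo) - zeta(s, hi)  # sum of k**(-s), lo <= k < hi
    return mass / mass.sum()  # total mass equals zeta(s) up to truncation

benford = np.log10(1 + 1 / np.arange(1, 10))
for s in (2.0, 1.5, 1.2, 1.05):
    print(f"s = {s}:  P(first digit 1) = {zeta_first_digit_probs(s)[0]:.3f}")
print(f"Benford:  P(first digit 1) = {benford[0]:.3f}")
```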
The precise form of Benford's law can be explained if one assumes that the logarithms of the numbers are uniformly distributed; this means that a number is for instance just as likely to be between 100 and 1000 (logarithm between 2 and 3) as it is between 10,000 and 100,000 (logarithm between 4 and 5). For many sets of numbers, especially ones that grow exponentially such as incomes and stock prices, this is a reasonable assumption.
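A simple sequence with this property is the powers of 2, a stand-in for any quantity that grows by a fixed percentage: their base-10 logarithms are spread evenly modulo 1, so their leading digits land close to the Benford frequencies. The short sketch below (illustrative only) tallies them.

```python
# Leading digits of the first 5000 powers of 2 versus the Benford frequencies.
import collections
import math

N = 5000
counts = collections.Counter(int(str(2 ** n)[0]) for n in range(1, N + 1))
for d in range(1, 10):
    print(d, round(counts[d] / N, 3), round(math.log10(1 + 1 / d), 3))
```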
Note that for numbers drawn from many distributions, for example IQ scores, human heights or other variables following normal distributions, the law is not valid. However, if one "mixes" numbers from those distributions, for example by taking numbers from newspaper articles, Benford's law reappears. This can be proven mathematically: if one repeatedly "randomly" chooses a probability distribution and then randomly chooses a number according to that distribution, the resulting list of numbers will obey Benford's law (Hill, 1998).
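The mixing statement can be illustrated by simulation. The sketch below is only a rough illustration, not Hill's construction: for each draw it picks a distribution family and a scale at random, draws one number, and then tallies the leading digits of the whole list.

```python
# Mixing draws from randomly chosen distributions produces Benford-like
# leading digits even though no single distribution used is Benford.
import collections
import math
import random

def first_digit(x):
    """Return the first significant digit of a positive number."""
    while x < 1:
        x *= 10
    while x >= 10:
        x /= 10
    return int(x)

random.seed(1)
samples = []
for _ in range(100_000):
    scale = 10 ** random.uniform(-3, 3)            # randomly chosen scale
    kind = random.choice(("uniform", "exponential", "lognormal"))
    if kind == "uniform":
        x = random.uniform(0, scale)
    elif kind == "exponential":
        x = random.expovariate(1 / scale)
    else:
        x = random.lognormvariate(math.log(scale), 1.0)
    if x > 0:
        samples.append(x)

counts = collections.Counter(first_digit(x) for x in samples)
for d in range(1, 10):
    print(d, round(counts[d] / len(samples), 3), round(math.log10(1 + 1 / d), 3))
```

The random scale factor is what spreads the logarithms of the samples evenly, which is why the tallied frequencies come out close to Benford's values.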
Applications and limitations
In 1972, Hal Varian suggested that the law could be used to detect possible fraud in lists of socio-economic data submitted in support of public planning decisions. Based on the plausible assumption that people who make up figures tend to distribute their digits fairly uniformly, a simple comparison of the first-digit frequency distribution of the data with the distribution expected under Benford's law ought to show up any anomalous results.
In the same vein, Benford's law can be (and is) used to analyse insurance, accounting or expenses data to identify possible fraud, as well as to detect pricing strategies (el Sehity, Hoelzl & Kirchler, 2005).
Other uses, for example to analyse the results of clinical trials and election results, have also been proposed.
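A minimal sketch of such a first-digit screen is given below. It is illustrative only: the function benford_screen and the synthetic data are hypothetical, and a chi-square goodness-of-fit test is just one possible way of making the comparison.

```python
# Compare observed leading-digit counts against the Benford expectation
# with a chi-square goodness-of-fit test.  The data here are synthetic.
import math
import random
from collections import Counter
from scipy.stats import chisquare

def benford_screen(amounts):
    """Chi-square test of the leading digits of positive amounts against Benford's law."""
    digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a]
    counts = Counter(digits)
    observed = [counts.get(d, 0) for d in range(1, 10)]
    expected = [len(digits) * math.log10(1 + 1 / d) for d in range(1, 10)]
    return chisquare(observed, f_exp=expected)

# Figures with fairly uniform digits (the pattern assumed above for made-up
# numbers) give a very small p-value, i.e. a poor fit to Benford's law.
random.seed(0)
made_up = [random.uniform(100, 999) for _ in range(5000)]
print(benford_screen(made_up))
```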
Limitations
Care must be taken with these applications, however. A set of real-life data may not obey the law, depending on the extent to which the distribution of numbers it contains is skewed by the category of data.
For instance, one might expect a list of numbers representing 'populations of UK villages beginning with A' or 'small insurance claims' to obey Benford's law. But if it turns out that the definition of a 'village' is 'settlement with population between 300 and 999', or that the definition of a 'small insurance claim' is 'claim between $50 and $100', then Benford's law will manifestly fail, because certain leading digits have been excluded by the definition.
Benford's law does not apply when the numbers under consideration span less than an order of magnitude. For example, the first digits of the heights of adult humans measured in feet do not follow Benford's law, since the digit 1 is significantly underrepresented, while measured in meters the digit 1 is significantly overrepresented.
History
The discovery of this phenomenon goes back to 1881, when the American astronomer Simon Newcomb noticed that in logarithm books (used at that time to perform calculations) the first pages, the ones containing numbers that started with 1, were much more worn than the other pages. It has been argued that any book used from the beginning would show more wear and tear on its earlier pages; it has also been argued that Newcomb was referring to dirt on the pages themselves (rather than on their edges), left where people ran their fingers down the lists of digits to find the number closest to the one they required.
However, logarithm books did contain more than one list, with both logarithms and antilogarithms present, and sometimes many other tables as well, including exponentials, roots, sines, cosines, tangents, secants and cosecants. Thus, this story may be apocryphal. Nevertheless, Newcomb's published result is the first known instance of this observation, and it includes a distribution for the second digit as well. Newcomb proposed a law stating that the probability that a number has first digit N is equal to log(N + 1) − log(N).
The phenomenon was rediscovered in 1938 by the physicist Frank Benford, who checked it on a wide variety of data sets and was credited for it. In 1996, Ted Hill proved the result about mixed distributions mentioned above.
Popular culture
Benford's law was used as a plot device on CBS's TV series NUMB3RS in the episode "The Running Man".
References
- Frank Benford: "The law of anomalous numbers", Proceedings of the American Philosophical Society, Vol. 78, No. 4 (March 1938), pp. 551–572.
- Theodore P. Hill: "The first digit phenomenon", American Scientist, Vol. 86 (July–August 1998), p. 358.
- Simon Newcomb: "Note on the frequency of use of the different digits in natural numbers", American Journal of Mathematics, Vol. 4, No. 1/4 (1881), pp. 39–40.
- Hal Varian: "Benford's law", American Statistician, Vol. 26 (1972), p. 65.
- el Sehity, T.; Hoelzl, E.; Kirchler, E. (2005): "Price developments after a nominal shock: Benford's Law and psychological pricing after the euro introduction", International Journal of Research in Marketing, Vol. 22, No. 4, pp. 471–480.
External links
- Free Java Tool To Analyze Data Using Benford's Law by John Morrow
- Benford's Law and Zipf's Law at cut-the-knot
- Following Benford's Law, or Looking Out for No. 1
- A further five numbers: number 1 and Benford's law by Simon Singh
- Software that generates Benford subsets, by Oktay Haracci
- A small Flash application demonstrating Benford's Law, created by William Fawcett
- Looking out for number one by Jon Walthoe, Robert Hunt and Mike Pearson, Plus Magazine, September 1999
- Eric W. Weisstein, Benford's Law at MathWorld.
- "Benford's Law" by Paul Niquette