Hamming distance

In information theory, the Hamming distance between two strings of equal length is the number of positions at which the corresponding symbols differ. Put another way, it measures the minimum number of substitutions required to change one string into the other, or the minimum number of errors that could have transformed one string into the other.

For example:

  • The Hamming distance between 1011101 and 1001001 is 2.
  • The Hamming distance between 2143896 and 2233796 is 3.
  • The Hamming distance between "toned" and "roses" is 3.

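The definition translates directly into a short program. The following is a minimal Python sketch (the function name hamming_distance is chosen here for illustration and is not taken from any particular library); it reproduces the three examples above:

  def hamming_distance(s, t):
      """Count the positions at which corresponding symbols of s and t differ."""
      if len(s) != len(t):
          raise ValueError("Hamming distance is defined only for equal-length sequences")
      return sum(a != b for a, b in zip(s, t))

  # The examples above:
  assert hamming_distance("1011101", "1001001") == 2
  assert hamming_distance("2143896", "2233796") == 3
  assert hamming_distance("toned", "roses") == 3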

Special properties

For a fixed length n, the Hamming distance is a metric on the vector space of the words of that length, as it obviously fulfills the conditions of non-negativity, identity of indiscernibles and symmetry, and it can be shown easily by complete induction that it satisfies the triangle inequality as well. The Hamming distance between two words a and b can also be seen as the Hamming weight of a − b for an appropriate choice of the − operator.
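
Written out, with d for the Hamming distance, wt for the Hamming weight, and ⊕ for bitwise exclusive or (notation introduced here for convenience, not taken from the text above), these conditions and the weight identity read:

  \begin{align*}
    d(a,b) &\ge 0, \qquad d(a,b) = 0 \iff a = b, \qquad d(a,b) = d(b,a), \\
    d(a,c) &\le d(a,b) + d(b,c), \\
    d(a,b) &= \operatorname{wt}(a - b) = \operatorname{wt}(a \oplus b) \quad \text{(the last equality for binary words)}.
  \end{align*}

The triangle inequality can also be seen directly: every position at which a and c differ must differ in at least one of the pairs (a, b) or (b, c), so each such position is counted at least once on the right-hand side.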

For binary strings a and b the Hamming distance is equal to the number of ones in a xor b. The metric space of length-n binary strings, with the Hamming distance, is known as the Hamming cube; it is equivalent as a metric space to the set of vertices of a hypercube graph equipped with the graph (shortest-path) distance. One can also view a binary string of length n as a vector in R^n, by treating each symbol in the string as a real coordinate; with this embedding, the strings form the vertices of an n-dimensional measure polytope (hypercube), and the Hamming distance of the strings is equal to the Manhattan distance between the vertices.
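
For binary words stored as machine integers, this xor-and-count description becomes a one-line computation. A minimal Python sketch, assuming Python 3.10 or later for int.bit_count (older versions can use bin(x ^ y).count("1") instead):

  def hamming_distance_bits(x: int, y: int) -> int:
      """Hamming distance between two non-negative integers viewed as binary words."""
      # x ^ y has a one exactly at the bit positions where x and y differ;
      # counting those ones (the Hamming weight of the xor) gives the distance.
      return (x ^ y).bit_count()

  # The first example above: 1011101 and 1001001 differ in two bit positions.
  assert hamming_distance_bits(0b1011101, 0b1001001) == 2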

History and applications

The Hamming distance is named after Richard Hamming, who introduced it in his fundamental paper about error-detecting and error-correcting codes. It is used in telecommunication to count the number of flipped bits in a fixed-length binary word as an estimate of error, and therefore is sometimes called the signal distance. Hamming weight analysis of bits is used in several disciplines including information theory, coding theory, and cryptography. However, for comparing strings of different lengths, or strings where not just substitutions but also insertions or deletions have to be expected, a more sophisticated metric like the Levenshtein distance is more appropriate.

References

Adapted in part from Federal Standard 1037C.

Richard W. Hamming. Error-detecting and error-correcting codes, Bell System Technical Journal 29(2):147–160, 1950.

See also