File verification

File verification is the process of using an algorithm to verify the integrity or authenticity of a computer file. This can be done by comparing two files bit-by-bit, but that requires two copies of the same file and may miss systematic corruptions that occur to both files. A more popular approach is to also store checksums (hashes, or message digests) of files for later comparison.
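As a minimal sketch of the bit-by-bit approach, the following Python snippet compares two files directly using the standard-library filecmp module; the file names are placeholders, not part of any particular workflow:

    import filecmp

    # Compare file contents byte-for-byte (shallow=False forces a content
    # comparison instead of relying on size/mtime metadata).
    identical = filecmp.cmp("original.iso", "downloaded.iso", shallow=False)
    print("files match" if identical else "files differ")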

Integrity verification

File integrity can be compromised, which is usually referred to as the file becoming corrupted. A file can become corrupted in a variety of ways: faulty storage media, errors in transmission, write errors during copying or moving, software bugs, and so on.

Hash-based verification ensures that a file has not been corrupted by comparing the file's hash value to a previously calculated value. If these values match, the file is presumed to be unmodified. Due to the nature of hash functions, hash collisions may result in false positives, but the likelihood of collisions is often negligible with random corruption.
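A minimal sketch of hash-based verification in Python, assuming the expected digest was calculated and recorded earlier (the file name and the expected value here are placeholders):

    import hashlib

    EXPECTED_SHA256 = "0" * 64  # placeholder for the previously calculated digest

    def sha256_of(path, chunk_size=1 << 20):
        """Hash the file in chunks so large files need not fit in memory."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    if sha256_of("backup.tar") == EXPECTED_SHA256:
        print("file presumed unmodified")
    else:
        print("file is corrupted or was changed")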

Authenticity verification

It is often desirable to verify that a file has not been modified in transmission or storage by untrusted parties, for example, to insert malicious code such as viruses or backdoors. To verify authenticity, a classical hash function is not enough, as such functions are not designed to be collision resistant; it is computationally trivial for an attacker to cause deliberate hash collisions, meaning that a malicious change in the file is not detected by a hash comparison. In cryptography, this is known as a collision attack.

For this purpose, cryptographic hash functions are often employed. As long as the hash sums cannot be tampered with (for example, if they are communicated over a secure channel), the files can be presumed to be intact. Alternatively, digital signatures can be employed to assure tamper resistance.
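As a sketch of the digital-signature approach, the example below uses the third-party Python package cryptography with Ed25519 keys; the key handling and file name are illustrative assumptions (in practice the publisher signs the file once and distributes only the public key and signature):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Both keys are generated here only for brevity.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    data = open("release.tar.gz", "rb").read()
    signature = private_key.sign(data)

    try:
        public_key.verify(signature, data)  # raises InvalidSignature on mismatch
        print("signature valid: file is authentic")
    except InvalidSignature:
        print("signature check failed: file was tampered with")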

File formats

A checksum file is a small file that contains the checksums of other files.

There are a few well-known checksum file formats.[1]

Several utilities, such as md5deep, can use such checksum files to automatically verify an entire directory of files in one operation.

The particular hash algorithm used is often indicated by the file extension of the checksum file.

The ".sha1" file extension indicates a checksum file containing 160-bit-bit SHA-1 hashes in sha1sum format.

The ".md5" file extension, or a file named "MD5SUMS", indicates a checksum file containing 128-bit MD5 hashes in md5sum format.

The ".sfv" file extension indicates a checksum file containing 32-bit CRC32 checksums in simple file verification format.

The "crc.list" file indicates a checksum file containing 32-bit CRC checksums in brik format.

As of 2012, best practice recommendations are to use SHA-2 or SHA-3 to generate new file integrity digests, and to accept MD5 and SHA-1 digests for backward compatibility if stronger digests are not available. The theoretically weaker SHA-1, the weaker MD5, and the much weaker CRC were previously commonly used for file integrity checks.[2][3][4][5][6][7][8][9][10]

CRC checksums cannot be used to verify the authenticity of files, as CRC32 is not a collision-resistant hash function; even if the hash sum file is not tampered with, it is computationally trivial for an attacker to replace a file with another that has the same CRC digest as the original, meaning that a malicious change in the file is not detected by a CRC comparison.
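The sketch below demonstrates the linearity of CRC32 over XOR for equal-length messages, which is the property that lets an attacker compute exactly which bytes to change to preserve or hit a target checksum; it uses Python's standard zlib module and illustrative message contents:

    import zlib

    a  = b"original file content, version A"
    b_ = b"maliciously altered contents  !!"
    assert len(a) == len(b_)
    zeros = bytes(len(a))  # all-zero message of the same length

    # For equal-length inputs: crc32(a XOR b) == crc32(a) ^ crc32(b) ^ crc32(zeros),
    # so the effect of any bit flip on the checksum is fully predictable.
    xored = bytes(x ^ y for x, y in zip(a, b_))
    assert zlib.crc32(xored) == zlib.crc32(a) ^ zlib.crc32(b_) ^ zlib.crc32(zeros)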

Products

References

  1. "Checksum".
  2. NIST. "NIST's policy on hash functions". 2012.
  3. File Transfer Consulting. "Integrity".
  4. "Intrusion Detection FAQ: What is the role of a file integrity checker like Tripwire in intrusion detection?".
  5. Hacker Factor. "Tutorial: File Digest".
  6. Steve Mead. "Unique File Identification in the National Software Reference Library". p. 4.
  7. Del Armstrong. "An Introduction To File Integrity Checking On Unix Systems". 2003.
  8. "Cisco IOS Image Verification"
  9. Elizabeth D. Zwicky, Simon Cooper, D. Brent Chapman. "Building Internet Firewalls". p. 296.
  10. Simson Garfinkel, Gene Spafford, Alan Schwartz. "Practical UNIX and Internet Security". p. 630.

See also