Private biometrics

Private biometrics, also called biometric encryption or BioCryptics, is a form of biometrics in which the prover is protected against the misuse of template data by a dishonest verifier.

Biometric identification requires that a verifier searches for matches in a database that contains data about the entire population. This introduces the security and privacy threat that a verifier who steals biometric templates of some (or even all) persons in the database can perform impersonation attacks. When a private verification system is used on a large scale, the reference database has to be made available to many different verifiers, who, in general, cannot be trusted. Information stolen from a database can be misused to construct artificial biometrics that impersonate people. Constructing artificial biometrics is possible even if only part of the template is available.

To develop insight into the security aspects of biometrics, one can distinguish between verification and private verification. In a typical verification setting, access to the reference template allows a malicious verifier to artificially construct measurement data that will pass the verification test, even if the prover has never exposed herself to a biometric measurement after enrollment.

In private verification, the reference data should not leak information that allows the verifier to (effectively) construct valid measurement data. Such protection is common practice in the storage of computer passwords. When a computer verifies a password, it does not compare the password typed by the user with a stored reference copy. Instead, the typed password y is processed by a cryptographic one-way function F, and the outcome is compared against a locally stored reference string F(y). Thus y is only temporarily present on the system hardware, and no stored data allows the calculation of y. This prevents insider attacks that steal unencrypted or decryptable secrets.
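As a minimal illustrative sketch (not part of the original text), the following Python fragment mimics this password scheme, with SHA-256 standing in for the one-way function F; a real system would use a salted, deliberately slow password-hashing function such as bcrypt or scrypt.

```python
import hashlib

def enroll(password: str) -> str:
    # Store only F(y): a one-way hash of the password, never the password itself.
    return hashlib.sha256(password.encode()).hexdigest()

def verify(candidate: str, stored_reference: str) -> bool:
    # Recompute F on the typed password and compare with the stored F(y).
    return hashlib.sha256(candidate.encode()).hexdigest() == stored_reference

reference = enroll("correct horse")        # kept by the verifier
print(verify("correct horse", reference))  # True: hashes match
print(verify("wrong guess", reference))    # False
```

Because F is one-way, a thief who copies the stored reference learns F(y) but cannot invert it to recover y.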

Comparison with handling computer passwords

The main difference between password checking and biometric private verification is that noise or other aberrations are unavoidable during biometric measurements. Noisy measurement data are quantized into discrete values before they can be processed by any cryptographic function. Due to external noise, the outcome of the quantization may differ from experiment to experiment; in particular, if one of the biometric parameters has a value close to a quantization threshold, a minor amount of noise can change the outcome. Minor changes at the input of a cryptographic function are amplified, and the output bears no resemblance to the expected outcome. This property, commonly referred to as ‘confusion’ and ‘diffusion’, makes it non-trivial to use biometric data as input to a cryptographic function. The notion of near matches, or of distance between enrollment and operational measurements, vanishes after encryption or any other cryptographically strong operation. Hence, measured data cannot be compared with reference data in the encrypted domain without prior precautions to contain the effect of noise.
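The following sketch (again illustrative, with made-up measurement values and a simple one-bit quantizer) shows why a naive combination fails: a parameter that sits near the quantization threshold flips under a small amount of noise, and the one-way function then produces an entirely unrelated output.

```python
import hashlib

def quantize(value: float, threshold: float = 0.5) -> int:
    # Binarize one biometric parameter against a fixed threshold.
    return 1 if value >= threshold else 0

def protect(bits) -> str:
    # One-way function applied to the quantized template.
    return hashlib.sha256(bytes(bits)).hexdigest()

enrollment  = [0.71, 0.49, 0.12, 0.93]  # second parameter sits near the threshold
operational = [0.69, 0.51, 0.10, 0.95]  # small noise pushes it across

h_enroll = protect([quantize(v) for v in enrollment])
h_oper   = protect([quantize(v) for v in operational])
print(h_enroll == h_oper)  # False: one flipped bit yields an unrelated hash
```

Precautions of the kind mentioned above typically error-correct the measurement before the one-way function is applied, for example with helper-data or fuzzy-extractor schemes, so that enrollment and operational readings map to the same discrete value.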

Meanwhile, it is important to realize that protecting the reference data stored in a database is not a complete solution to the above-mentioned threats. A dishonest verifier who has had an opportunity to measure operational biometric data can misuse these measurements later. This can happen without anyone noticing it: the verifier Victor grabs the fingerprint image left behind on a sensor. In the password analogy, this corresponds to capturing all keystrokes, including the plaintext passwords typed by a user.
