Physical unclonable function
Physical unclonable function (PUF, sometimes also called physically unclonable function) is a physical entity that is embodied in a physical structure and is easy to evaluate but hard to predict. Further, an individual PUF device must be easy to make but practically impossible to duplicate, even given the exact manufacturing process that produced it. In this respect it is the hardware analog of a one-way function. The name "physical unclonable function" can be misleading, as some PUFs are clonable, and most PUFs are noisy and therefore do not strictly satisfy the mathematical definition of a function.
History
Early references that exploit the physical properties of disordered systems for authentication purposes date back to Bauder in 1983[1] and Simmons in 1984.[2][3] Naccache and Frémanteau provided an authentication scheme in 1992 for memory cards.[4] The terms POWF (physical one-way function) and PUF (physical unclonable function) were coined in 2001[5] and 2002,[6] the latter publication describing the first integrated PUF, in which, unlike optics-based PUFs, the measurement circuitry and the PUF are integrated on the same electrical circuit and fabricated in silicon.
Concept
Rather than embodying a single cryptographic key, PUFs implement challenge–response authentication. When a physical stimulus is applied to the structure, it reacts in an unpredictable (but repeatable) way due to the complex interaction of the stimulus with the physical microstructure of the device. This exact microstructure depends on physical factors introduced during manufacture which are unpredictable (like a fair coin). The applied stimulus is called the challenge, and the reaction of the PUF is called the response. A specific challenge and its corresponding response together form a challenge–response pair (CRP). The device's identity is established by the properties of the microstructure itself. As this structure is not directly revealed by the challenge–response mechanism, such a device is resistant to spoofing attacks.
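The flow of CRP-based authentication can be illustrated with a minimal software sketch. The device model, its noise parameters, and the acceptance threshold below are assumptions chosen purely for illustration; a real verifier would measure an actual physical PUF during enrollment.

```python
# Toy model of CRP-based authentication. The "device" below is a software
# stand-in for a real PUF: its challenge-to-response mapping and its noise
# model are illustrative assumptions, not a description of real hardware.
import secrets
import random

RESPONSE_BITS = 64
NOISE_BITS = 3      # assumed number of response bits that flip per readout
THRESHOLD = 10      # maximum Hamming distance still accepted as a match

def make_device(device_seed: int):
    """Return a function modelling one PUF instance."""
    def respond(challenge: int) -> int:
        # Device-specific but repeatable "microstructure" response.
        ideal = random.Random(f"{device_seed}:{challenge}").getrandbits(RESPONSE_BITS)
        # Measurement noise: flip a few bits differently on every readout.
        noise = random.Random()
        for _ in range(NOISE_BITS):
            ideal ^= 1 << noise.randrange(RESPONSE_BITS)
        return ideal
    return respond

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Enrollment: the verifier measures the genuine device and stores CRPs.
device = make_device(device_seed=0xC0FFEE)
crp_table = {secrets.randbits(64): None for _ in range(100)}
for challenge in crp_table:
    crp_table[challenge] = device(challenge)

# Authentication: spend one unused challenge; a CRP is never reused.
challenge, expected = crp_table.popitem()
print("authenticated:", hamming(device(challenge), expected) <= THRESHOLD)
```

Comparing responses within a Hamming-distance tolerance, rather than for exact equality, reflects the fact that PUF responses are noisy between readouts.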
PUFs can be implemented with a very small hardware investment. Unlike a ROM containing a table of responses to all possible challenges, which would require hardware exponential in the number of challenge bits, a PUF can be constructed in hardware proportional to the number of challenge and response bits.
Unclonability means that each PUF device has a unique and unpredictable way of mapping challenges to responses, even if it was manufactured with the same process as a similar device; because exact control over the manufacturing process is infeasible, it is infeasible to construct a PUF with the same challenge–response behavior as another given PUF. Mathematical unclonability means that it should be very hard to compute an unknown response given the other CRPs or some of the properties of the random components of a PUF, because a response is created by a complex interaction of the challenge with many or all of the random components. In other words, given the design of the PUF system, the CRPs are highly unpredictable without knowledge of all of the physical properties of the random components. The combination of physical and mathematical unclonability renders a PUF truly unclonable.
Different sources of physical randomness can be used in PUFs. A distinction is made between PUFs in which physical randomness is explicitly introduced and PUFs that use randomness that is intrinsically present in a physical system.
Types of PUFs
All PUFs are subject to environmental variations such as temperature, supply voltage and electromagnetic interference, which can affect their performance. Therefore, rather than just being random, the real strength of a PUF lies in differing between devices while remaining consistent on the same device under varying environmental conditions.
PUFs using explicitly-introduced randomness
This type of PUF can distinguish devices from one another much more effectively and shows minimal sensitivity to environmental variations compared with PUFs that rely on intrinsic randomness, because the underlying physical principles differ and the relevant parameters can be directly controlled and optimized.
- Optical PUF
- Coating PUF
PUFs using intrinsic randomness
Unlike PUFs that utilize explicitly introduced randomness, PUFs using intrinsic randomness are highly attractive because they can be included in a design without modifications to the manufacturing process; a toy model of one such construction, the SRAM PUF, is sketched after the list below.
- Delay PUF
- SRAM PUF
- Butterfly PUF
- Bistable ring PUF
- Magnetic PUF
- Metal-based PUF
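As a rough illustration of how intrinsic randomness yields a device fingerprint, the following toy model mimics an SRAM PUF: each cell's power-up value is fixed per device (standing in for transistor mismatch), while a small fraction of cells are unstable between readouts. The seeded software RNG and all parameters are assumptions made for illustration only.

```python
# Toy numerical model of an SRAM PUF fingerprint. In a real device the
# power-up value of each cell is fixed by transistor mismatch; here a seeded
# RNG stands in for that mismatch, which is purely an illustrative assumption.
import random

N_CELLS = 256              # SRAM cells used for the fingerprint
UNSTABLE_FRACTION = 0.05   # assumed fraction of cells that flip per power-up

def power_up(device_seed: int) -> list:
    """Return one power-up readout: mostly fixed per device, slightly noisy."""
    mismatch = random.Random(device_seed)
    preferred = [mismatch.getrandbits(1) for _ in range(N_CELLS)]
    noise = random.Random()                      # fresh noise on every readout
    return [bit ^ (noise.random() < UNSTABLE_FRACTION) for bit in preferred]

def distance(a, b):
    return sum(x != y for x, y in zip(a, b))

same_dev = distance(power_up(1), power_up(1))    # two readouts of one device
other_dev = distance(power_up(1), power_up(2))   # readouts of two devices
print(f"intra-device: {same_dev}/{N_CELLS}, inter-device: {other_dev}/{N_CELLS}")
```

The intra-device distance stays small (only unstable cells differ), whereas the inter-device distance is close to half the cells, which is what makes the power-up pattern usable as a fingerprint.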
Error correction
In many applications it is important that the output is stable. If the PUF is used as a key in cryptographic algorithms, error correction is necessary to compensate for measurement noise. In principle there are two basic concepts: pre-processing and post-processing error correction.[7][8]
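As a rough sketch of post-processing error correction, the following code uses the code-offset (fuzzy-extractor style) construction with a simple repetition code: public helper data is derived at enrollment, and majority decoding recovers the key from a noisy re-measurement. The repetition code and parameters are illustrative assumptions; practical designs use stronger codes (e.g. BCH) and add privacy amplification.

```python
# Minimal sketch of post-processing error correction in the code-offset
# style, using a 5x repetition code. Illustrative assumptions throughout.
import secrets

REP = 5   # each key bit is protected by 5 PUF response bits

def enroll(response: list, key: list) -> list:
    """Generation phase: produce public helper data from response and key."""
    codeword = [b for b in key for _ in range(REP)]        # repetition encode
    return [r ^ c for r, c in zip(response, codeword)]     # helper = offset

def reconstruct(noisy_response: list, helper: list) -> list:
    """Reproduction phase: recover the key from a noisy re-measurement."""
    noisy_codeword = [r ^ h for r, h in zip(noisy_response, helper)]
    key = []
    for i in range(0, len(noisy_codeword), REP):
        block = noisy_codeword[i:i + REP]
        key.append(1 if sum(block) > REP // 2 else 0)      # majority decode
    return key

# Example: a 16-bit key protected by an 80-bit PUF response.
key = [secrets.randbits(1) for _ in range(16)]
response = [secrets.randbits(1) for _ in range(16 * REP)]
helper = enroll(response, key)

noisy = response[:]
for i in (3, 27, 61):          # a few bits flip between measurements
    noisy[i] ^= 1
assert reconstruct(noisy, helper) == key
```

The helper data is the XOR of the response and a codeword, so it can be stored publicly: recovering the key from it still requires a close re-measurement of the same PUF.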
Attacks on PUFs
Proposed PUFs are not necessarily unclonable and many have been successfully attacked in a laboratory environment.[9]
Despite being described as "physically unclonable", a research team from the Berlin Institute of Technology was able to clone an SRAM PUF within 20 hours using tools readily available in university failure analysis labs.[9] In that work, only the SRAM (static RAM) cells of a microcontroller were read out.
From 2010 to 2013, PUFs gained attention in the smartcard market as a promising way to provide “silicon fingerprints”, creating cryptographic keys that are unique to individual smartcards.[10][11] However, university research has shown that delay-based PUF implementations are vulnerable to side-channel attacks[12][13] and recommends that countermeasures be employed in the design to prevent this type of attack. Also, improper implementation of a PUF could introduce "backdoors" into an otherwise secure system.[14][15] In June 2012, Dominik Merli, a scientist at the Fraunhofer Research Institution for Applied and Integrated Security (AISEC), further claimed that PUFs introduce more entry points for hacking into a cryptographic system and that further investigation into the vulnerabilities of PUFs is required before PUFs can be used in practical security-related applications.[16] The presented attacks are all on PUFs implemented in insecure systems, such as FPGAs or static RAM (SRAM). It is also important to ensure that the environment is suitable for the required security level.[7]
See also
- Kill switch
- Hardware Trojan
- Quantum Readout of PUFs
References
- ↑ D.W. Bauder, "An anti-counterfeiting concept for currency systems", Research report PTK-11990. Sandia National Labs. Albuquerque, NM, 1983.
- ↑ G. Simmons, "A system for verifying user identity and authorization at the point-of sale or access," Cryptologia, vol. 8, no. 1, pp. 1–21, 1984.
- ↑ G. Simmons, "Identification of data, devices, documents and individuals," in IEEE International Carnahan Conference on Security Technology, 1991, pp. 197–218.
- ↑ David Naccache and Patrice Frémanteau, Unforgeable identification device, identification device reader and method of identification, August 1992.
- ↑ Pappu, R.; Recht, B.; Taylor, J.; Gershenfeld, N. (2002). "Physical one-way functions". Science 297 (5589): 2026–2030. doi:10.1126/science.1074376.
- ↑ Blaise Gassend, Dwaine Clarke, Marten van Dijk and Srinivas Devadas. Silicon Physical Random Functions. Proceedings of the Computer and Communications Security Conference, November 2002
- ↑ 7.0 7.1 Boehm, Christoph (2012). Physical Unclonable Functions in Theory and Practice. Springer.
- ↑ C. Bohm, M. Hofer, and W. Pribyl, "A microcontroller SRAM-PUF," in Network and System Security (NSS), 2011 5th International Conference on, Sept. 2011, pp. 269–273.
- ↑ 9.0 9.1 Helfmeier, Clemens; Nedospasov, Dmitry; Boit, Christian; Seifert, Jean-Pierre (2013). Cloning Physically Unclonable Functions. IEEE Hardware Oriented Security and Trust (IEEE HOST 2013). June 2–3, 2013 Austin, TX, USA.
- ↑ Clarke, Peter (22 February 2013). "London Calling: Security technology takes time". EE Times. UBM Tech Electronics. Retrieved 1 July 2013.
- ↑ "NXP and Intrinsic-ID to raise smart chip security". EE Times (UBM Tech Electronics). 21 January 2010. Retrieved 1 July 2013.
- ↑ Merli, Dominik; Schuster, Dieter; Stumpf, Frederic; Sigl, Georg (2011), "Side Channel Analysis of PUFs and Fuzzy Extractors", Trust and Trustworthy Computing. 4th International Conference, TRUST 2011, Pittsburgh, PA, USA, June 22-24, 2011. Proceedings, Lecture Notes in Computer Science 6740, Springer Berlin Heidelberg, pp. 33–47, doi:10.1007/978-3-642-21599-5_3, ISBN 978-3-642-21598-8
- ↑ Schuster, Dieter (2010). Side-Channel Analysis of Physical Unclonable Functions (PUFs) (Diploma). Technische Universität München.
- ↑ Rührmair, Ulrich; van Dijk, Marten (2013). PUFs in Security Protocols: Attack Models and Security Evaluations. 2013 IEEE Symposium on Security and Privacy. May 19–22, 2013 San Francisco, CA, USA.
- ↑ Katzenbeisser, Stefan; Kocabas, Ünal; Rožic, Vladimir; Sadeghi, Ahmad-Reza; Verbauwhede, Ingrid; Wachsmann, Christian (2012), "PUFs: Myth, Fact or Busted? A Security Evaluation of Physically Unclonable Functions (PUFs) Cast in Silicon", Cryptographic Hardware and Embedded Systems – CHES 2012. 14th International Workshop, Leuven, Belgium, September 9-12, 2012. Proceedings, Lecture Notes in Computer Science 7428, Springer Berlin Heidelberg, pp. 283–301, doi:10.1007/978-3-642-33027-8_17, ISBN 978-3-642-33026-1
- ↑ Merli, Dominik (2012). Hardware Attacks on PUFs. Proceedings AHS2012, NASA/ESA Conference on Adaptive Hardware and Systems. June 25 – 28, 2012 Erlangen, Germany.
External links
- "Physical Unclonable Functions and Applications", by Srini Devadas and others, MIT
- Ultra-low-cost true randomness AND physical fingerprinting
- "Relying on untrusted devices?", by Eric Sivertson, EETimes