Physical Unclonable Functions: A Primer: PUF Circuit Designs
Building Security In
Physical unclonable functions (PUFs) make use of the measurable intrinsic randomness of
physical systems to establish signatures for those systems. A PUF is like a fingerprint or
biometric for a physical object where each instantiation of that object has its own unique PUF
response. This response differs significantly from the responses of all other instantiations of the object, yet a single object shows little variation from one measurement to the next. A PUF can be based on measurements of a wide variety of physical parameters, but PUFs extracted from measurements of integrated circuits are particularly useful because the output is easily incorporated into
computational operations. With this primer, we provide an introduction to some common PUF
circuit designs, approaches for generating PUF responses and applications of those responses,
and we identify some open research problems.
Consider a production run of “identical” integrated circuits (ICs), each of which is equipped with
a PUF. If we choose one of the ICs and repeatedly measure its PUF, we will obtain almost the
same response each time. However, if we then select a second IC and measure its PUF, we will find that its response is markedly different from the first IC’s response.
The earliest work on the use of physical variations for authentication occurred in the 1980s.1,2
These early systems suggested that the stochastic (random) physical arrangement of small
optical fibers within a material, such as currency, could be used to authenticate that material.
Nearly 20 years later, the first electronic circuit designs aimed at exploiting manufacturing
variations in microelectronics appeared, initiating an intense interest in PUFs.3
Two well-known electronic PUFs are the arbiter and ring oscillator PUF (ROPUF) designs.4 Both
of these designs exploit unique variations in the propagation delays through interconnects and
logic gates. The arbiter PUF, shown in Figure 1a, comprises two identically laid out paths. To
measure a response bit from the arbiter, a rising edge is input simultaneously to the two paths.
This input signal races through the circuit, and a latch at the output detects which path
propagated the input signal faster. For parallel circuits A and B, if A is faster than B, then we
assign bit 0. If B is faster than A, then we assign bit 1. By comparing many delay paths, we can
generate a bitstream.
The arbiter circuit in Figure 1a is a multiplexed design that optimizes area efficiency by assembling the two delay paths from n stages. For each stage si, an input challenge bit c[si] determines whether the signal paths cross. In this way, 2^n different configurations of the circuit are possible, allowing up to 2^n bits to be extracted from it. The arbiter PUF requires that its two paths be
symmetric, which can be achieved in ICs with manual placement and routing. However, the
arbiter PUF is difficult to implement in field-programmable gate arrays (FPGAs) because of the
lack of fine control over layout.
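To make the delay race concrete, the following Python sketch simulates an n-stage arbiter PUF using a simplified additive delay model; the stage-delay distribution, seed, and circuit size are illustrative assumptions rather than measurements from any real device.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
N_STAGES = 64  # illustrative size; real designs vary

# Each stage contributes a small random delay difference for each of its two
# configurations (straight or crossed). These values stand in for
# manufacturing variation; they are not measured data.
stage_delays = rng.normal(0.0, 1.0, size=(N_STAGES, 2))

def arbiter_response(challenge):
    """Return one response bit for a 0/1 challenge of length N_STAGES."""
    diff = 0.0  # delay of path A minus delay of path B so far
    for i, c in enumerate(challenge):
        if c:            # paths cross: the accumulated lead swaps sides
            diff = -diff
        diff += stage_delays[i, c]
    return 0 if diff < 0 else 1  # 0 if path A wins the race, 1 otherwise

challenge = rng.integers(0, 2, N_STAGES)
print(arbiter_response(challenge))
```

Because the response in this model is a linear function of the stage delays in a suitable feature space, the same structure underlies the machine-learning attacks discussed later in this article.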
Figure 1. (a) The arbiter physical unclonable function (PUF) creates a race along two signal paths
of nearly equal delay. Unique variations in propagation delays through interconnects and logic
cause one path to have less delay than the other. (b) The ring oscillator PUF (ROPUF) generates
response bits by comparing the oscillation frequencies of identically laid out ring oscillators. (c)
The power-on value of a static random-access memory (SRAM) cell can also be used as a PUF.
The ROPUF, depicted in Figure 1b, is another widely studied design that, although larger than
the arbiter PUF, is more suitable for FPGAs. Like the arbiter PUF, the ROPUF also exploits
variations in propagation delays to generate response bits. In this PUF, delay variations manifest
as differences in the oscillation frequencies of identically laid out ring oscillators. Comparing the
frequencies from two oscillators allows generation of one PUF bit. A challenge can be input to
the multiplexor to select which two oscillators are involved in the comparison.
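The ROPUF comparison can be sketched the same way: each oscillator's frequency is modeled as a nominal value plus a random manufacturing offset and a small measurement jitter. All of the numbers below are placeholders, not characterization data.

```python
import numpy as np

rng = np.random.default_rng(seed=2)
N_OSC = 16
NOMINAL_FREQ_MHZ = 200.0

# Per-oscillator offsets stand in for manufacturing variation.
osc_freqs = NOMINAL_FREQ_MHZ + rng.normal(0.0, 2.0, N_OSC)

def ropuf_bit(i, j, jitter_mhz=0.05):
    """Compare the measured frequencies of oscillators i and j to get one bit."""
    fi = osc_freqs[i] + rng.normal(0.0, jitter_mhz)  # jitter models noise
    fj = osc_freqs[j] + rng.normal(0.0, jitter_mhz)
    return 1 if fi > fj else 0

# The challenge selects which pair of oscillators is compared.
print([ropuf_bit(i, i + 1) for i in range(0, N_OSC, 2)])
```

Pairs whose underlying frequencies are close together are the ones most likely to produce unstable bits, which is why practical ROPUF implementations often restrict comparisons to well-separated pairs.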
The static random-access memory (SRAM) PUF is another popular design.5 Figure 1c shows a
simple depiction of an SRAM cell. Applying an input voltage vin to the circuit forces the cell into a
stable state. However, with no applied voltage (Va = Vb = 0), the cell is unstable. When power is
applied, the cell transitions to a stable state with either Va or Vb—but not both—at logical 0. This
power-on value can be used as a PUF bit. Any SRAM with symmetric cells can be used in this
way, provided that the SRAM can be powered on in an uninitialized state.
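The power-on behavior admits a similar sketch: each cell's preference is modeled as a fixed transistor mismatch plus thermal noise drawn at every power-up. Both distributions are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=3)
N_CELLS = 256

# Fixed per-cell mismatch sets each cell's preferred power-on value.
mismatch = rng.normal(0.0, 1.0, N_CELLS)

def power_on_state(noise_sigma=0.1):
    """Return the power-on values of all cells as a 0/1 array."""
    noise = rng.normal(0.0, noise_sigma, N_CELLS)  # per-power-up thermal noise
    return ((mismatch + noise) > 0).astype(int)

# Strongly skewed cells repeat their value; marginal cells occasionally flip.
print((power_on_state() != power_on_state()).mean())
```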
PUFs can be divided into two classes: those that have a large, preferably exponential, space of
input challenges and those that have perhaps only a single challenge. The arbiter and ROPUF
designs have large challenge spaces, whereas the SRAM PUF has a single, fixed challenge.
A typical use for a PUF with a small input space is key generation. In this setting, the PUF’s
response to a fixed challenge is used as a cryptographic key or as a seed for a key generation
algorithm. PUF responses can have noise and can vary with environmental conditions, so using
them for key generation often requires use of a fuzzy extractor, which combines error correction
with hash-based entropy amplification. The key that results from this process can be used in any
application that requires cryptography.6
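As a rough illustration of the fuzzy-extractor idea, the following sketch uses the code-offset construction with a simple repetition code and SHA-256. Real designs use stronger codes (for example, BCH) and properly seeded extractors, so treat this only as a sketch of the principle.

```python
import hashlib
import secrets

BLOCK = 5  # repetition-code block length; corrects up to 2 flips per block

def gen(puf_bits):
    """Enrollment: derive a key and public helper data from a PUF response."""
    codeword = []
    for _ in range(len(puf_bits) // BLOCK):
        codeword += [secrets.randbelow(2)] * BLOCK     # random bit, repeated
    helper = [p ^ c for p, c in zip(puf_bits, codeword)]
    key = hashlib.sha256(bytes(puf_bits)).digest()     # entropy amplification
    return key, helper

def rep(noisy_bits, helper):
    """Reproduction: recover the same key from a noisy re-measurement."""
    offset = [n ^ h for n, h in zip(noisy_bits, helper)]  # codeword plus errors
    corrected = []
    for i in range(0, len(offset), BLOCK):
        bit = 1 if sum(offset[i:i + BLOCK]) > BLOCK // 2 else 0  # majority vote
        corrected += [bit] * BLOCK
    recovered = [c ^ h for c, h in zip(corrected, helper)]
    return hashlib.sha256(bytes(recovered)).digest()

# A 20-bit response with two flipped bits still reproduces the same key.
response = [secrets.randbelow(2) for _ in range(20)]
key, helper = gen(response)
noisy = list(response); noisy[3] ^= 1; noisy[11] ^= 1
assert rep(noisy, helper) == key
```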
PUFs with larger input spaces are often proposed for use in interactive challenge–response
protocols. During an enrollment phase, the user chooses a subset of the PUF’s large input space
and measures the responses to each of these challenges. The challenges and corresponding
responses are stored for later use. To authenticate a device containing the PUF, a challenge is
selected from the stored database and presented to the PUF. If the PUF’s response is close
enough to the stored response, the device is deemed authentic. To prevent replay attacks, each
challenge should be used only once.
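A minimal sketch of this enrollment and authentication flow follows, assuming a puf callable that maps a challenge to a response bit string and an acceptance threshold of 10% fractional Hamming distance; both the interface and the threshold are hypothetical choices made for illustration.

```python
import secrets

THRESHOLD = 0.1  # maximum fractional Hamming distance accepted as a match

def enroll(puf, num_pairs, challenge_bits=64):
    """Measure and store (challenge, response) pairs for later use."""
    database = []
    for _ in range(num_pairs):
        challenge = tuple(secrets.randbelow(2) for _ in range(challenge_bits))
        database.append((challenge, puf(challenge)))
    return database

def authenticate(puf, database):
    """Use one stored pair, then discard it to prevent replay attacks."""
    if not database:
        raise RuntimeError("no unused challenge-response pairs remain")
    challenge, expected = database.pop()
    observed = puf(challenge)
    distance = sum(a != b for a, b in zip(expected, observed)) / len(expected)
    return distance <= THRESHOLD
```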
To mitigate privacy concerns associated with using PUFs in a manner that requires storage of
information in a database maintained by the IC manufacturer, the manufacturer could store a
serial number and PUF-derived public key associated with a device. The consumer could then query the manufacturer to determine the authenticity of a purchased product and subsequently re-enroll the PUF to establish a new signature for the device. This would break the link between the PUF and the manufacturer, preventing the manufacturer from tracking the device, but it would also prevent the consumer (original or after resale) from using the PUF to verify the device as authentically manufactured after the re-enrollment. In the case of challenge–response PUFs with a large input space, the user could simply establish a new set of challenge–response pairs that would be unknown to the manufacturer.
In both of these scenarios, the PUF eliminates the need to store secrets on the chip in
nonvolatile memory. In PUFs with small input spaces, the secret never has to leave the chip. In
the interactive challenge–response use case, the (challenge, response) pairs measured during
enrollment must be protected as secrets, but they don’t need to be stored on the chip. In both
scenarios, the secrets can be measured from the PUF when they’re needed, and then erased
from volatile memory. This greatly reduces the key’s exposure to attackers and is a significant
advance in hardware security.
Applications of PUFs
PUFs have been proposed for use in random number generators,7 remote attestation,8
protecting intellectual property,9 and authentication.10 The utility of PUFs for these applications depends on the randomness and reliability of the PUF output. There are statistical tests for assessing random number generators.11 While not all of these apply to the relatively short binary strings produced by PUFs, the applicable tests can be used to increase confidence in the randomness of PUF responses. A related question is how many unique output variants a particular PUF design can produce. While it may not be possible to establish the number of
unique variants of a PUF that are physically, as opposed to theoretically, realizable, we can
estimate this quantity. Given a collection of distinct ICs that contain nominally identical PUF
circuits, we can compare the response from the PUF in one IC to the response from the PUFs in
the other ICs by finding the fractional Hamming distance between their responses. This distance
is referred to as the inter-device variation, which we denote as Va. With a large enough collection of ICs, we can form a conservative estimate of the number of unique variants as 2^(minVa × N), where minVa is the minimum observed fractional Hamming distance and N is the number of bits measured. This estimate may be reduced by systematic variations or other effects that limit the number of realizable responses; establishing a rigorous bound is an open problem.
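The calculation described above is straightforward to express in code. The sketch below computes all pairwise fractional Hamming distances and the 2^(minVa × N) estimate; the simulated responses are placeholders for measurements from real ICs.

```python
import numpy as np

def unique_variant_estimate(responses):
    """Estimate the number of unique PUF variants from measured responses.

    `responses` is a 2-D 0/1 array with one row of N response bits per IC.
    """
    responses = np.asarray(responses)
    num_ics, n_bits = responses.shape
    distances = [np.count_nonzero(responses[i] != responses[j]) / n_bits
                 for i in range(num_ics) for j in range(i + 1, num_ics)]
    min_va = min(distances)          # minimum fractional Hamming distance
    return 2.0 ** (min_va * n_bits), min_va

# Illustrative data: 10 simulated ICs with 128-bit responses.
rng = np.random.default_rng(seed=4)
estimate, min_va = unique_variant_estimate(rng.integers(0, 2, (10, 128)))
print(f"minVa = {min_va:.3f}, estimated variants ~ 2^{min_va * 128:.1f}")
```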
We typically think of access controls in terms of gating human entry into systems and networks,
but we can extend the concept of access controls to the supply chain by using PUF-based
authentication via a challenge–response protocol to gate acceptance of an IC into the supply
chain or into a high-consequence system. The value of this form of authentication is that it can
be repeated after a system is assembled and deployed, or even resold, to ensure that the ICs
within it have not been replaced. This provides protection against counterfeit insertion
throughout a system’s lifecycle.
Despite being termed “unclonable functions,” there have been some successful modeling and
cloning attacks against PUFs.12,13 Machine-learning techniques can model simulated ROPUFs
and arbiter PUFs with accuracies exceeding the experimental stability of those designs. These
models assume that the attacker has access to many thousands of (challenge, response) pairs for training, which can be obtained by measuring a device under the attacker’s control or by stealing the list generated during enrollment. The models can then
predict the responses to new challenges. However, it’s important to note that a successful
attack on a PUF compromises only that specific instantiation of the PUF. The attacker must
repeat the process to learn the response of the next instantiation of the PUF.
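To illustrate why arbiter PUFs are learnable, the sketch below attacks a simulated arbiter PUF using the standard parity-feature transform, under which the additive delay model becomes linear, and a plain perceptron trained on collected (challenge, response) pairs. The victim here is itself a linear model with made-up weights, so the result only demonstrates the mechanism; published attacks have used logistic regression, support vector machines, and evolution strategies against real devices.

```python
import numpy as np

rng = np.random.default_rng(seed=5)
N_STAGES = 64

def parity_features(challenges):
    """Map 0/1 challenges to parity features: feature i is prod(1 - 2*c_j) for j >= i."""
    signed = 1 - 2 * challenges                       # 0/1 -> +1/-1
    prods = np.cumprod(signed[:, ::-1], axis=1)[:, ::-1]
    return np.hstack([prods, np.ones((challenges.shape[0], 1))])

# Simulated "victim" arbiter PUF: a hidden weight vector in feature space.
true_weights = rng.normal(0.0, 1.0, N_STAGES + 1)
def victim(challenges):
    return (parity_features(challenges) @ true_weights > 0).astype(int)

# Attacker: collect CRPs, then fit a perceptron to predict unseen responses.
train_c = rng.integers(0, 2, (5000, N_STAGES))
train_r = victim(train_c)
phi = parity_features(train_c)
w = np.zeros(N_STAGES + 1)
for _ in range(50):                                   # perceptron epochs
    for x, r in zip(phi, train_r):
        pred = 1 if x @ w > 0 else 0
        w += (r - pred) * x                           # update only on mistakes

test_c = rng.integers(0, 2, (2000, N_STAGES))
accuracy = np.mean((parity_features(test_c) @ w > 0).astype(int) == victim(test_c))
print(f"model accuracy on unseen challenges: {accuracy:.3f}")
```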
PUFs that don’t use an interactive challenge–response mechanism aren’t susceptible to these
modeling attacks. However, the fuzzy extractors typically employed in such PUFs are subject to
side-channel and template attacks.14,15 Researchers have demonstrated the ability to physically
clone SRAM PUFs by using near-infrared emissions to characterize the response of one SRAM
PUF, followed by focused ion beam (FIB) circuit edits to induce the same PUF response in a
second circuit. Propagation delays in ring oscillators can also be adjusted with FIB circuit edits,
so delay-based PUFs are likely also susceptible to cloning attacks if the adversary can adequately
characterize the device to be cloned.16 Given these vulnerabilities, it’s important that
researchers continue developing PUFs that are resistant to modeling and cloning, and
attempting to model or clone newly proposed designs. This will ideally allow the community to
converge on preferred PUF designs. Recent research has focused on eliminating the need for error correction, which complicates side-channel attacks,17,18 and on developing modeling-resistant PUFs.19,20,21
Long-term reliability studies and accelerated aging experiments on a large population of chips
should be performed to show the viability of PUFs for deployment.
There are also infrastructure challenges. A large company might sell hundreds of millions of ICs
per year. If each of these ICs is equipped with a PUF for use in authentication and anti-counterfeiting, then such a manufacturer will require an authentication infrastructure capable
of supporting billions of devices. Developing, deploying, and maintaining such large-scale
infrastructures will present challenges for IC manufacturers, although there are existing public-
key infrastructures supporting millions of users.22
We will also need effective approaches for “rekeying” PUFs. For example, an IC manufacturer
may measure some (challenge, response) pairs from a PUF, and the consumer might use one of
these to verify the authenticity of a newly purchased IC. However, that user might want to
generate a new set of (challenge, response) pairs that can’t be known by the manufacturer,
which might require altering the PUF in some way. Controlled PUFs might offer a solution to this
problem.23 In the case of a fixed-challenge PUF with a fuzzy extractor, the user can repeat the
enrollment process to generate a new key that will be unknown to the manufacturer.
Ensuring the stability of the PUF output over time is another challenge. It is likely that PUF
responses will change over time as a result of aging effects such as negative bias temperature
instability or electromigration, although little work has been published on the subject.24 If the
impact of aging is small, then it can be corrected with fuzzy extractors, and it should be possible
to detect and mitigate larger aging-induced changes.25
Finally, there are opportunities to build standardized security policies around PUFs. Widespread
adoption of PUFs could be facilitated by establishing industry standards for authentication and
key generation using PUFs. Such standards would expedite the development of common interfaces and utilities that allow consumers to verify their devices’ authenticity and would promote adoption of the technology. This would help to reduce the proliferation of counterfeit electronics in consumer,
critical infrastructure, and military systems.26,27
References
1. D.W. Bauder, An Anti-counterfeiting Concept for Currency Systems, research report PTK-11990, Sandia
National Labs, 1983.
2. G. Simmons, “A System for Verifying User Identity and Authorization at the Point-of Sale or Access,”
Cryptologia, vol. 8, no. 1, 1984, pp. 1–21.
3. B. Gassend et al., “Silicon Physical Random Functions,” Proc. ACM Conf. Computer and Communications
Security (CCS 02), 2002, pp. 148–160.
4. G.E. Suh and S. Devadas, “Physical Unclonable Functions for Device Authentication and Secret Key
Generation,” Proc. Design Automation Conf. (DAC 07), 2007, pp. 9–14.
5. D. Holcomb, W. Burleson, and K. Fu, “Initial SRAM State as a Fingerprint and Source of True Random
Numbers for RFID Tags,” Proc. RFID Security Conf., 2007,
http://www.rfidsec07.etsit.uma.es/slides/papers/paper-12.pdf.
6. Y. Dodis et al., “Fuzzy Extractors: How to Generate Strong Keys from Biometrics and Other Noisy Data,”
SIAM J. Computing, vol. 38, no. 1, 2008, pp. 97–139.
7. A. Maiti, R. Nagesh, A. Reddy, and P. Schaumont, “Physical Unclonable Function and True Random Number
Generator: A Compact and Scalable Implementation,” Proc. ACM Great Lakes Symp. VLSI (GLSVLSI 09), 2009,
pp. 425–428.
8. S. Schulz, A.R. Sadeghi, and C. Wachsmann, “Short Paper: Lightweight Remote Attestation Using Physical
Functions,” Proc. ACM Conf. Wireless Network Security (ACM WiSec), 2011, pp. 109–114.
9. Y. Alkabani, F. Koushanfar, and M. Potkonjak, “Remote Activation of ICs for Piracy Prevention and Digital
Right Management,” Proc. IEEE/ACM Int’l Conf. Computer-Aided Design (ICCAD 07), 2007, pp. 674–677.
10. J. R. Hamlet, T. M. Bauer, and L. G. Pierson, “Deterrence of Device Counterfeiting, Cloning, and Subversion
by Substitution Using Hardware Fingerprinting,” U.S. Patent 8 848 905, Sep. 30, 2014.
11. A. Rukhin et al., A Statistical Test Suite for Random and Pseudorandom Number Generators for Cryptographic Applications, NIST Special Publication 800-22, Nat’l Inst. of Standards and Technology, 2001.
12. U. Rührmair et al., “Modeling Attacks on Physical Unclonable Functions,” Proc. ACM Conf. Computer and
Communications Security (CCS 10), 2010, pp. 237–249.
13. C. Helfmeier et al., “Cloning Physically Unclonable Functions,” IEEE Int’l Symp. Hardware-Oriented Security
and Trust (HOST 13), 2013, pp. 1–6.
14. D. Karakoyunlu and B. Sunar, “Differential Template Attacks on PUF Enabled Cryptographic Devices,” IEEE
Int’l Workshop Information Forensics and Security (WIFS 10), 2010, pp. 1–6.
15. D. Schuster, “Side-Channel Analysis of Physical Unclonable Functions (PUFs),” PhD dissertation, Department
of Computer Science, Technische Universität München, 2010,
http://www.sec.in.tum.de/assets/studentwork/finished/Schuster2010.pdf.
16. R. Schlangen et al., “RF Performance Increase Allowing IC Timing Adjustments by Use of Backside FIB
Processing,” IEEE Int’l Symp. Physical and Failure Analysis of Integrated Circuits (IPFA 09), 2009, pp. 33–36.
17. M. Bhargava and K. Mai, “A High Reliability PUF Using Hot Carrier Injection Based Response Reinforcement,”
Proc. 15th Int’l Workshop Cryptographic Hardware and Embedded Systems (CHES 13), 2013, pp. 90–106.
18. J. Aarestad, J. Plusquellic, and D. Acharyya, “Error-Tolerant Bit Generation Techniques for Use with a
Hardware-Embedded Path Delay PUF,” IEEE Int’l Symp. Hardware-Oriented Security and Trust (HOST 13),
2013, pp. 151–158.
19. R. Kumar and W. Burleson, “On Design of a Highly Secure PUF Based on Non-linear Current Mirrors,” IEEE
Int’l Symp. Hardware-Oriented Security and Trust (HOST 14), 2014, pp. 38–43.
20. M. Bhargava, C. Cakir, and K. Mai, “Attack Resistant Sense Amplifier Based PUFs (SA-PUF) with Deterministic
and Controllable Reliability of PUF Responses,” IEEE Int’l Symp. Hardware-Oriented Security and Trust (HOST
10), 2010, pp. 106–111.
21. M. Yu et al., “A Noise Bifurcation Architecture for Linear Additive Physical Functions,” IEEE Int’l Symp.
Hardware-Oriented Security and Trust (HOST 14), 2014, pp. 124–129.
22. “Red Hat and August Schell Run World’s Largest PKI Installation on Red Hat Enterprise Linux,” Red Hat
Government Team, June 2007; www.redhat.com/about/news/archive/2007/6/red-hat-and-august-schell-
run-worlds-largest-pki-installation-on-red-hat-enterprise-linux.
23. B. Gassend et al., “Controlled Physical Random Functions,” Proc. 18th Ann. Computer Security Applications
Conf. (ACSAC 02), 2002, p. 149.
24. A. Maiti, L. McDougall, and P. Schaumont, “The Impact of Aging on an FPGA-Based Physical Unclonable Function,” Proc. Int’l Conf. Field Programmable Logic and Applications (FPL 11), 2011.
25. M.S. Kirkpatrick and E. Bertino, “Software Techniques to Combat Drift in PUF-Based Authentication Systems,” Proc. Workshop Secure Component and System Identification (SECSI 10), 2010.
26. C. Kazmierski, “SIA President Testifies at Senate Armed Services Committee on Dangers of Counterfeit
Chips,” Nov. 2011;
www.semiconductors.org/news/2011/11/08/news_2011/sia_president_testifies_at_senate_armed_service
s_committee_on_dangers_of_counterfeit_chips.
27. “Background Memo: Senate Armed Services Committee Hearing on Counterfeit Electronic Parts in the DOD
Supply Chain,” Nov. 2011; www.levin.senate.gov/newsroom/press/release/background-memo-senate-
armed-services-committee-hearing-on-counterfeit-electronic-parts-in-the-dod-supply-chain.
Todd Bauer is a Member of the Technical Staff at Sandia National Laboratories. Contact him at
tmbaue@sandia.gov.
Jason Hamlet is a Senior Member of the Technical Staff at Sandia National Laboratories. Contact him at
jrhamle@sandia.gov.