PHYSICAL UNCLONABLE FUNCTIONS AND SECURE PROCESSORS

Srini Devadas
CSAIL

Executive Summary

Secret keys embedded in portable devices play a central role in protecting their owners from electronic fraud. Keys in consumer devices also protect content providers from illegal use of delivered content by the owner of the device. Currently, those keys are vulnerable to invasive or physical attack by a motivated adversary or owner, as available protection schemes are too expensive and bulky for most applications.

We propose to let the chip itself act as the key. At the microscopic scale, circuits are never identical, even on chips manufactured the exact same way, and signals take different times to propagate through silicon and metal paths. By comparing a few hundred pairs of path delays, we can generate a unique fingerprint for each apparently identical chip. That fingerprint – recorded when the chip is made and stored in a database – can act as a key to, for instance, authenticate an online transaction, or unlock proprietary software.

This technique is a significantly less expensive way of embedding secret keys in devices than using active intrusion detectors, which also require continual power. It is particularly suitable for authenticating or storing sensitive information on portable devices such as cell phones, PDAs, key cards and smart cards.

Description

As computing devices become ever more pervasive, two contradictory trends are appearing. On the one hand, computing elements are becoming small, disseminated, and unsupervised. On the other hand, the cost of security breaches is increasing as we place more responsibility on the devices that surround us. To authenticate devices, and to protect the integrity and privacy of the data they hold, we need to store a secret key in each device. This key must be protected against software and physical attacks that attempt to discover it; an adversary who discovers the secret key by any means can spoof the device or illegally gain access to sensitive data. For example, many Internet-enabled PDAs and cell phones contain credit card and other financial information, others used in the healthcare industry hold keys or passwords to confidential health records, and still others provide access to private corporate networks. Even a lost PDA must remain secure: specifically, it must prevent unauthorized users from breaking into it or obtaining any valuable information through its use.

The conventional means of protecting a secret key in a hardware device is to use active intrusion detection circuitry. This typically results in an expensive and bulky package, rendering the method untenable for many portable devices. Moreover, it requires continual power – when power is cut, the key is erased – which increases the power consumption of the device and rules out devices without their own power source, such as smart cards. Yet another disadvantage is that placing a different key in each hardware device adds expense to the manufacturing process. To avoid this expense, manufacturers sometimes place the same key in a whole family of devices, which exacerbates security flaws (e.g., the security flaw in Microsoft's Xbox was made worse by every Xbox sharing the same key).

Physical Unclonable Functions

Over the past eighteen months we have proposed and developed the notion of Physical Random Functions (also known as Physical Unclonable Functions, or PUFs), which rely on the inevitable manufacturing variations between devices to produce an identity for each device. This identity is unclonable, and in some cases is even manufacturer resistant (i.e., it is impossible, even for the manufacturer, to produce two devices with the same identity).

PUFs can be realized on silicon chips. Due to manufacturing variation, the delays of transistors and wires in digitally identical circuits differ across chips. There is enough variation across chips that a precise measurement of delay can identify a chip. However, delays change with environmental conditions such as temperature, which can lead to unreliable identification; on-chip temperature itself may change depending on what other logic is active or what software is run. Therefore, rather than measuring absolute delay, we measure relative delay, i.e., we compare delays. This use of comparisons is a novel aspect of our work that differentiates it from other techniques exploiting statistical variation and from biometric techniques, and it is the key idea that enables reliable identification. In the Appendix, we describe our technique and experiments in greater detail.
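To illustrate why comparisons are robust where absolute measurements are not, consider the following toy Python model (an illustration of the principle, not our measurement circuit; delay magnitudes and the temperature factor are assumptions). Temperature scales both paths' delays by roughly the same factor, so the comparison outcome is unchanged even though each absolute delay shifts substantially:

```python
import random

random.seed(0)

# Toy model: a chip has two nominally identical paths whose delays
# (in picoseconds) differ slightly due to manufacturing variation.
def make_chip():
    return (1000 + random.gauss(0, 5), 1000 + random.gauss(0, 5))

def compare_paths(chip, temp_factor):
    # Temperature scales both delays by roughly the same factor, so
    # which path is faster -- the comparison -- is stable even though
    # each absolute delay changes substantially.
    top, bottom = chip[0] * temp_factor, chip[1] * temp_factor
    return 1 if top < bottom else 0

chip = make_chip()
# A 30% slowdown in absolute delay leaves the comparison unchanged.
assert compare_paths(chip, 1.0) == compare_paths(chip, 1.3)
```

An absolute-delay threshold tuned at 25C would misclassify this chip at 70C; the comparison never does.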

PUFs as described have two significant advantages over the conventional methods described above. The first is that the secret is not a digital secret but corresponds to the path delays in the circuit. These delays are not just numbers; they are non-linear functions of the applied input challenge and of the circuit topology. To clone a PUF, an adversary has to learn all of these delays very precisely. Invasive attacks that attempt to measure delays precisely will fail, because stripping away metal to reach the underlying transistors or wires changes their delays before they can be measured. Non-invasive attacks such as Differential Power Analysis also give no information about wire delays. The second advantage is that PUFs by definition place a different “key” in each hardware device; no two devices are the same, so no per-device change to the manufacturing process is needed. The main engineering issue is reliability, and relative delay measurement produces high reliability even under significant environmental variations – some details are given in the Appendix.

As part of this project, we will target specific authentication applications in the consumer and military domains, and design PUFs that meet industry standards for security and reliability to enable licensing opportunities.

Controlled Physical Unclonable Functions

PUF technology can be made considerably more powerful through the notion of control. PUFs without control can be used for authenticated identification, but cannot be used, for instance, to sign a digital certificate the way a processor holding the private key of a public/private key pair can. The second part of our project is building a controlled PUF (CPUF), which can be implemented as a programmable processor built around a PUF. A CPUF can be used to establish a shared secret between a physical device and a remote user. We have developed protocols that, when run on the processor, make this possible in a secure and flexible way, even in the presence of multiple mutually mistrusting parties. Once established, the shared secret can enable a wide range of applications, for example, certified execution, where a certificate is produced proving that a specific computation was carried out on a specific processor chip. Other applications include software licensing and intellectual property (IP) protection, where software can be created that runs only on a particular processor chip. The protocols we have developed are a significant part of the intellectual property required to make these applications possible.
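A rough sketch of the control idea in Python (the hash-based construction and all names here are our illustration, not a specification of the actual protocols): the chip never exposes raw PUF responses; instead it releases only a hash of the response combined with the identity of the program asking for it, so different programs, and different mistrusting parties, see unrelated values:

```python
import hashlib

# Toy stand-in for the physical PUF: a fixed secret mapping from
# challenge to raw response. On real silicon this comes from path
# delays and never exists as a stored digital value.
def raw_puf(challenge: bytes) -> bytes:
    return hashlib.sha256(b"chip-specific-variation" + challenge).digest()

def controlled_response(challenge: bytes, program_hash: bytes) -> bytes:
    # Control logic: the raw response stays inside the chip; only its
    # hash, bound to the requesting program, is released.
    return hashlib.sha256(raw_puf(challenge) + program_hash).digest()

# Two different programs asking with the same challenge learn
# unrelated values, so neither can impersonate the other.
r1 = controlled_response(b"challenge-1", b"hash-of-program-1")
r2 = controlled_response(b"challenge-1", b"hash-of-program-2")
assert r1 != r2
```

A remote user who pre-recorded a challenge-response pair for a trusted program can later use that value as a shared secret, e.g., to verify a MAC over a computation's result.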

As part of this project, we propose to build a programmable processor around a PUF, i.e., a secure processor. We will use the OpenRISC programmable processor and add to it a PUF module and a small number of new instructions that control and enable access to the PUF. This will allow implementation of the software protocols for certified execution, software licensing, and IP protection. We envision prototyping the secure processor on an FPGA platform, and then building it on custom silicon. One reason to undertake a custom silicon implementation is to prove that the statistical variation in commercially available manufacturing processes is sufficient to create PUFs and CPUFs.

Conventional secure processors (e.g., the IBM 4758, which was commercially available for a few years) store a private key in the processor and protect the processor and memory using intrusion detection circuitry. These processors are bulky, expensive, and need to be continually battery-powered; they were not commercially successful mainly because they were unusable in portable computing applications, where an adversary can obtain physical access to the device. We aim to eliminate the requirement for expensive packages containing intrusion detection circuitry and thereby make these applications viable.

Impact

Phones and smart cards are susceptible to cloning attacks. A party with physical access to a victim's GSM cell phone can “clone” the phone and fraudulently place calls billed to the victim's account. While this is not a total failure of the authentication framework, since the attacker needs physical access, for many phones physical access is required only for a matter of minutes. In fact, black boxes that automate the cloning process have been on the market for a few years. This type of fraud is estimated to cost European providers $200 million each year, though the estimate is quite rough. It now appears that over-the-air attacks can be carried out on some types of phones – a very serious security flaw that can result in significantly higher fraud costs. Cloning is possible because cell phones have no tamper resistance – one cannot afford to build in expensive tamper-resistant technology or battery-powered intrusion detection.

The security problem for phones is growing more serious because cell phones are increasingly used for commerce and payment processing. Security breaches involving smart cards are also costing credit card companies billions of dollars. While PUF technology cannot protect against all types of cloning attacks (for example, giving away your credit card information to a bogus vendor), a significant number of cloning attacks can be prevented using PUFs – and without significantly increasing the cost of the smart card or phone.

Chip authentication can be used to ensure that only certain chips certified by a manufacturer are used in products. Replacing the certified chip with a (digital) duplicate will not work because the duplicate will not be authenticated.

Embedded systems are chips that contain hardware processors as well as software code that runs on those processors. Many proprietary embedded system designs are reverse engineered by competitors, who can then learn the algorithms used in the chip and replicate them with their own processors and software. A manufacturer concerned about competitors reverse engineering embedded software can encrypt the software so that it can only be decrypted by the PUF on the chip. A competitor or foreign user will be unable to decrypt the software or see the decrypted software, since it is decrypted on the fly during execution. This type of protection is of interest to defense contractors such as Northrop Grumman. We note that if a digital key were stored on the processor to perform the decryption, a competitor could discover the key (perhaps destroying the chip in the process) and then decrypt the software.
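The decrypt-on-the-fly idea can be sketched in a few lines of Python (a toy stream cipher and a fabricated placeholder key; a real design would regenerate the key from measured path delays at boot and use a standard authenticated cipher such as AES-GCM):

```python
import hashlib

def puf_key() -> bytes:
    # Stand-in for a key derived from the chip's PUF responses; on a
    # real device this value is regenerated from path delays and never
    # stored digitally, so there is no key for a competitor to extract.
    return hashlib.sha256(b"per-chip-delay-fingerprint").digest()

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: SHA-256 in counter mode, XORed with the data.
    # Encryption and decryption are the same operation.
    out = bytearray()
    for i in range(0, len(data), 32):
        pad = hashlib.sha256(key + (i // 32).to_bytes(8, "big")).digest()
        out.extend(b ^ p for b, p in zip(data[i:i + 32], pad))
    return bytes(out)

firmware = b"proprietary embedded algorithm"
shipped = xor_stream(puf_key(), firmware)          # encrypted at the factory
assert xor_stream(puf_key(), shipped) == firmware  # decrypted on-chip
```

Only the chip whose PUF yields this key can recover the plaintext; the same ciphertext shipped to any other chip decrypts to garbage.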

We now discuss a longer-term set of applications. Software piracy costs the software industry billions of dollars. A software provider should be able to license content to particular computers/processors while ensuring that the owner of the device cannot easily circumvent the protection mechanisms. While this can be accomplished by storing in the device a secret key known only to the software vendor or a third party, there is significant consumer opposition to this mechanism, since the owner is not allowed to know the secret key. With processor PUFs, there is no secret key in the processor, and all processors are, digitally speaking, identical. The software vendor can communicate with the processor PUF and create software that only that processor can decrypt.

With digital content, games, and movies being piped to people's homes, cell phones, and PDAs on demand, there is an increasing requirement for content protection mechanisms in portable and set-top devices. The processors in these devices have to be capable of certified execution, which here means verifying that an action, such as incrementing the count of how many times a movie has been watched, is performed whenever the movie is played. If the device is continually on the network, sensitive information can be stored entirely on the server side, but that is not a viable option in many homes. A content provider should also be able to license content to particular devices while ensuring that the owner of the device cannot easily circumvent protection mechanisms, similar to the software licensing application discussed above.

Prior Art

Statistical variation has been exploited to create IC identification circuits that generate a single unique response for each manufactured IC. This approach can identify an IC but cannot authenticate it, since once the IC outputs its digital response, any other device can store and replay it. Our contribution is to show that by exploiting statistical delay variation and measuring transient response, one can generate a unique fingerprint for an IC that cannot be cloned even by the manufacturer. Further, we can embed the PUF in a processor to enable a host of applications, as described above. A patent titled “Identification and Authentication of Integrated Circuits” was filed in April 2003.

Prototype optical PUFs have been realized using 3-dimensional microstructures and coherent radiation. In an optical PUF, the challenges are the angles at which the coherent light impinges on the structure, and the responses are the images that are generated. Optical PUFs are very hard to clone and can be used for key card applications. In such an application, the optical PUF must be positioned quite precisely relative to the light source in order to control the angle at which light impinges on it; otherwise the image will look very different from the expected image, even for the correct PUF. It is also hard to integrate an optical PUF with control logic, so these PUFs are currently limited to key card applications.

Appendix

The simple circuit below can be used as a PUF.

It consists of several stages of logic followed by an arbiter. Depending on the input applied to the circuit, termed the challenge, transitions propagate down different paths to the arbiter. The arbiter outputs a 1 or a 0 depending on whether the top or the bottom path is faster. When environmental conditions change, the paths speed up or slow down in a similar fashion in a symmetric implementation, and the arbiter output does not change. However, different challenges produce different responses that are hard to predict unless the adversary knows the circuit component delays precisely. (The actual circuit used is a slightly more complicated version of the one below, designed to make the adversary's task of determining delays from an arbitrary number of challenge-response pairs significantly harder than for this circuit.)
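The behavior of such an arbiter circuit can be modelled in a few lines of Python (a simulation under assumed Gaussian per-stage delay variation; the stage count and delay parameters are illustrative, not measured values). Each challenge bit selects whether the two racing transitions stay on their paths or swap, and the arbiter reports which one arrives first:

```python
import random

def make_chip(n_stages=64, seed=None):
    # Each stage contributes four delays: (top, bottom) for the
    # "straight" setting and (top, bottom) for the "crossed" setting,
    # each drawn with small chip-specific variation.
    rng = random.Random(seed)
    return [[rng.gauss(10.0, 0.05) for _ in range(4)] for _ in range(n_stages)]

def puf_response(chip, challenge_bits):
    top = bot = 0.0
    for stage, bit in zip(chip, challenge_bits):
        s_top, s_bot, c_top, c_bot = stage
        if bit == 0:                       # straight: signals stay put
            top, bot = top + s_top, bot + s_bot
        else:                              # crossed: signals swap paths
            top, bot = bot + c_top, top + c_bot
    return 1 if top < bot else 0           # arbiter: first arrival wins

chip = make_chip(seed=1)
bit = puf_response(chip, [0, 1] * 32)
assert bit in (0, 1)
```

The response is a deterministic function of the chip's delays and the challenge, yet predicting it requires knowing every stage delay precisely, which is the adversary's problem.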

We have fabricated the above PUF circuit and variants on custom silicon and have experimentally validated our hypotheses relating to their security and reliability. Given a random pair of chips, if we apply 100 random challenges to each chip and obtain 100 response bits from each chip:

  1. On average, 20 response bits differ across the two chips, so we can identify chips quite easily. For example, to distinguish one chip among a billion, we need to apply approximately 200 challenges.
  2. Using repeated measurements, we are able to eliminate measurement noise due to voltage and other environmental fluctuations.
  3. Temperature variation from 25°C to 70°C causes large changes in absolute delays on a chip, but thanks to relative delay measurement, fewer than 2 out of 100 challenges on average produce erroneous responses – an order of magnitude less than inter-chip variation. This means we can reliably identify chips even when environmental conditions vary significantly.
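The identification procedure amounts to comparing response vectors by Hamming distance. A toy Python illustration (responses are simulated here, and environmental error is modelled by flipping two known bits, standing in for the roughly 2-in-100 error rate observed on silicon):

```python
import random

def chip_response(chip_id: str, challenge: int) -> int:
    # Toy stand-in for silicon: a deterministic bit per
    # (chip, challenge) pair, playing the role of the arbiter output.
    return random.Random(f"{chip_id}:{challenge}").getrandbits(1)

def hamming(a, b):
    # Count the positions where two response vectors disagree.
    return sum(x != y for x, y in zip(a, b))

challenges = range(100)
chip_a = [chip_response("A", c) for c in challenges]
chip_b = [chip_response("B", c) for c in challenges]

# Re-measuring chip A under environmental noise: flip a couple of bits.
remeasured_a = list(chip_a)
for i in (3, 57):
    remeasured_a[i] ^= 1

# Same chip: ~2 differing bits out of 100. Different chip: ~50 in this
# toy model (~20 on real silicon), so a simple threshold separates
# "same chip" from "different chip".
assert hamming(chip_a, remeasured_a) == 2
assert hamming(chip_a, chip_b) > 10
```

The order-of-magnitude gap between intra-chip noise and inter-chip variation is exactly what makes thresholding reliable.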

It appears to be very hard to find an accurate model of a given chip that predicts the output with fewer than 10 errors per 100 challenges, which is significantly higher than the errors due to environmental or temperature variation. This means that we can authenticate, not just identify, chips. Further, adding control logic to the circuit that obfuscates the response will effectively defeat this type of attack.

MIT confidential