A Theoretical Analysis: Physical Unclonable
Functions and The Software Protection Problem
Rishab Nithyanand
Stony Brook University, NY
John Solis
Sandia National Labs, CA
TrustED 2012
Outline
• Introduction to PUFs
• The Software Protection Problem
• Intrinsic Personal PUFs (IP-PUFs)
• Software Protection using IP-PUFs
• Conclusions
5/25/2012
A Theoretical Analysis: PUFS and Software Protection
2
Introduction to PUFs
• Physical Unclonable Functions (PUFs)
– Physical systems whose responses to physical stimuli
are easy to measure but hard to clone
• The manufacturing process is NOT perfect
– Unavoidable and uncontrollable variations
– No two chips are identical
– Essentially Hardware DNA
• Examples: Timing PUFs, Crystal PUFs, Optical PUFs, …
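The challenge-response behavior described above can be sketched as a toy model. This is an illustrative simulation only, not a real PUF: the per-device manufacturing variation is modeled as a secret byte string, and the hypothetical `make_puf` helper stands in for a physical device.

```python
import hashlib

def make_puf(device_variation: bytes):
    """Toy PUF model: responses derive from unclonable per-device
    manufacturing variation (modeled here as a secret byte string)."""
    def respond(challenge: bytes) -> bytes:
        return hashlib.sha256(device_variation + challenge).digest()
    return respond

# Two "chips" from the same production line still differ.
puf_a = make_puf(b"variation-chip-A")
puf_b = make_puf(b"variation-chip-B")

challenge = b"c1"
assert puf_a(challenge) == puf_a(challenge)   # easy to measure, repeatable
assert puf_a(challenge) != puf_b(challenge)   # no two chips are identical
```

A real PUF response is noisy and needs error correction; the deterministic hash here only captures the "unique per device, repeatable on the same device" property.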
Applications
• Device authentication
– Identify chips, computing devices
• Software protection and DRM
– ensure software only executed on licensed devices
• Certified execution
– ensure outsourced computations are executed
without tampering
The Software Protection Problem
Introduction and Goals
• Ensure programs are only executed by licensed users
• Several methods to battle software piracy:
– Online Authentication/License Servers
– Obfuscation, WBC, etc…
• However,
– Online authentication requires network connectivity and is
easy to bypass
– Obfuscation and WBC are known to be impossible in general
• Can we do better?
– A feasible offline solution that requires no expensive hardware
The Software Protection Problem
Adversary Models
• Who are we really fighting in the real world?
– Individuals
• download software to “crack” and redistribute
• Typically, do not own licenses
– Institutions
• buy a small number of licenses, use on many machines
• We view PUFs as licenses
• The weak adversary:
– No access to the legitimate PUF
– Full access to binaries
• The strong adversary:
– Access to a small number of legitimate PUFs
– Full access to binaries
The Software Protection Problem
A Simple W-ADV Secure Scheme
• Requirements of W-ADV secure schemes:
– Protected functionality
– Non-trivial inversion
• Use CFGs to verify PUF existence
• Insert PUF challenge at every node (code block)
– Correct responses take you to the correct subsequent block
– Incorrect responses take you to any of the n blocks in the CFG
• Probability of cracking such software (theoretically): (1/n)^n
• Probability of cracking such software (practically): higher,
but controllable and reasonable
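The CFG scheme above can be sketched as follows. This is a minimal simulation under assumed details not in the slides: the PUF is modeled as a keyed hash reduced mod n, and each block stores an offset that, combined with the legitimate PUF's response, yields the correct successor. Without the licensed PUF, every transfer lands on an effectively random block.

```python
import hashlib

N = 8  # number of blocks in the control-flow graph

def puf(device_secret: bytes, challenge: bytes) -> int:
    # Stand-in for a hardware PUF response, reduced mod N.
    digest = hashlib.sha256(device_secret + challenge).digest()
    return int.from_bytes(digest, "big") % N

LICENSED = b"licensed-device"

# Protection time: for block i, store an offset so that
# (offset + response) % N recovers the intended successor i+1.
challenges = [f"chal-{i}".encode() for i in range(N)]
offsets = [(i + 1 - puf(LICENSED, challenges[i])) % N for i in range(N)]

def next_block(i: int, device: bytes) -> int:
    # Control transfer depends on the PUF response: the correct
    # successor emerges only on the licensed device.
    return (offsets[i] + puf(device, challenges[i])) % N

assert all(next_block(i, LICENSED) == (i + 1) % N for i in range(N))
# A cloned binary without the legitimate PUF wanders the CFG:
print([next_block(i, b"pirate-device") for i in range(N)])
```

Guessing the correct successor at each of the n nodes independently succeeds with probability (1/n)^n, matching the theoretical bound on the slide.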
The Software Protection Problem
Fighting the Strong Adversary
• Requirements of S-ADV secure schemes:
– Protected functionality
– STRONG non-trivial inversion
• S-ADV strategy:
– Observe software execution on legitimate PUF device, then virtualize
“useful” part of the PUF
• Therefore, any S-ADV secure scheme has to be probabilistic (i.e.,
PUF challenges must be selected at random each time)
• This is impossible without either going online or using trusted
hardware
– Our previous CFG technique can easily be extended to be S-ADV
secure by using trusted hardware to store a PRP seed and key
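The trusted-hardware extension can be sketched with a toy module. The `TrustedModule` class and its counter-based PRF are illustrative assumptions, not the paper's construction: the point is only that a keyed pseudorandom function over a monotonic counter yields fresh, unpredictable challenges each run, so an S-ADV who virtualized responses observed in past runs still faces challenges it has never seen.

```python
import hmac
import hashlib

class TrustedModule:
    """Toy stand-in for trusted hardware holding a PRF key and a
    monotonic counter (both assumed tamper-resistant)."""
    def __init__(self, key: bytes):
        self._key = key
        self._ctr = 0

    def next_challenge(self) -> bytes:
        self._ctr += 1
        # PRF of the counter: the challenge set varies per execution,
        # so recorded responses from earlier runs do not replay.
        msg = self._ctr.to_bytes(8, "big")
        return hmac.new(self._key, msg, hashlib.sha256).digest()

tm = TrustedModule(b"secret-key-in-trusted-hw")
c1, c2 = tm.next_challenge(), tm.next_challenge()
assert c1 != c2  # each run queries the PUF at fresh points
```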
The Software Protection Problem
An (Informal) Impossibility Conjecture
• This argument explains the impossibility of
offline software protection against an S-ADV
• Our program must have randomness
• Probabilistic Turing machines:
– 2 tapes: input tape (read/write), random tape (read-only)
– More than one entry for each cell of the state transition function
– Bits on the random tape determine which entry is used
• The offline requirement means a read-only random tape cannot be
enforced
The Software Protection Problem
Rethinking the Problem
• (S-ADV secure) Software protection is
impossible using traditional PUFs
– Pointless to combine PUFs with trusted hardware:
traditional PUFs are as expensive as trusted hardware
– Avoid using black-box PUFs, they violate our goal
of not using additional hardware
• Certain features of computing devices such as
timing and delay characteristics cannot be
captured using TMs, RAMs, etc.
– Need systems approach to solve this problem
Intrinsic Personal PUFs (IP-PUFs)
Definition
• Intrinsic PUFs: PUFs that are intrinsically involved in processing
information that is to be protected against duplication
• Personal because they are present on every personal computing device
• Timing characteristics of regular hardware can be used as an intrinsic-PUF
– Rather than being peripheral devices that are challenged, such PUFs are
involved in the computation actually being performed
– E.g., CPUs, RAM, FPUs, etc.
– There is no additional cost to build such PUFs
• Benefits
– PUFs are already widely available
Intrinsic Personal PUFs (IP-PUFs)
Why your computer is an IP-PUF
• Sources of unclonable randomness:
– CPU Clock
– Timer Interrupt Clock
– Memory (Cache and RAM)
• Clocks
– It is extremely unlikely that two processors have the exact same
operating frequencies (even if they end up in the same bin)
– This is due to unclonable features such as the cut of the
crystal, the level of impurities, etc.
• Memory
– Time delays and dead cells are unique to each memory unit
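The timing characteristics above can be observed directly. The sketch below is illustrative only, with assumed parameters (buffer size, stride, repetition count): it times a fixed memory-walk workload and summarizes the distribution, which reflects machine-specific clock and memory behavior. A real IP-PUF scheme would need careful statistics and error correction to turn this into a stable fingerprint.

```python
import time
import statistics

def timing_samples(buf_size: int = 1 << 16, reps: int = 50) -> list[int]:
    """Time a fixed memory-walk workload repeatedly; the resulting
    timing distribution depends on this machine's clock and memory."""
    buf = bytearray(buf_size)
    samples = []
    for _ in range(reps):
        t0 = time.perf_counter_ns()
        acc = 0
        for i in range(0, buf_size, 64):  # stride of roughly one cache line
            acc += buf[i]
        samples.append(time.perf_counter_ns() - t0)
    return samples

samples = timing_samples()
# Fingerprint candidates: robust summary statistics of the distribution
# (a deployed scheme would quantize these with error correction).
print(statistics.median(samples), statistics.pstdev(samples))
```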
Software Protection Using IP-PUFs
One Idea – Race Conditions
• Create “artificial” race-conditions.
– Remove synchronization (semaphores, locking, etc) from threads.
– Threads resolve correctly only while executing on the correct PUF
• i.e., the “correct clobbering” of variables happens only on the correct system.
– Otherwise, behavior is strange and unexpected.
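A deliberately racy write can be sketched as below. This is only a toy illustration of the idea, not the proposed scheme: two unsynchronized threads clobber a shared variable, and which write lands last depends on thread timing. (In CPython the GIL and busy-wait delays make this tamer than native threads; a real binding would use unmanaged code where timing is machine-specific.)

```python
import threading

shared = {"x": 0}

def writer(val: int, delay_iters: int) -> None:
    # No locks: deliberately racy. Which write lands last
    # depends on scheduling and machine-specific timing.
    for _ in range(delay_iters):
        pass  # busy-wait, nominally tuned to the target machine
    shared["x"] = val

t1 = threading.Thread(target=writer, args=(1, 10_000))
t2 = threading.Thread(target=writer, args=(2, 20_000))
t1.start(); t2.start()
t1.join(); t2.join()
# On the "correct" machine the delays are tuned so the intended value
# is clobbered in last; elsewhere the ordering, and hence program
# behavior, may differ.
print(shared["x"])
```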
Software Protection Using IP-PUFs
Why Race Conditions?
• Does not use the IP-PUF as a black-box device.
• Race conditions are a hard problem to detect and fix
• Debuggers are useless
– interfere with timing of instructions and the way
race conditions are resolved.
Main Contributions
• New software protection adversary models
– Weak and Strong variants
– Software protection schemes for each variant
• An impossibility conjecture:
– Traditional PUFs cannot solve the offline software
protection problem
Future Work
• Theory:
– Formal proofs of equivalence of the hardness of obfuscation,
white-box crypto, and offline S-ADV software protection
• Systems:
– Build real applications using IP-PUFs
– Perhaps with race-condition binding?