A Sensor-Assisted Self-Authentication
for Hardware Trojan Detection
Min Li*, Azadeh Davoodi*,
Mohammad Tehranipoor**
*University of Wisconsin-Madison
**University of Connecticut
WISCAD
Electronic Design Automation Lab
http://wiscad.ece.wisc.edu
Challenges of Hardware Trojan Detection
• Challenges:
– lack of observability and controllability after fabrication
– complexity
• due to the existence of billions of nano-scale components
• due to the high volume of soft and hard integrated IP cores
– overhead associated with physical inspection of nanometer
feature sizes for reverse engineering
• could be intrusive
– difficulty of activating a Trojan
– increasing fabrication and environmental variations with
technology scaling
Fundamental Challenge
• Trojan-free or Golden IC (GIC)
– required in any (generic) IC authentication process
• create a reference fingerprint from the transient behavior of the GIC and
compare it with the fingerprint obtained from the target IC
– existence and identification of a GIC cannot be guaranteed
• if a Trojan is inserted in the GDSII file, or if the foundry alters the mask
to insert a Trojan, a GIC will not exist
• if an IC passes a rigorous test, in theory one cannot conclude that it is
a GIC
Contributions
• A framework that uses custom-designed on-chip detection
sensors to alleviate the need for a golden IC by providing
self-authentication
• On-chip “detection sensor”
– a compact (small area) representation of a design
• can be designed by searching for common “features” in a design
– shares common sources of uncertainty with the design due to
realization on the same chip
• e.g., process and environmental variations
• Assumptions
– Trojan may infect the design paths, the detection sensor, or both
– the detection sensor is obfuscated within the design’s layout
• i.e., an adversary will not be able to distinguish it
Proposed Framework
[Framework flow]
• Design stage: design and integration of custom-generated detection
sensors, capturing within-die variability
• Post-silicon self-authentication process:
– measure the on-chip delay fingerprint of the detection sensors
– measure the on-chip delay fingerprint of arbitrary design paths
– offline analysis of the fingerprint correlation
– if the correlation check does not pass, alert a Trojan
Design Stage
[Framework flow repeated: design-stage detection sensor (a compact
representative of the design) → measured on-chip delays of design paths
and of detection sensors → analyze correlation → detect Trojan]
• the design stage finds the most frequent "layout features", which are
design-dependent and technology-sensitive
• main idea: the addition of a Trojan disturbs the expected delay
correlation between the design and the detection sensor
Design of the Detection Sensor
• Steps
– logic design via netlist analysis
1. sequence matching
2. feature discovery
– physical design of sensors
• layout integration
• delay measurement
Design of A Detection Sensor
• An optimization framework for finding frequent
sequences in a netlist*
– modeled using a graph representation of the design's netlist
– break all or some portions of the graph into a collection of "similar"
sequences
– similar sequences are grouped together and represented by one sensor
– given a budget for the total area used by the detection sensors, the goal
is to maximize coverage of the graph with the formed sequences (a minimal
greedy sketch of this idea appears after the footnote below)
*Li and Davoodi, “Custom On-Chip Sensors for Post-Silicon Failing Path
Isolation in the Presence of Process Variations”, Technical Report, 2011
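The slides defer the exact formulation to the technical report above; purely as
an illustration, the following is a minimal greedy sketch under assumed
simplifications (not the authors' formulation): sequences are fixed-length runs
of consecutive gates, "similarity" is collapsed to identical gate-type
signatures, and every selected signature costs one sensor of a fixed area.

```python
from collections import defaultdict

def select_sensor_sequences(paths, seq_len, sensor_area, area_budget):
    """Greedy sketch (hypothetical stand-in for the paper's optimization):
    pick the most frequent gate-type sequences under an area budget.

    paths       -- netlist paths, each a list of gate-type strings
    seq_len     -- length (in gates) of a candidate sequence
    sensor_area -- assumed area cost of one detection sensor
    area_budget -- total area allowed for all detection sensors
    """
    # Count how often each length-seq_len signature occurs along the paths.
    counts = defaultdict(int)
    for path in paths:
        for i in range(len(path) - seq_len + 1):
            counts[tuple(path[i:i + seq_len])] += 1

    # Greedily keep the most frequent signatures until the budget is spent.
    chosen, used_area = [], 0.0
    for signature, freq in sorted(counts.items(), key=lambda kv: -kv[1]):
        if used_area + sensor_area > area_budget:
            break
        chosen.append((signature, freq))
        used_area += sensor_area

    coverage = sum(freq * seq_len for _, freq in chosen)  # gates covered
    return chosen, coverage

# Toy example: two short paths described by gate types.
paths = [["NAND2", "INV", "NOR2", "INV"], ["NAND2", "INV", "NOR2", "NAND2"]]
print(select_sensor_sequences(paths, seq_len=2, sensor_area=1.0, area_budget=2.0))
```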
Design of the Detection Sensor
• Constraints:
1. sequence constraints
• a sequence is made of consecutive edges
2. similarity constraints
• "similar" sequences are mapped to the same sensor
• similarity is defined based on delay correlation in the presence of
uncertainties such as process variations
3. area constraint
• the summation of the areas of the detection sensors is bounded
(a sketch of the resulting optimization program is given below)
[Figures: original netlist; extended graph after modification; simplified
version shown for illustration]
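The slides list only the constraints; a hedged sketch of the kind of
coverage-maximization program they imply (notation assumed here, not taken
from the source: $x_s$ selects candidate sequence $s$, $w_e$ weights netlist
edge $e$, $A_k$ is the area of sensor $k$, and $B$ is the area budget):

$\max_{x} \;\; \sum_{e \in E} w_e \cdot \mathbf{1}\big[\exists\, s : e \in s,\ x_s = 1\big]$  (maximize covered edge weight)

subject to:
• every selected sequence $s$ is a run of consecutive edges in the netlist graph
• sequences mapped to the same sensor $k$ are "similar" (e.g., delay-correlated
under process variations)
• $\sum_{k} A_k \le B$  (total detection-sensor area bounded)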
Variation-Aware Delay Modeling
[Figure: quad-tree spatial-correlation grid with levels 0, 1, and 2; gates a,
b, and c fall in level-2 cells (2,1), (2,4), and (2,15), inside level-1 cells
(1,1), (1,1), and (1,4), all inside level-0 cell (0,1)]

Gate delays are modeled as a nominal value plus a sensitivity-weighted
channel-length deviation:

$d_a = \mu_a + s_{L_a}\,\Delta L_a$
$d_b = \mu_b + s_{L_b}\,\Delta L_b$
$d_c = \mu_c + s_{L_c}\,\Delta L_c$

Each deviation is the sum of the grid-cell components containing the gate at
every level, plus an independent per-gate term [Agarwal et al., ASPDAC'03]:

$\Delta L_a = \Delta L_{2,1} + \Delta L_{1,1} + \Delta L_{0,1} + r_a$
$\Delta L_b = \Delta L_{2,4} + \Delta L_{1,1} + \Delta L_{0,1} + r_b$
$\Delta L_c = \Delta L_{2,15} + \Delta L_{1,4} + \Delta L_{0,1} + r_c$
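Under this model, the correlation between two gates follows from the quad-tree
levels they share. For example, assuming independent zero-mean level
components with variances $\sigma_2^2$, $\sigma_1^2$, $\sigma_0^2$ (an
assumption for illustration, not stated on the slide):

$\mathrm{Cov}(\Delta L_a, \Delta L_b) = \mathrm{Var}(\Delta L_{1,1}) + \mathrm{Var}(\Delta L_{0,1}) = \sigma_1^2 + \sigma_0^2$
$\mathrm{Cov}(\Delta L_a, \Delta L_c) = \mathrm{Var}(\Delta L_{0,1}) = \sigma_0^2$

so gates a and b, which share the level-1 and level-0 cells, are more strongly
correlated than gates a and c, which share only the level-0 cell.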
Design of A Detection Sensor
• Benefits of the formulation
– flexible definition of similarity between sequences mapped to the same
sensor
• for example, similar sequences could be:
– structurally identical (e.g., same sequence of logic gates)
– have the same timing distribution
– highly correlated in their timing characteristics (a correlation-based
check is sketched below)
– in general, similarity can be defined with respect to sensitivity to
technology parameters
– flexible objective
• if edge weights are equal, the objective is maximizing netlist coverage
• can be modified to also ensure spatial coverage from different regions
of the chip
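As one concrete realization of the correlation-based similarity above (an
assumed illustration, not the authors' exact criterion), two candidate
sequences could be grouped onto the same sensor when their delays, sampled
over the same process-variation scenarios, are strongly correlated:

```python
import statistics

def similar_by_delay_correlation(delays_a, delays_b, threshold=0.95):
    """Treat two sequences as "similar" if their Monte Carlo delay samples
    (taken under the same variation scenarios) have a Pearson correlation of
    at least `threshold`. The 0.95 value is an illustrative assumption."""
    return statistics.correlation(delays_a, delays_b) >= threshold  # Python 3.10+

# Toy delay samples (arbitrary units) for two candidate sequences.
delays_a = [1.00, 1.05, 0.98, 1.10, 1.02]
delays_b = [2.01, 2.09, 1.97, 2.22, 2.05]
print(similar_by_delay_correlation(delays_a, delays_b))  # True: highly correlated
```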
Post-Silicon Measurement
[Framework flow repeated: design-stage detection sensor (a compact
representative of the design) → measured on-chip delays of design paths
and of detection sensors → analyze correlation → detect Trojan]
• delays are measured with on-chip instrumentation, e.g., BIST technology
• first check that the BIST itself is healthy (e.g., by verifying the
delays of embedded ring oscillators)
On-Chip Path Delay Measurement
• Examples
– Path-RO [Tehranipoor et al., ICCAD'08]
• requires inserting measurement circuitry along the desired representative
paths at the pre-silicon stage
– shrinking clock signal [Abraham et al., GLSVLSI'10] (a conceptual sketch
follows)
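Purely as a conceptual illustration of the shrinking-clock idea (the on-chip
hardware itself is not detailed here), the measurement can be viewed as
sweeping the capture period downward until the path first fails; the smallest
passing period then upper-bounds the path delay. A minimal software sketch,
with an assumed `path_captures_correctly()` hook standing in for the on-chip
pass/fail test:

```python
def measure_path_delay(path_captures_correctly, t_start, t_step):
    """Shrink the capture clock period until the path first fails to capture.

    path_captures_correctly(period) -- assumed pass/fail hook standing in for
                                       the on-chip capture test at `period`
    t_start, t_step                 -- initial period and sweep resolution
    Returns the smallest period that still captured correctly, i.e. an
    upper bound on the path delay (within t_step).
    """
    period, last_passing = t_start, None
    while period > 0:
        if path_captures_correctly(period):
            last_passing = period
            period -= t_step      # keep shrinking the clock
        else:
            break                 # first failure: path delay exceeded
    return last_passing

# Toy example: an ideal path whose true delay is 1.23 ns.
print(measure_path_delay(lambda t: t >= 1.23, t_start=2.0, t_step=0.01))
```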
Correlation Analysis and Detection
[Framework flow repeated: design-stage detection sensor (a compact
representative of the design) → measured on-chip delays of design paths
and of detection sensors → analyze correlation → detect Trojan]
• Detection scenarios: a Trojan may be added to the design, the detection
sensor, or both
• The timing correlation between the design and its detection sensors will
be different in the presence of a Trojan
Detection Scenarios
1. Trojan inserted in the design paths
– actual delay range: obtained from direct path delay measurement,
considering measurement error
– predicted delay range: computed using the actual (measured) sensor delay
and predicting the remainder of the path using worst/base-case values
[Figure: the Trojan-infected path used for the actual delay range vs. the
matched path used for the predicted delay range; the sensor supplies the
measured segment and the remainder is a case-based estimate]
Detection Scenarios
1. Trojan inserted in the design paths
– the delays of Trojan-infected paths are underestimated by the prediction
– the Trojan is detected if the predicted delay range does not overlap with
the measured range (see the overlap-check sketch below)
[Figure: predicted and measured delay ranges that do not overlap]
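Across the detection scenarios, the pass/fail decision reduces to an
interval-overlap test between the measured and predicted delay ranges. A
minimal sketch of that decision (the (low, high) tuple representation is an
assumption made here):

```python
def ranges_overlap(measured, predicted):
    """measured, predicted -- (low, high) delay ranges in the same units."""
    return measured[0] <= predicted[1] and predicted[0] <= measured[1]

def authenticate(measured, predicted):
    """Self-authentication decision sketch: alert only when the ranges are
    disjoint; when they overlap, no claim is made (so no wrong answer is
    output, matching the behavior described in the slides)."""
    if ranges_overlap(measured, predicted):
        return "PASS (no claim)"
    return "ALERT: Trojan suspected"

# Toy example: a Trojan inflates the measured path delay but not the
# sensor-based prediction, so the measured range sits above the predicted one.
print(authenticate(measured=(1.30, 1.36), predicted=(1.10, 1.22)))  # ALERT
print(authenticate(measured=(1.15, 1.21), predicted=(1.10, 1.22)))  # PASS
```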
Detection Scenarios
2. Trojan inserted in the detection sensor
– actual delay range: obtained from direct path delay measurement,
considering measurement error (this is the correct range)
– predicted delay range: computed using the Trojan-infected sensor delay and
predicting the remainder of the path using worst/base-case values
[Figure: the path used for the actual delay range vs. the matched path used
for the predicted delay range; the infected sensor supplies the measured
segment and the remainder is a case-based estimate]
Detection Scenarios
2. Trojan inserted in the detection sensor
– overestimation of the predicted range of the design paths
– correctly detects the existence of a Trojan if the two ranges do not
overlap
– can identify that the detection sensor is infected
Detection Scenarios
3. Trojan inserted simultaneously in the design and
detection sensor
– the actual and predicted ranges are both erroneous
– depending on how the Trojan impacts each one, different cases can happen
[Figure: Trojan-infected path used for the actual delay range vs. path with
an infected sensor used for the predicted delay range]
Detection Scenarios
3. Trojan inserted simultaneously in the design and
detection sensor
– a Trojan can be reported only if the predicted and measured ranges do not
overlap; otherwise the framework does not generate any output
Simulation Setup
• Randomly selected a subset of critical design paths from
the ISCAS89 suite
• For each considered path
– inserted a Trojan at a random location on the path
– sensor area budget is 15%
– repeated many times for varying Trojan delays
• 3 to 10% of the delay of the longest path in the circuit (30 different
values uniformly selected from the range)
– assumed on-chip measurement error of 3% for measuring the
delays of the infected paths
– variation modeling
• assumed process variations in channel length and threshold voltage of
transistors, according to a 45nm-technology variation setting and a
5-level spatial correlation model
• considered 10K variation scenarios (a small sketch of the delay sweep and
measurement-error band follows)
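As a small illustration of the sweep described above (the numbers come from
the slide; the helper names and uniform spacing are assumptions), the Trojan
delay values and the ±3% measurement-error band could be generated as:

```python
import numpy as np

def trojan_delay_values(longest_path_delay, n_values=30, lo=0.03, hi=0.10):
    """30 Trojan delay values spanning 3-10% of the longest path delay
    (uniform spacing is one reading of 'uniformly selected')."""
    return longest_path_delay * np.linspace(lo, hi, n_values)

def measured_range(true_delay, error=0.03):
    """Measured delay range under the assumed 3% on-chip measurement error."""
    return (true_delay * (1.0 - error), true_delay * (1.0 + error))

# Toy example with an assumed longest path delay of 10 ns.
print(trojan_delay_values(10.0)[:5])   # first few Trojan delays (ns)
print(measured_range(10.0))            # (9.7, 10.3)
```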
Trojan Inserted in Design
                  sensor and path information        DR (Trojan in design)
Bench      |P|    %Asensor    MR       W/O sensors    Sensor-assisted
s1423      92     13          1        0.58           0.68
s1488      16     12          1        0.60           0.67
s1494      16     12          1        0.48           0.50
s5378      201    12          0.84     0.62           0.66
s9234      169    10          1        0.69           0.71
s13207     331    12          1        0.67           0.73
s15850     136    15          0.97     0.75           0.85
s35932     1765   13          0.96     0.70           0.73
s38417     1587   12          0.99     0.72           0.77
s38584     1112   14          1        0.64           0.80

|P|: number of considered paths; %Asensor: total sensor area (% of the design)
MR: fraction of the paths matched with at least one sensor
DR: detection ratio
Trojan Insertion in Detection Sensor
                  sensor and path information        DR (Trojan in sensor)
Bench      |P|    %Asensor    MR       W/O sensors    Sensor-assisted
s1423      92     13          1        0.03           0.67
s1488      16     12          1        0.04           0.69
s1494      16     12          1        0.04           0.60
s5378      201    12          0.84     0.03           0.58
s9234      169    10          1        0.01           0.75
s13207     331    12          1        0.01           0.77
s15850     136    15          0.97     0.01           0.70
s35932     1765   13          0.96     0.01           0.93
s38417     1587   12          0.99     0.02           0.78
s38584     1112   14          1        0.00           0.97

MR: fraction of the paths matched with at least one sensor
DR: detection ratio
Trojan Detection Rate in s13207
[Plot: detection rate vs. Trojan delay as a percentage of the longest path
delay in s13207]
Conclusions
• Benefits of the detection framework
– alleviates the need for a GIC
– does not output a wrong answer
– detection at a finer granularity
– faster detection of a Trojan
– captures both layout-dependency and technology-dependency
• Limitations
– spatial correlation modeling
– path delay measurement accuracy
– layout integration