Securely Using Untrusted Terminals and Compromised Machines with Human-Verifiable Code Execution

Jason Franklin, Mark Luk, Arvind Seshadri, Adrian Perrig
How can we trust our devices?
 How do I securely use a device I don’t trust?
• Terminal in an Internet café or at a conference
 What about compromised devices?
• My PDA, home/work/car computer
Human-Verifiable Code Execution (HVCE)
 How can I know what is executing on my PDA?
 Example: encrypted data stored on PDA, how can I be assured that no malware steals the password?
 Goal: human-verifiable code execution to prevent malware from undetectably interfering with the application
Human-Verifiable Code Execution
 Problem Statement
• Provide a human verifier with a guarantee that a target executable has been invoked for execution in an untampered environment, such that no other code can interfere with its execution
• Guarantee must be obtained without direct assistance of another computational device and be verifiable by a human
Past Human-Verifiable Code Execution
 We had human-verifiable code execution in the past!
 On punchcard-based mainframes, the programmer could hear when his program was executing!
 Or, in modern times, my machine acts “strange”…I quickly type:
• > ps auwx
 Can we achieve secure human-verifiable code execution on today’s platforms?
Outline
 Vision
 Untrusted Terminal Problem
• Binary and Generalized
 Hardware-based HVCE
 Software-based HVCE
 Building verifiable applications
 User study
 Ongoing Work
 Conclusion
Binary Untrusted Terminal Problem
[Figure: terminals range from trusted (e.g., the IBM 4758 secure coprocessor) to untrusted]
Generalized Untrusted Terminal Problem
 Levels of trust exist beyond completely trusted and untrusted
[Figure: trust spectrum ranging from terminals with trusted hardware and software, to trusted hardware only, to completely untrusted terminals]
Current Approaches
 Partial solutions
• Smart cards
• Visual crypto
• PDAs
Outline
 Vision
 Untrusted Terminal Problem
• Binary and Generalized
 Hardware-based HVCE
 Software-based HVCE
 Building verifiable applications
 User study
 Ongoing Work
 Conclusion
Hardware-Based HVCE
 Based on AMD’s Pacifica and Intel’s LT
• Next-generation platforms for secure virtualization
• Include hardware Trusted Platform Module (TPM)
• New instructions: skinit (AMD) and senter (Intel)
• Provide guarantee of externally-verifiable code execution
 Guarantee not useful to human verifier
• Requires modifications to allow a human verifier
TCG Trusted Platform Module (TPM)
[Figure: TPM block diagram, attached to the platform over the LPC bus. Components: I/O, non-volatile storage (EK, AIK, SRK), Platform Configuration Registers (PCRs), random number generator, secure hash (SHA-1), key generation, and crypto engine (RSA). DIP packaging or integrated into SuperIO chip.]
Basic TPM Functionality
 TPM contains 16 Platform Configuration Registers (PCRs) to store integrity measurements
 Operations on PCRs (see the sketch below)
• TPM_Extend(N, S): PCR_N = SHA-1(PCR_N || S)
• TPM_Read(N): return contents of PCR_N
 TPM contains private key to sign attestations and manufacturer certificate
• Tamper-resistant storage for private key K_AIK^-1
• Manufacturer certificate, for example {K_AIK} signed with K_IBM^-1
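A minimal sketch of the PCR extend/read semantics above (Python with hashlib; the register bank and function names are illustrative, not the TPM command interface):

import hashlib

# Simulated bank of 16 Platform Configuration Registers, each a 20-byte SHA-1 value.
pcrs = [b"\x00" * 20 for _ in range(16)]

def tpm_extend(n: int, s: bytes) -> None:
    """TPM_Extend(N, S): PCR_N = SHA-1(PCR_N || S)."""
    pcrs[n] = hashlib.sha1(pcrs[n] + s).digest()

def tpm_read(n: int) -> bytes:
    """TPM_Read(N): return the contents of PCR_N."""
    return pcrs[n]

# Example: fold the measurement (hash) of a target executable into PCR 0.
target_exe = b"...target executable image..."
tpm_extend(0, hashlib.sha1(target_exe).digest())
print(tpm_read(0).hex())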
Externally-Verifiable Code Execution
 On call to skinit(target_exe_p)
• CPU transfers executable image to TPM over LPC bus
• TPM hashes executable image and stores the hash in a PCR
• CPU sets up untampered execution environment
• Disables interrupts
• Sets up memory protections to disable DMA writes to memory locations containing target executable
• CPU jumps to target executable and executes
[Figure: verifier asks the remote platform “What code are you running?”]
Externally-Verifiable Code Execution
 Verifier sends 160-bit nonce to remote platform
 Remote platform returns signed hash of target executable and nonce
 Verifier uses remote platform’s public attestation identity key to check the signature and verifies that the correct executable has been invoked (see the sketch below)
[Figure: Verifier → Remote platform: 160-bit nonce; Remote platform → Verifier: Sign(H(target_exe, nonce), AIK_priv)]
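A minimal verifier-side sketch of this exchange (Python; an HMAC under a shared key stands in for the TPM’s RSA signature with AIK_priv so the sketch stays runnable with the standard library, and all names are illustrative):

import hashlib, hmac, os

# Stand-in for the AIK: the real platform signs with its private AIK and the
# verifier checks with the public AIK; here a shared HMAC key plays both roles.
AIK_KEY = b"illustrative-aik-key"

def remote_platform_respond(target_exe: bytes, nonce: bytes) -> bytes:
    """Remote platform: return Sign(H(target_exe || nonce), AIK_priv) (HMAC stand-in)."""
    measurement = hashlib.sha1(target_exe + nonce).digest()
    return hmac.new(AIK_KEY, measurement, hashlib.sha1).digest()

def verifier_check(known_good_exe: bytes, nonce: bytes, response: bytes) -> bool:
    """Verifier: recompute the expected signed measurement and compare."""
    expected = hmac.new(AIK_KEY,
                        hashlib.sha1(known_good_exe + nonce).digest(),
                        hashlib.sha1).digest()
    return hmac.compare_digest(expected, response)

# Protocol run: fresh 160-bit nonce, challenge, response, check.
nonce = os.urandom(20)
exe = b"...target executable image..."
response = remote_platform_respond(exe, nonce)
print(verifier_check(exe, nonce, response))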
Enabling a Human Verifier
[Figure: the human verifier interacts with the remote platform only through an untrusted terminal, which relays the 160-bit nonce, the response Sign(H(target_exe, nonce), AIK_priv), and H(target_exe)]
 Human needs list of challenge-response pairs
• Generating pairs requires access to AIK_priv
• Only available to TPM
• Can’t trust software on device to generate pairs
 Solution: add a verify button and utilize existing secure I/O functionality
• Verify button directly invokes skinit on fixed memory location containing executable and arguments
• Human verifies hash of target executable displayed in trusted area of screen
• Eliminates need for challenge-response pairs
• Requires further modifications to next-generation hardware
Outline
 Vision
 Untrusted Terminal Problem
• Binary and Generalized
 Hardware-based HVCE
 Software-based HVCE
 Building verifiable applications
 User study
 Ongoing Work
 Conclusion
Software-Based Approach
 Extend current software-based attestation primitives
• Establish trusted input and output paths for human verifier
 Advantages of software technique:
• Works on legacy devices
• Software-based solutions can be fixed if a vulnerability is discovered
• TPMs require hardware changes to fix vulnerabilities (SHA-1)
• Avoids privacy concerns with remote attestation
 Disadvantages
• Difficult to reason about security
Software-based Attestation Overview
 External, trusted verifier knows expected memory content of device
 Verifier sends challenge to untrusted device
• Assumption: attacker has full control over device’s memory before check
 Device returns memory checksum, assures verifier of memory correctness
[Figure: the external verifier, holding the expected device memory content, sends a challenge to the remote platform and receives a checksum of the device memory in return]
Verifiable Code Execution with Pioneer
 First step to address untampered code execution on untrusted legacy hosts
• Implemented on Intel Pentium IV
 Approach
1. Verify code integrity through software-based attestation
2. Set up untampered code execution environment
3. Execute code
Assumptions and Attacker Model
 Assumptions on verifier
• Knows hardware configuration of remote device
 Assumptions on device (untrusted host)
• Hardware and firmware are trustworthy
• Can only communicate with verifier: no proxy attacks
 Attacker controls device’s software and OS before verification
Verification Function
Design of Pioneer Function
[Figure: the checksum code is the root of trust; it computes the checksum and sets up the untampered execution environment. A hash function then measures the integrity of the target code, which is finally invoked.]
The Pioneer Protocol
[Figure: at time t1 the verifier sends a nonce and input to the remote platform; the checksum code computes cksum over the verification function using the nonce and returns it at time t2; the hash function then measures the target code (hash), and the target code runs on the input (output).]
• Successful verification if: t2 – t1 < expected time && cksum == expected cksum (see the sketch below)
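A minimal verifier-side sketch of the timed check above (Python; the checksum function, expected memory image, and timing threshold are illustrative stand-ins, not the actual Pioneer checksum):

import hashlib, hmac, os, time

EXPECTED_TIME = 0.5          # seconds; illustrative upper bound, not Pioneer's
DEVICE_MEMORY = b"...expected verification function and target code image..."

def device_checksum(memory: bytes, nonce: bytes) -> bytes:
    """Stand-in for the device's checksum code: keyed digest over its memory."""
    return hmac.new(nonce, memory, hashlib.sha1).digest()

def verify(nonce: bytes, get_response) -> bool:
    """Time the device's response and compare it to the locally computed value."""
    expected = device_checksum(DEVICE_MEMORY, nonce)
    t1 = time.monotonic()
    cksum = get_response(nonce)          # device computes and returns its checksum
    t2 = time.monotonic()
    return (t2 - t1) < EXPECTED_TIME and hmac.compare_digest(cksum, expected)

# Honest device simply runs the same checksum over the same memory image.
nonce = os.urandom(20)
print(verify(nonce, lambda n: device_checksum(DEVICE_MEMORY, n)))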
Checksum Requirements
 Optimal implementation: code cannot be optimized further
• Denali project @ HP Labs provides proofs of optimal implementation of short pieces of code
• GNU superopt
• Open challenge to prove optimality of checksum function
 No algebraic optimizations (see the toy example below)
• Checksum has to be computed in its entirety
• Given a memory change, checksum cannot be “adjusted” without recomputation
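A toy illustration of the non-adjustability property (Python; this is not the Pioneer checksum, just a strongly ordered loop in which each step depends on the previous state and a state-derived memory address, so a single-byte change forces full recomputation rather than a local fix-up):

def toy_checksum(memory: bytes, nonce: int, iterations: int = 100_000) -> int:
    """Strongly ordered checksum: state feeds forward through every iteration."""
    state = nonce & 0xFFFFFFFF
    for i in range(iterations):
        addr = (state ^ i) % len(memory)            # pseudorandom memory access
        state = ((state * 1103515245 + memory[addr]) ^ (state >> 7)) & 0xFFFFFFFF
    return state

mem = bytearray(b"verification function image" * 100)
good = toy_checksum(bytes(mem), nonce=0xDEADBEEF)
mem[5] ^= 0x01                                      # attacker flips one byte
bad = toy_checksum(bytes(mem), nonce=0xDEADBEEF)
print(good != bad)   # True with overwhelming probability: no shortcut to "patch" the old checksum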
Desired Security Property
 Verifier’s check is successful if and only if
• Verification function is unmodified
• Untampered execution environment is established
 Intuition: checksum is incorrect or checksum computation slows down if attacker
• Modifies verification function and forges correct checksum, or
• Fakes creation of untampered code execution environment
Challenges on Modern CPUs
 Execution time non-determinism
• Out-of-order execution
• Cache and virtual memory
• Dynamic CPU clock speed scaling
 DMA-based attacks from malicious peripherals
 Interrupt-based attacks
• Non-maskable interrupts
 Attacks using exceptions
 Virtualization-based attacks
Results – Runtime Difference
[Figure: measured runtime difference between legitimate and attacker checksum computation is on the order of 0.3 ms]
Enabling a Human Verifier
 Humans have limited powers of perception and computation
 Challenges
• Can’t create secure random nonces
• Can’t compute correct checksum
• Can’t accurately time checksum computation (0.3 ms?)
Enabling a Human Verifier
 Provide user access to list of challenge-response pairs and detection threshold
• Similar to two-factor authentication in European online banking systems
 Increase time of checksum computation
 Human measures checksum time using stopwatch
Challenge-Response Pair Generation and Management
 Pairs must be unpredictable and secret
 Threats to generation process include:
• Pair creator could be malicious
• Legitimate challenge-response pair leaked
 Only require one pair to bootstrap creation of list (see the sketch below)
• Issue challenge to device
• Check response
• Generate additional pairs in untampered execution environment
 Recommend maintaining a list of challenge-response pairs
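A minimal sketch of the bootstrap idea (Python; the checksum stand-in and all names are hypothetical placeholders for the verified device-side code, not the paper's implementation):

import hashlib, hmac, os

SECRET_MEMORY_IMAGE = b"...device verification function image..."

def device_response(challenge: bytes) -> bytes:
    """Stand-in for the device's checksum over its memory, keyed by the challenge."""
    return hmac.new(challenge, SECRET_MEMORY_IMAGE, hashlib.sha1).digest()

def bootstrap_pairs(known_pair, n_new: int = 5):
    """Use one trusted challenge-response pair to verify the device, then have the
    now-verified environment generate additional pairs for later use."""
    challenge, expected = known_pair
    if not hmac.compare_digest(device_response(challenge), expected):
        raise RuntimeError("device failed verification; do not trust new pairs")
    # In the verified, untampered environment the device can safely answer
    # fresh random 80-bit challenges, yielding new pairs for the user's list.
    return [(c, device_response(c)) for c in (os.urandom(10) for _ in range(n_new))]

initial = (b"0123456789", device_response(b"0123456789"))   # e.g., supplied by the manufacturer
print(bootstrap_pairs(initial))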
Challenge-Response Pair Generation and Management
 Manufacturer could generate initial pairs and include with device
• Use isolated machine with tamper-evident hardware and specialized software
• Does not require additional trusted entities
• Already trust manufacturer
 Could provide out-of-band method to acquire additional pairs
• Phone, mail, or SMS
[Image: IBM 4758 secure coprocessor]
Increasing Computation Time
 Adversary overhead on x86 is around 2%
• Running the checksum longer isn’t reasonable or usable
• Even with a 10 min checksum we only see 12 sec of adversary overhead (2% of 10 min = 12 sec)
 Solution required a simpler architecture
• Redesigned Pioneer for Intel XScale-PXA255 processor
• 16 general-purpose registers, 2 processor status registers, seven processor modes
• Simple architecture reduces number of possible attacks and increases adversary overhead
 Fastest known attack on XScale implementation (memory copy attack) has overhead of close to 40%
Implementation
 Sharp Zaurus PDA SL-6000
• Intel XScale CPU
• Linux kernel 2.4.19
 Implemented as kernel module
• Checksum loop is assembly, remainder is C
• Using 80-bit challenges and responses
Security Analysis
 Probabilistic guessing attacks
• Device attempts to gain a pre-computation advantage by guessing remaining characters of challenge as it is input
• Exploits fact that humans may slowly input challenge
• Countermeasure: use realistic bounds on attacker’s overhead
 Challenge-response pair secrecy, integrity, and freshness
• Various threats including malicious printers and eavesdropping
 Social engineering attacks
• Malicious program asks user to press return after entering challenge
• Malicious program beeps “early” to signal completion
 Denial-of-challenge attack
• Malicious software consistently crashes the device after observing challenges
Outline
 Vision
 Untrusted Terminal Problem
• Binary and Generalized
 Hardware-based HVCE
 Software-based HVCE
 Building verifiable applications
 User study
 Ongoing Work
 Conclusion
Secure Digital Signature Application
 Scenario: User wants to sign a message using a private key encrypted under a password
 Steps to sign message with PGP application (see the sketch below)
• Start signature application and enter password
• Application decrypts private key using password
• Application uses decrypted private key to sign message
• Application erases decrypted key in memory and exits
 Avenues of attack include:
• Malware can steal password used to decrypt private key
• Malware can capture decrypted private key in memory
• Malware can perform a message substitution attack and sign a different message
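A minimal sketch of the signing flow described above (Python, standard library only; PBKDF2 stands in for password-based decryption of the stored private key and an HMAC stands in for the RSA/PGP signature, so the code illustrates the steps rather than the paper's implementation):

import hashlib, hmac

def sign_message(password: str, salt: bytes, message: bytes) -> bytes:
    # 1. Derive the signing key from the password (stand-in for decrypting
    #    the stored private key with the password).
    private_key = bytearray(hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000))
    try:
        # 2. Sign the message with the decrypted key (HMAC stands in for the
        #    RSA/PGP signature operation).
        signature = hmac.new(bytes(private_key), message, hashlib.sha256).digest()
    finally:
        # 3. Best-effort erasure of the decrypted key before returning.
        for i in range(len(private_key)):
            private_key[i] = 0
    return signature

sig = sign_message("correct horse battery staple", b"per-user-salt", b"message to sign")
print(sig.hex())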
Secure Digital Signature Application
 Developed verifiable digital signature application using software-based HVCE
• Implemented as a kernel module
• User loads kernel module and issues challenge
• User checks response and execution time
 Digital signature application executes in an untampered execution environment
 Untampered execution environment provides guarantee that malware did not steal password or private key through eavesdropping or memory monitoring
Digital Signature Application User Study
 12 users with various levels of experience with PDAs
• 7 male and 5 female
• 5 between 18-22, 5 between 23-30, and 2 older than 31
 Implementation: Linux kernel of Sharp Zaurus PDA
 Provided list of challenge-response pairs, e.g.:
• asjmxk → 20fa667fdf059980 (30.2 s)
• tpaskon → ace665f453199a0b (30.2 s)
 Provided four test scripts to user
• 2 used correct verification function
• 1 maliciously computed correct checksum with a delay
• 1 maliciously returned incorrect checksum on time
 Told users to accept if checksum is correct and running time < 42 s
 Results:
• False positives: 0/24
• False negatives (delayed execution): 0/12
• False negatives (incorrect checksum): 1/12
Verification of Voting Machines
 Challenge: verify correctness of software on voting machine
• Incident: wrong software was executing on voting machine on election day
 Approach: leverage human-verifiable code execution protocol to ensure correctness of software on voting machine
Verification of Voting Machines
 County generates challenge-response pairs
• Generated by isolated computer
• Can be validated by public after election
 Challenge-response pair lists sent to poll workers
 Poll workers periodically test voting machine SW
• Type challenge into voting machine
• Measure computation time
• Verify response
 Provides stronger guarantees than random auditing of machines
Ongoing Work
 Improving usability of software-based technique
 Developing techniques to check a thin security-visor which then enforces policy on programs
• Increases TCB
 Increase the number of verifiable applications
 Beginning standardization with NIST
Conclusions
 Described design of hardware- and software-based systems for human-verifiable code execution
• Software-based human-verifiable code execution is feasible on simple architectures
 HVCE can provide trusted path for input and output
• Strong guarantees even to a human verifier
 HVCE enables development of verifiable applications
• Described a sample verifiable application to securely sign messages on an untrusted terminal
 Software-based HVCE can be used today to enable verification of voting machine software integrity without hardware changes
Related Work
 Hardware-based untampered execution
• Cerium: µkernel running inside caches of a tamper-resistant CPU [Chen et al.]
• XOM: Mutually untrusting applications execute on (possibly) malicious OS [Lie et al.]
• Mondrix: Fine-grained memory protection for Linux using the Mondriaan system [Witchel et al.]
 HW-based attestation
• Load-time attestation using TCG specs [Sailer et al.]
• Terra: Use trusted VMM for attestation [Garfinkel et al.]
• Copilot: Use add-in PCI card for attestation [Petroni et al.]
 Software-based attestation
• Genuinity: Uses CPU architecture features to distinguish between execution on actual hardware and a simulator [Kennell et al.]
 Software tamperproofing