A Type System for Expressive Security Policies
David Walker
Cornell University
Extensible Systems
[Diagram: downloaded code is linked against the system interface and executed]
• Extensible systems are everywhere:
 – web browsers, extensible operating systems, servers and databases
• Critical Problem: Security
Certified Code
[Diagram: untrusted code plus annotations is downloaded and verified, then linked against the system interface and executed as secure code]
• Attach annotations (types, proofs, ...) to untrusted code
• Annotations make verification of security properties feasible
Certifying Compilation
• An Advantage
– Increased Trustworthiness
• verification occurs after compilation
• compiler bugs will not result in security holes
• A Disadvantage
– Certificates may be difficult to produce
Producing Certified Code
[Diagram: high-level program -> compile (with user annotations) -> annotated program -> optimize -> transmit]
• Certificate production must be automated
• Necessary components:
 1) a source-level programming language
 2) a compiler to compile, annotate, and optimize source programs
 3) a transmission language that certifies security properties
So Far ...
[Diagram: type-safe high-level program -> compile (types) -> typed program -> optimize -> transmit]
1) a strongly typed source-level programming language
2) a type-preserving compiler to compile, annotate, and optimize source programs
3) a transmission language that certifies type-safety properties
Examples
• Proof-Carrying Code [Necula & Lee]
– compilers produce type safety proofs
• Typed Assembly Language [Morrisett, Walker, et al]
– guarantees type safety properties
• Efficient Code Certification [Kozen]
– uses typing information to guarantee control-flow and memory safety properties
• Proof-Carrying Code [Appel & Felty]
– construct types from low-level primitives
Conventional Type Safety
• Conventional types ensure basic safety:
 – basic operations performed correctly
 – abstraction/interfaces hide data representations and system code
• Conventional types don't describe complex policies
 – eg: policies that depend upon history
  • Melissa virus reads Outlook contacts list and then sends 50 emails
Security in Practice
• Security via code instrumentation
– insert security state and check dynamically
– use static analysis to minimize run-time overhead
– SFI [Wahbe et al],
– SASI [Erlingsson & Schneider],
– Naccio [Evans & Twyman],
– [Colcombet & Fradet], …
This Paper
• Combines two ideas:
– certifying compilation
– security via code instrumentation
• The Result:
– a system for secure certified code
• high-level security policy specifications
• an automatic translation into low-level code
• security enforced by static & dynamic checking
Strategy
• Security Automata specify security properties [Erlingsson & Schneider]
• Compilation inserts typing annotations & dynamic checks where necessary
• A dependently-typed target language provides a framework for verification
 – can express & enforce any security automaton policy
 – provably sound
Security Architecture
[Diagram: a high-level program, a security automaton, and the system interface are compiled and annotated into a secure typed program and secure typed interface, which are optimized, transmitted, type checked, and run as a secure executable]
Security Automata
• A general mechanism for specifying security policies
• Enforce any safety property
 – access control policies: “cannot access file foo”
 – resource bound policies: “allocate no more than 1M of memory”
 – the Melissa policy: “no network send after file read”
Example
[Automaton diagram: states start, has read, and bad; edges labeled read(f) and send]
• Policy: No send operation after a read operation
• States: start, has read, bad
• Inputs (program operations): send, read
• Transitions (state x input -> state):
 – start x read(f) -> has read
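A minimal OCaml sketch of this automaton (illustrative, not from the paper; transitions other than start x read(f) -> has read are read off the diagram and the check_send function shown later):

  type state = Start | Has_read | Bad
  type input = Send | Read of string          (* Read f *)

  (* transition function for the "no send after read" policy *)
  let transition (s : state) (i : input) : state =
    match s, i with
    | Start,    Send   -> Start                (* send before any read is fine *)
    | Start,    Read _ -> Has_read             (* start x read(f) -> has read *)
    | Has_read, Read _ -> Has_read
    | Has_read, Send   -> Bad                  (* send after read: violation *)
    | Bad,      _      -> Bad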
Example Cont’d
[Automaton diagram repeated from the previous slide]
• Security automata monitor program execution
• Entering the bad state = security violation
% untrusted program        % s.a.: start state
send();                    % ok -> start
read(f);                   % ok -> has read
send();                    % bad, security violation
Enforcing S.A. Specs
• Every security-relevant operation has an associated function: check_op
• Trusted, provided by the policy writer
• check_op implements the s.a. transition function
check_send(state) =
  if state = start then start
  else bad
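The same check function in OCaml, together with a check_read that is not shown on the slide but follows from the automaton (a sketch, assuming reads are always permitted and merely record that a read has occurred):

  type state = Start | Has_read | Bad

  (* send keeps us in start; from any other state it enters bad *)
  let check_send (st : state) : state =
    if st = Start then Start else Bad

  (* reading never causes a violation; it records that a read occurred *)
  let check_read (st : state) (_f : string) : state =
    if st = Bad then Bad else Has_read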
Enforcing S.A. Specs
• Rewrite programs:
send()
let next_state = check_send(current_state) in
if next_state = bad then
halt
else % next state is ok
send()
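A sketch of the whole untrusted example (send(); read(f); send();) after this rewriting, threading the security state through the inserted checks; the names instrumented, halt, and check_read are illustrative, with halt standing in for whatever the system does on a violation:

  type state = Start | Has_read | Bad

  let check_send st    = if st = Start then Start else Bad
  let check_read st _f = if st = Bad then Bad else Has_read

  let send ()  = print_endline "send"
  let read f   = print_endline ("read " ^ f)
  let halt ()  = failwith "security violation"

  (* send(); read(f); send();  after instrumentation *)
  let instrumented () =
    let st = Start in
    let st = check_send st in
    if st = Bad then halt () else send ();
    let st = check_read st "f" in
    if st = Bad then halt () else read "f";
    let st = check_send st in
    if st = Bad then halt () else send ()    (* halts here: send after read *)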
Questions
• How do we verify instrumented code?
– is this safe?
let next_state = check_send(other_state) in
if next_state = bad then
halt
else % next state is ok
send()
• Can we optimize certified code?
Verification
• Basic types ensure standard type safety
 – functions and data used as intended and cannot be confused
 – security checks can’t be circumvented
• Introduce a logic into the type system to express complex invariants
• Use the logic to encode the s.a. policy
• Use the logic to prove checks unnecessary
Target Language Types
• Predicates:
– describe security states
– describe automaton transitions
– describe dependencies between values
• Function types include predicates so they can specify preconditions:
 – foo: ∀[α1, α2, P1(α1,α2), P2(α1)]. α1 -> α2
Secure Functions
• Each security-relevant function has a type specifying 3 additional preconditions
• eg: the send function:
 – P1: in_state(current_state)
 – P2: transition_send(current_state, next_state)
 – P3: next_state ≠ bad
 Pre: P1 & P2 & P3
 Post: in_state(next_state)
• The precondition ensures calling send won’t result in a security violation
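The paper expresses this precondition with dependent types in the target language; purely as a loose illustration of the idea, here is an OCaml phantom-type sketch in which the only way to obtain a has-read state witness is by calling read, so a send after a read simply does not type check (Monitor, in_state, and the state names are made up for this sketch):

  module Monitor : sig
    type start_state
    type has_read_state
    type 's in_state                      (* witness: "we are in state 's" *)
    val initial : start_state in_state
    val send : start_state in_state -> start_state in_state
    val read : 's in_state -> has_read_state in_state
  end = struct
    type start_state
    type has_read_state
    type 's in_state = unit
    let initial = ()
    let send () = print_endline "send"; ()
    let read () = print_endline "read"; ()
  end

  (* accepted: send before read *)
  let ok = Monitor.read (Monitor.send Monitor.initial)

  (* rejected by the type checker: send after read
     let bad = Monitor.send (Monitor.read Monitor.initial)  *)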
Run-time Security Checks
• Dynamic checks propagate information into the type system
• eg: check_send(state)
  Post: ∃next_state. transition_send(state, next_state) & result = next_state
• conditional tests:
  if state = bad then % assume state = bad ...
  else % assume state ≠ bad ...
Example
% P1: in_state(current_state)
let next_state = check_send(current_state) in
% P2: transition_send(current_state, next_state)
if next_state = bad then halt
else
% P3: next_state ≠ bad
send() % P1 & P2 & P3 imply send is ok
Optimization
• Analysis of s.a. structure makes redundant check elimination possible
 – eg:
[Automaton diagram repeated from the Example slide]
– supply the type checker with the fact transition_send(start, start) and verify:
  if current = start then send(); send(); send(); …
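A small self-contained OCaml sketch of the optimization (toy definitions repeated from the earlier sketches): once a single guard has established current = start, the fact transition_send(start, start) justifies eliding the check before every subsequent send:

  type state = Start | Has_read | Bad

  let send () = print_endline "send"

  (* one test justifies many unchecked sends, because send keeps us in start *)
  let optimized (current : state) =
    if current = Start then begin
      send (); send (); send ()
    end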
Related Work
• Program verification
– abstract interpretation, data flow & control flow analysis, model checking, soft typing, verification condition generation & theorem proving, ...
• Dependent types in compiler ILs
– Xi & Pfenning, Crary & Weirich, ...
• Security properties of typed languages
– Leroy & Rouaix, ...
Summary
• A recipe for secure certified code:
– types
• ensure basic safety
• prevent dynamic checks from being circumvented
• provide a framework for reasoning about programs
– security automata
• specify expressive policies
• dynamic checking when policies can’t be proven statically