A Cryptographic Model for Access-Control
Shai Halevi, Paul Karger, Dalit Naor

Also: Information-flow Aspects of Cryptographic Models
Shai Halevi, Manoj Prabhakaran, Yael Tauman Kalai

A Typical Cryptographic Model

Reality…

This talk
o Trying to reconcile "trust models" in cryptography and access-control
  • Cryptographic models
  • Access-control models
  • Something in between
o Case study: object storage
  • The issue: distributed storage servers
  • The (obvious?) protocol: capabilities
  • Delegation: problem and solution
  • What did we get?

Models

Crypto Models: Probabilistic Games
o Participants
  • Honest players: run the prescribed protocol
  • Adversarial players: arbitrary (PPT)
  • In between …
o Interfaces
  • Things that can be observed by the environment
o Interaction between participants
  • Rules of the game
  • Initial setup, scheduling, timing, randomness, …

Real/Abstract World Paradigm [GMW86 … Ca01 …]
o Real-world probabilistic game
  • The capabilities of an attacker
o Abstract-world probabilistic game
  • The abstraction that we want to realize
o The standard spell: ∀ real-world adversary ∃ abstract-world adversary s.t. the observable interfaces look the same
  • Similar to proving that a program meets its specification
  • But for the presence of the arbitrary adversaries

An Example: Secure Channels [CK02]
o Real world: the picture from above
  • But players share secret keys
o Abstraction: a tunnel through the network
  • But the adversary can drop messages, and see when messages are sent (and their length)
  • We would prefer a "perfect tunnel" abstraction, but cannot realize it

Implied Trust Model: Discretionary Access-Control
o Secrets/objects/messages belong to users
  • Secret = something the user knows and the adversary does not
o Users have discretion to control access to their objects
  • I'm willing to send my file to Joe, but not to Dan
  • I'll encrypt it so Dan cannot read
o Once a secret got to the adversary, we lost the game
  • If Joe cooperates with the adversary, oops

Mandatory Access Control
o Secrets/objects belong to "the system"
  • secret = an object that is marked `secret' (think `secret' vs. `unclassified')
o Users have clearance levels
o Labels define information-flow limitations
  • A process of an `unclassified' user should never get a `secret' object
  • That's called confinement [La73]

The Fallacy of Trusting High Clearance
o "To give a secret object to Bill, you must trust that Bill is honest"
  • That's a fine sentiment when Bill is a person
  • But the object is given to a computer process
o Running on behalf of a `secret' user ≠ not being malicious
o Example: Bill edits the corporate strategy document (surely a `secret' object)
  • Using MS-Word, infected with the latest virus

Access-Control Policy for Confinement [BL73] [Wa74]
o Reading a `secret' object requires a process with `secret' clearance
  • More generally, a process at level x can only read objects at levels x and below
o A `secret' process can only write `secret' objects
  • More generally, a process at level x can only write objects at level x
  • Bill's MS-Word virus cannot leak the secret document by writing it to an `unclassified' file
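To make the two rules concrete, here is a minimal Python sketch of the confinement check; the level names and their numeric ordering are illustrative assumptions, not taken from [BL73].

```python
# A minimal sketch of the confinement policy above; level names and
# their numeric ordering are illustrative, not part of [BL73].
LEVELS = {"unclassified": 0, "secret": 1}

def can_read(process_level, object_level):
    # A process at level x may read objects at level x and below.
    return LEVELS[process_level] >= LEVELS[object_level]

def can_write(process_level, object_level):
    # A process at level x may write only objects at exactly level x,
    # so Bill's infected editor cannot write the document down to an
    # `unclassified' file.
    return LEVELS[process_level] == LEVELS[object_level]

assert can_read("secret", "unclassified")        # read-down: allowed
assert not can_read("unclassified", "secret")    # read-up: blocked
assert not can_write("secret", "unclassified")   # write-down: blocked
```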
Enforcing the Policy
o We must have "trustworthy components"
  • Trustworthy = we really really really believe that they are not infected with viruses
  • Because they are small and simple, and we stared at their code long enough to be convinced that it is not buggy
o They have to be at the entry-point to the network
  • A process with unmitigated Internet access cannot be confined

Trustworthy Components
o The OS kernel is not a good candidate
  • It's typically quite large and complex
  • Usually not that hard to infect it with viruses
o Neither is an application on top of the OS
  • Cannot be trusted more than its OS
o Maybe special-purpose network cards
  • You can buy "evaluated" network cards today
  • Evaluated = someone went through the trouble of convincing a third-party examiner that there are no bugs
  • May even include proofs about the code

The Modified Real-World Model
Angel-in-a-box: small, simple, trustworthy

Achieving Confinement
o The angel encrypts outgoing communication and decrypts incoming communication
  • E.g., using IPSec
  • The `secret' angel encrypts using the key of `secret'
  • The `unclassified' angel is not given the key of `secret'
o Note: the `secret' and `unclassified' processes can still communicate using timing / traffic analysis
  • The current model does not deal with those
  • Sometimes they can be dealt with
  • Sometimes the application can live with them

The Abstract-World Model
o Similar changes: the angels have secure channels between them
o Some subtleties
  • Treat traffic analysis as a resource
  • So the abstract-world adversary cannot do "more traffic analysis" than in the real world
o More interesting questions arise when dealing with more involved models
  • E.g., generic secure function evaluation

Object Storage

In The Beginning
o There was local storage …
o … and then file servers
o But the server was always too busy
  • being in the critical path of every I/O

Storage Area Networks (SANs)
o Many clients, many disks
o The server may still keep track of what file goes where
  • allocation tables, free space, etc.
o But it is not on the critical path for I/O
o No capacity for access control
  • Dumb disks: obey every command
  • A misbehaving client cannot be stopped

Object Storage [CMU96 … Go99]
o Smarter disks:
  • Understand files (objects)
  • Know how to say no
  • But don't have a global view
o Need a "meta-data server"
  • Knows what object goes where
  • Decides on access control
  • But is not in the critical I/O path
o Disks should enforce the server's decisions

Capabilities [DvH66]
o Client gets a "signed note" from the server, aka a capability
  • "The holder of this note is hereby granted permission to read object #13." (signed: Good ol' server)
o Disk verifies the capability before serving a command
o Capabilities are sent over secure channels, so the adversary cannot get them
o This (essentially) was the T10 proposed standard for object stores
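As a rough sketch (not the actual T10 encoding, which these slides do not spell out), a capability can be realized as a note authenticated under a key shared by the meta-data server and the disk. The names and field layout below are illustrative assumptions.

```python
# Sketch of a capability as a "signed note" (assumption: the meta-data
# server and the disk share a symmetric key, and the note is
# authenticated with HMAC; this is not the actual T10 encoding).
import hashlib
import hmac

DISK_KEY = b"key shared by meta-data server and disk"  # illustrative only

def mint_capability(object_id, rights):
    # Server side: grant `rights` on `object_id` to whoever holds the note.
    note = f"the holder may {rights} object #{object_id}".encode()
    tag = hmac.new(DISK_KEY, note, hashlib.sha256).digest()
    return note, tag

def disk_serves(note, tag, object_id, rights):
    # Disk side: verify the note before serving the command.
    expected = f"the holder may {rights} object #{object_id}".encode()
    valid_tag = hmac.compare_digest(
        tag, hmac.new(DISK_KEY, note, hashlib.sha256).digest())
    return valid_tag and note == expected

note, tag = mint_capability(13, "read")
assert disk_serves(note, tag, 13, "read")        # valid capability
assert not disk_serves(note, tag, 13, "write")   # rights do not match
```

Note that the note names no particular holder: the disk serves whoever presents it. That is exactly the unrestricted delegation exploited on the next slide.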
Capabilities Cannot Enforce Confinement [Bo84, KH84]
o Imagine a `secret' process S that wants to leak secrets to an `unclassified' process U
  • U gets a write capability to unclassified object #7
  • U copies the capability itself into object #7
  • S gets a read capability to object #7
  • S reads the write capability off object #7
  • S uses the write capability to copy secrets into object #7
  • U gets a read capability to object #7
  • U reads the secrets off object #7
o The problem: unrestricted delegation

Restricting Delegation
o Tie the capability to clients' names (and authenticate clients)
  • "Client #176 is hereby granted permission to read object #13." (signed: Good ol' server)
  • (a minimal sketch of name-bound capabilities appears at the end of this deck)
o Controlled delegation is still possible
  • E.g., you can give a name to a group of clients
o What else is there to say?

Security Proof
o We want to prove security
  • Must formally specify the real world and the abstraction
o The real-world model is straightforward

The Abstraction
o Roughly, we realized the file-server abstraction
o But not quite…
  • The adversary knows about the different disks
  • It can do traffic analysis between the server and the disks
  • It can block messages between the server and the disks
  • So access revocation does not work the same way

Recap
o Incorporated information-flow restrictions into cryptographic models
  • A new dimension
  • Somewhat related to collusion-prevention [LMS '04]
  • Many open questions
  • E.g., what functions can be computed "this way"
o Application to the object-store protocol
  • The proposed standard did not address this issue
  • Modified the protocol to get confinement

Morals
o We should strive to align cryptography with realistic trust models
  • E.g., try to keep secret keys in processes that can be trusted to keep them secret
o When physical assumptions are needed, try to use reasonable ones
  • Trusted network cards may be reasonable; "secure blobs" probably are not
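As referenced on the "Restricting Delegation" slide, here is a minimal sketch of tying a capability to an authenticated client name. How the disk authenticates the client (e.g., over the secure channel) is not shown, and the encoding is an illustrative assumption, not the T10 format.

```python
# Sketch of a name-bound capability: the note names the client, and the
# disk checks it against the identity it authenticated on the secure
# channel (authentication itself is out of scope; encoding illustrative).
import hashlib
import hmac

DISK_KEY = b"key shared by meta-data server and disk"  # illustrative only

def mint_capability(client_id, object_id, rights):
    note = f"client #{client_id} may {rights} object #{object_id}".encode()
    tag = hmac.new(DISK_KEY, note, hashlib.sha256).digest()
    return note, tag

def disk_serves(authenticated_client, note, tag, object_id, rights):
    # The note is useless to any client other than the one named in it,
    # so copying it into object #7 no longer lets S and U leak secrets.
    expected = (f"client #{authenticated_client} may {rights} "
                f"object #{object_id}").encode()
    valid_tag = hmac.compare_digest(
        tag, hmac.new(DISK_KEY, note, hashlib.sha256).digest())
    return valid_tag and note == expected

note, tag = mint_capability(176, 13, "read")
assert disk_serves(176, note, tag, 13, "read")       # the named client
assert not disk_serves(177, note, tag, 13, "read")   # any other client
```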