Cryptographic Model for Access-Control


Presentation Transcript


A Cryptographic Model for Access-Control
Shai Halevi, Paul Karger, Dalit Naor

Information-flow Aspects of Cryptographic Models
Shai Halevi, Manoj Prabhakaran, Yael Tauman Kalai


A Typical Cryptographic Model




This Talk
Trying to reconcile "trust models" in cryptography and access-control
- Cryptographic models
- Access-control models
- Something in between
Case study: object storage
- The issue: distributed storage servers
- The (obvious?) protocol: capabilities
- Delegation: problem and solution
- What did we get?




Crypto Models: Probabilistic Games
Participants
- Honest players: run the prescribed protocol
- Adversarial players: arbitrary (PPT)
- In between …
Interfaces
- Interaction between participants
- Things that can be observed by the environment
Rules of the game
- Initial setup, scheduling, timing, randomness, …


Real/Abstract World Paradigm [GMW86 … Ca01 …]
- Real-world probabilistic game: the capabilities of an attacker
- Abstract-world probabilistic game: the abstraction that we want to realize
The standard spell: for every real-world adversary there exists an abstract-world adversary s.t. the observable interfaces look the same
- Similar to proving that a program meets its specification, but for the presence of the arbitrary adversaries


An Example: Secure Channels [CK02]
- Real world: picture from above, but players share secret keys
- Abstraction: a tunnel through the network, but the adversary can drop messages and see when messages are sent (and their length)
- We would prefer a "perfect tunnel" abstraction, but cannot realize it


Implied Trust Model: Discretionary Access-Control
- Secrets/objects/messages belong to users
  - Secret = something the user knows and the adversary does not
- Users have discretion to control access to their objects
  - I'm willing to send my file to Joe, but not to Dan
  - I'll encrypt it so Dan cannot read it
- Once a secret gets to the adversary, we lost the game
  - If Joe cooperates with the adversary, oops


Mandatory Access Control
- Secrets/objects belong to "the system"
  - Secret = an object that is marked `secret' (think `secret' vs. `unclassified')
- Users have clearance levels
- Labels define information-flow limitations
  - A process of an `unclassified' user should never get a `secret' object
  - That's called confinement [La73]


The Fallacy of Trusting High Clearance
- "To give a secret object to Bill, you must trust that Bill is honest"
- That's a fine sentiment when Bill is a person, but the object is given to a computer process
- Running on behalf of a `secret' user ≠ not being malicious
- Example: Bill edits the corporate strategy document (surely a `secret' object) using MS-Word, infected with the latest virus


Access-Control Policy for Confinement [BL73] [Wa74]
- Reading a `secret' object requires a process with `secret' clearance
  - More generally, a process at level x can only read objects at levels x and below
- A `secret' process can only write `secret' objects
  - More generally, a process at level x can only write objects at level x
- Bill's MS-Word virus cannot leak the secret document by writing it to an `unclassified' file
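The read/write rules above can be sketched as two small checks. This is a minimal illustration, assuming just two levels; the `Level` enum and the `can_read`/`can_write` names are ours, not from the talk.

```python
from enum import IntEnum

class Level(IntEnum):
    UNCLASSIFIED = 0
    SECRET = 1

def can_read(process_level: Level, object_level: Level) -> bool:
    """No read up: a process may only read objects at its level or below."""
    return object_level <= process_level

def can_write(process_level: Level, object_level: Level) -> bool:
    """Strict rule from the slide: a process writes only at exactly its own level."""
    return object_level == process_level

# Bill's infected `secret' editor can read the strategy document,
# but cannot leak it by writing to an `unclassified' file:
assert can_read(Level.SECRET, Level.SECRET)
assert not can_write(Level.SECRET, Level.UNCLASSIFIED)
assert not can_read(Level.UNCLASSIFIED, Level.SECRET)
```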


Enforcing the Policy
- We must have "trustworthy components"
  - Trustworthy = we really, really, really believe that they are not infected with viruses
  - Because they are small and simple, and we stared at their code long enough to be convinced that it is not buggy
- They have to be at the entry-point to the network
  - A process with unmitigated Internet access cannot be confined


Trustworthy Components
- The OS kernel is not a good candidate
  - It's typically quite large and complex; usually not that hard to infect it with viruses
- Neither is an application on top of the OS
  - It cannot be trusted more than its OS
- Maybe special-purpose network cards
  - You can buy "evaluated" network cards today
  - Evaluated = someone went through the trouble of convincing a third-party examiner that there are no bugs
  - May include code proving


The Modified Real-World Model
- Angel-in-a-box: small, simple, trustworthy


Achieving Confinement
- The angel encrypts outgoing communication and decrypts incoming communication
  - E.g., using IPSec
  - The `secret' angel encrypts using the key of `secret'
  - The `unclassified' angel is not given the key of `secret'
- Note: the processes can still communicate using timing / traffic analysis
  - The current model does not deal with those
  - Sometimes they can be dealt with; sometimes the application can live with them
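The key-separation idea above can be sketched in a few lines: each angel holds only the keys for its own level and below, so `unclassified' traffic can never be decrypted into `secret' content. The cipher here is a toy SHA-256-in-counter-mode stream (illustrative only, not a vetted construction; a deployment would use IPSec as the slide says), and all names are ours.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256 counter-mode keystream."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

# One key per level; an angel's keyring stops at its own level.
level_keys = {"unclassified": secrets.token_bytes(32),
              "secret": secrets.token_bytes(32)}
secret_angel_keys = dict(level_keys)                              # both keys
unclassified_angel_keys = {"unclassified": level_keys["unclassified"]}

nonce = secrets.token_bytes(16)
ct = keystream_xor(secret_angel_keys["secret"], nonce, b"corporate strategy")

# The `secret' angel can decrypt; the `unclassified' angel simply lacks the key.
assert keystream_xor(secret_angel_keys["secret"], nonce, ct) == b"corporate strategy"
assert "secret" not in unclassified_angel_keys
```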


The Abstract-World Model
- Similar changes: the angels have secure channels between them
- Some subtleties
  - Treat traffic analysis as a resource, so the abstract-world adversary cannot do "more traffic analysis" than in the real world
- More interesting questions arise when dealing with more involved models
  - E.g., generic secure function evaluation


Object Storage


In The Beginning
- There was local storage…
- …and then file servers
- But the server was always too busy, being in the critical path of every I/O


Storage Area Networks (SANs)
- Many clients, many disks
- A server may still keep track of what file goes where, allocation tables, free space, etc.
  - But it is not on the critical path for I/O
- No capacity for access control
  - Dumb disks: obey every command
  - A misbehaving client cannot be stopped


Object Storage [CMU96 … Go99]
- Smarter disks: understand files (objects), know how to say no
  - But don't have a global view
- Need a "meta-data server"
  - Knows what object goes where
  - Decides on access control
  - But is not in the critical I/O path
- Disks should enforce the server's decisions


Capabilities [DvH66]
- Client gets a "signed note" from the server, aka a capability
  - "The holder of this note is hereby granted permission to read object #13. — Good ol' server"
- Disk verifies the capability before serving the command
- Capabilities are sent over secure channels, so the adversary cannot get them
- This (essentially) was the T10 proposed standard for object stores
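One common way to realize the "signed note" is a MAC under a key shared by the server and the disk. This is a minimal sketch under that assumption; the `make_capability`/`disk_verify` names are illustrative and not taken from the T10 proposal.

```python
import hashlib
import hmac

# Assumed shared secret between the meta-data server and the disk.
disk_key = b"shared-between-server-and-disk"

def make_capability(key: bytes, object_id: int, op: str) -> tuple[bytes, bytes]:
    """Server issues a note ("read object #13") plus a MAC tag over it."""
    note = f"{op}:object#{object_id}".encode()
    tag = hmac.new(key, note, hashlib.sha256).digest()
    return note, tag

def disk_verify(key: bytes, note: bytes, tag: bytes) -> bool:
    """Disk recomputes the MAC before serving the command."""
    return hmac.compare_digest(hmac.new(key, note, hashlib.sha256).digest(), tag)

cap = make_capability(disk_key, 13, "read")           # server -> client
assert disk_verify(disk_key, *cap)                    # disk accepts the note
assert not disk_verify(disk_key, b"read:object#14", cap[1])  # forgery rejected
```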


Capabilities Cannot Enforce Confinement [Bo84, KH84]
- Imagine a `secret' process S that wants to leak secrets to an `unclassified' process U
  1. U gets a write capability to unclassified object #7
  2. U copies the capability itself into object #7
  3. S gets a read capability to object #7
  4. S reads the write capability off object #7
  5. S uses the write capability to copy secrets into object #7
  6. U gets a read capability to object #7
  7. U reads the secrets off object #7
- The problem: unrestricted delegation
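The leak runs mechanically once capabilities are bearer tokens: anyone holding the token may use it. A toy run of the steps above, with objects as a byte store and capabilities as copyable strings (all names are illustrative):

```python
# Object #7 is `unclassified'; capabilities are plain bearer tokens,
# so they can be stored in objects and picked up by anyone.
objects = {7: b""}
WRITE7, READ7 = "write#7", "read#7"

def write(cap: str, obj_id: int, data: bytes) -> None:
    assert cap == f"write#{obj_id}", "disk: bad capability"
    objects[obj_id] = data

def read(cap: str, obj_id: int) -> bytes:
    assert cap == f"read#{obj_id}", "disk: bad capability"
    return objects[obj_id]

write(WRITE7, 7, WRITE7.encode())            # U stashes its write capability
leaked_cap = read(READ7, 7).decode()         # S reads the capability back out
write(leaked_cap, 7, b"the secret plans")    # S writes secrets "down" to #7
assert read(READ7, 7) == b"the secret plans" # U reads the secrets
```

The disk honestly checks every capability, yet information still flows from `secret' to `unclassified' — the delegation itself was the covert channel.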


Restricting Delegation
- Tie the capability to clients' names (and authenticate clients)
  - "Client #176 is hereby granted permission to read object #13. — Good ol' server"
- Controlled delegation is still possible
  - E.g., you can give a name to a group of clients
- What else is there to say?
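Binding the note to a name is a one-line change to the MAC-based sketch: the tag now covers the client's identity, so a copied note is useless to any other authenticated client. Again, the function names are illustrative assumptions.

```python
import hashlib
import hmac

disk_key = b"shared-between-server-and-disk"

def make_capability(key: bytes, client_id: int, object_id: int, op: str):
    """The note now names its holder, and the MAC covers that name."""
    note = f"client#{client_id}:{op}:object#{object_id}".encode()
    return note, hmac.new(key, note, hashlib.sha256).digest()

def disk_verify(key: bytes, authenticated_client: int,
                note: bytes, tag: bytes) -> bool:
    """Disk checks both the MAC and that the note was issued to this client."""
    if not note.startswith(f"client#{authenticated_client}:".encode()):
        return False                      # note was issued to someone else
    return hmac.compare_digest(hmac.new(key, note, hashlib.sha256).digest(), tag)

note, tag = make_capability(disk_key, 176, 13, "read")
assert disk_verify(disk_key, 176, note, tag)      # rightful holder
assert not disk_verify(disk_key, 999, note, tag)  # copied note fails for #999
```

This breaks the leak from the previous slide: S can still read U's stashed note, but when S presents it the disk sees an authenticated identity that doesn't match the name inside the note.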


Security Proof
- We want to prove security
- Must formally specify the real-world model and the abstraction
- The real-world model is straightforward


The Abstraction
- Roughly, we realized the file-server abstraction, but not quite…
  - The adversary knows about the different disks
  - The adversary can do traffic analysis between server and disks
  - The adversary can block messages between server and disks
  - So access revocation does not work the same way


Last Updated: 8th March 2018
