ISA 662 Information System Security

4. Security Policies
Chapter 4 of Bishop
1
Security Policy
A statement that partitions all possible system states into:
- Authorized (secure) states
- Unauthorized (non-secure) states
A secure system:
- Starts in an authorized state
- Never enters an unauthorized state
Notes:
1. What does "statement" mean here?
2. This is a state-based requirements specification
[Figure: state diagram with states S1-S4 partitioned into authorized and unauthorized, starting from S2]
2
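To make the state-based view concrete, here is a minimal Python sketch (not from the textbook; the state names and the partition are illustrative) that checks whether a run of a system satisfies the definition above: it starts in an authorized state and never enters an unauthorized one.

AUTHORIZED = {"S1", "S2", "S3"}     # illustrative partition of the state space
UNAUTHORIZED = {"S4"}

def is_secure_run(states):
    """True iff the run starts in an authorized state and never leaves the authorized set."""
    return bool(states) and all(s in AUTHORIZED for s in states)

print(is_secure_run(["S2", "S1", "S3"]))  # True: stays within authorized states
print(is_secure_run(["S2", "S4"]))        # False: enters the unauthorized state S4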
The “CIAs” of Security

- Confidentiality: prevent unauthorized subjects from obtaining information
  - Privacy?
  - Why so important?
- Availability: information must be provided when an authorized entity asks for it
  - Why so important? Disaster?
- Integrity: information may not be changed in unauthorized ways
  - Why so important? Loss of valuable data?
3
Confidentiality


X: a set of subjects, I: information
I has the confidentiality property w.r.t. X if no x ∈ X can obtain information about I
- Example: X = students, I = final exam questions
Subtleties:
- Why "w.r.t. X"?
  - I may not be confidential to students in another class
- Why "information about I"? What is "obtaining"?
  - Suppose one learns "no question in I is easier than 4.12": that is information about I, even though question 4.12 is not included in I
4
Integrity



Integrity = prevent unauthorized modification of information
I has the integrity property w.r.t. X if all x ∈ X trust the information in I
Types of integrity:
- Trust the origin / identity of I (origin integrity, or authentication)
  - Example: phishing
- Trust the conveyance / storage of I (data integrity)
  - Example: popular software hosted by an unpopular website
- Trust the specification of a resource I (assurance)
  - Example: software with a backdoor
Is trust a mechanism to ensure integrity?
5
Availability


I has the availability property w.r.t. X if all x ∈ X can access I
'Can access' usually means 'can access in reasonable time'
- Psychologically acceptable web response time: about 8 seconds
- DoS (DDoS) slows the system down instead of crashing it, perhaps to a sluggish speed
6
Policies and their Models

Policy = statement of safe states
- Abstract description of policy = abstract statement of safe states
Basic things specified in policies:
- Confidentiality
  - Prohibit direct (rights leakage) or indirect (information flow) disclosure of information, possibly under temporal conditions
  - Example: no one should read another's homework; one should not write one's answers elsewhere before the in-class discussion
- Integrity
  - Whether information can be altered; if so, under what conditions and how
  - Example: the software may be distributed without modification
- Availability
  - What service must be provided and at what quality (QoS)
  - Example: a response must reach the browser within 8 seconds
7
Policy: Many dimensions

Formal vs. informal
- Formal = using a formally specified syntax
- Informal = human language or pictures
Degrees of formality
- Used by linguists
- Used by congressional policy makers, lawyers, etc., to provide a sense of non-ambiguity to the executive and judicial branches
  - Needs to be amenable to linguistic argument and to stand up to academic criticism
  - Failure model: the Supreme Court
- No clear division of good and bad
8
From Formal to Machine Understandable?

Why formality?
- Desire for non-ambiguity
What can we do with formal policies?
- Analyze them for
  - Consistency: are they non-contradictory?
  - Completeness: do they cover all the cases?
How do we analyze them?
- By committee? By hand? By machine?
9
Formal Policies during/after Analysis

During analysis:
- Can we use machines to analyze?
- Many proof systems, calculi, etc.
What happens after analysis?
- Enforce them
- Machine enforcement requires executable policies!
- Executable policies vs. executable code
10
Behavioral Policies

Governing the behavior of subjects, objects, and actions:
- What subjects can/cannot do
- Example: statements about socially acceptable behavior (equal opportunity, affirmative action, discrimination)
Safety vs. liveness
- Bad things to avoid
  - Example: do not divulge private information
- Good things that must be done
  - Example: always inform users of changes to user groups; new obligations placed on lenders, etc.
11
Traditional Examples of Security Policies

Military (DoD) security policy
- Primarily protects confidentiality
- Behavior against attack; Privacy Act (and HIPAA)
Commercial security policy
- Primarily protects integrity and availability
- Example: student account server
But
- A confidentiality policy means protecting only confidentiality
- An integrity policy means protecting only integrity
12
Integrity vs. Consistency

Integrity: w.r.t. modifications in malign situations
- Example: software with a Trojan horse in it
Consistency: w.r.t. benign situations
- In a distributed system:
  - You want to draw $20 from an ATM
  - The bank debits $20 from your account
  - The ATM powers down before it gives you the money
  - Does the customer lose $20? Not really. Why not?
  - It's a transaction: two-phase commit
- Not a traditional security mechanism
13
Trust



- a: reliance on the character, ability, strength, or truth of someone
- b: one in whom confidence is placed
- dependence and reliance on future payment for property delivered
14
Trust and Security


Trust: a basic assumption of truthfulness? (circular)
Axiomatic:
- Need to begin by assuming that something is true
- Build the rest on that assumption
Basis for trust:
- Belief (could be social, blind faith, unfounded, or even unjustified)
- Experience
  - Prior knowledge of good behavior
  - Reputation-based trust
  - Credit score, from several credit agencies; up to the individual to determine how to use the data in making a judgment
15
Roles of Trust in system management
(from the textbook)
An administrator installs a patch, and trusts that:
1. The patch came from the vendor and was not tampered with in transit
   - Example: fake Windows patch (see Appendix 1, p. 36)
2. The vendor tested the patch thoroughly
3. The vendor's test environment corresponds to the local environment
4. The patch is installed correctly
16
Formal Verification

Formal verification gives a mathematical proof that, given input i, the protection system P works as specified
- "I have invented an encryption algorithm, and I proved it to be perfectly secure."
- Not a concept of trust; only a proof system that uses some axioms
Important points:
- What are the assumptions of the proof?
- What is the final claim of the proof?
- Does the final claim match the context in which the result is used?
- How complex is the proof?
17
Why Believe in Formal Methods?
1. Proofs have no errors
   - Not true in general [infamous examples of incorrect proofs exist]
2. Preconditions hold in the actual environment
   - Hard to verify
3. The verified design is transformed into executable code whose actions follow the source code
   - Compiler bugs, linker/loader/library problems
   - Real story of Ken Thompson (see Appendix 2, p. 37)
4. No hardware faults: the hardware executes the program as intended
   - Hardware bugs exist (the Pentium f00f bug, http://www.x86.org/errata/dec97/f00fbug.htm, for example)
18
Overview





- Policies
- The Role of Trust
- Types of Access Control
- Policy Expression Languages
- Limits on Precise Security Mechanisms
19
Common Types of Access Control

Discretionary Access Control (DAC, IBAC)
- The individual user sets the access control mechanism to allow or deny access to an object
- Modeled with the access control matrix
Mandatory Access Control (MAC)
- A system mechanism controls access to an object, and individuals cannot alter that access
- Used in military and governmental settings, reflecting strictly hierarchical organizations
- Modeled with lattices
20
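As an illustration of DAC, here is a minimal Python sketch (my own, with made-up subjects, objects, and rights) of an access control matrix in which the owner of an object may grant rights at their discretion.

# Access control matrix: (subject, object) -> set of rights.
acm = {
    ("alice", "report.txt"): {"own", "read", "write"},
    ("bob",   "report.txt"): {"read"},
}

def allowed(subject, obj, right):
    return right in acm.get((subject, obj), set())

def grant(granter, subject, obj, right):
    # Discretionary: only the owner of the object may extend the matrix.
    if "own" in acm.get((granter, obj), set()):
        acm.setdefault((subject, obj), set()).add(right)

grant("alice", "bob", "report.txt", "write")   # the owner grants a new right
print(allowed("bob", "report.txt", "write"))   # True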
Role-based access control
Popular in the business and military worlds
3 basic entities: subjects, roles, permissions
2 basic mappings:
- Subject → Role
- Role → Permission
A subject gets all permissions assigned to its roles
Constraints taken as binary
[Diagram: Subjects --(subject-to-role mapping)--> Roles --(role-to-permissions mapping)--> Permissions]
21
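A minimal Python sketch of the two mappings on this slide (subject → roles and role → permissions); the role and permission names are invented for illustration. A subject's permissions are the union of the permissions of its roles.

subject_roles = {
    "alice": {"instructor"},
    "bob":   {"student"},
}
role_permissions = {
    "instructor": {"read_grades", "write_grades"},
    "student":    {"read_own_grades"},
}

def permissions_of(subject):
    """Union of the permissions of every role assigned to the subject."""
    perms = set()
    for role in subject_roles.get(subject, set()):
        perms |= role_permissions.get(role, set())
    return perms

print(permissions_of("alice"))  # {'read_grades', 'write_grades'}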
Types of Access Control
Another popular concept:
Originator Controlled Access Control (ORCON)
- The originator (creator/owner) of information controls who can access it
- The 'own' right no longer implies control of rights
- Copyright, patent
Some things to remember
- Delegation of rights and of ownership
- Right to recall the system or its rights
22
Other Issues about Access Control



- Identity-based systems
- Attribute-based systems
- Credential-based access control
  - Different from capability-based systems
- Distributed access control
  - Certificate schemas
  - Chasing credential chains, etc.
- Federations and loosely coupled organizations
  - Some aspects of Kerberos
23
Overview





- Policies
- The Role of Trust
- Types of Access Control
- Policy Expression Languages
- Limits on Precise Security Mechanisms
24
Why Policy Expression Languages?


Natural-language policies are ambiguous
- Example: "a student may not copy another student's homework"
- What does 'copy' mean? And what if the due date is past?
To express policies unambiguously requires precise languages. Options:
- Mathematical logic
- Programming-like languages
- Rule languages
25
Policy Languages at Different Levels

High-level languages
- Policies expressed abstractly
- Entities of the enforcement mechanism are not part of the policy expression language's syntax
Low-level languages
- Policy constraints expressed in terms of program options, input, or specific system characteristics
- More closely tied to enforcement mechanisms
Borrowing from programming languages:
- Need to pass context and scope independently of the enforcement level (full abstraction)
26
From high to low level


Policy refinement: enforcement requires policies to be translated into a formal syntax describing the operational semantics of the enforcement environment.
- Example: High level: every student can read his/her own data
- Low level: needs to be written as an ACL or some other constraint on process behavior
[Diagram: high-level policies (applying to selected/all environments) are translated, by a procedure that can be parameterized by policies, into low-level operational constraints or metadata + behavior specifications]
27
Purity Measures on Refinement

Full abstraction:
- Suppose P, Q are higher-level policies and F: (higher level) → (lower level) is a refinement.
- Then F is fully abstract if: [P = Q] iff [F(P) ~ F(Q)]
- Here, = is equality of higher-level policies and ~ is equality of lower-level policies.
- (A definition cooked for this lecture; for the proper definition from programming languages, see http://www.fabfac.org/intro.html)
Compositionality: F is compositional iff F(P ∪ Q) ~ F(P) ∪ F(Q)
28
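A small Python sketch (my own illustration, not from the lecture) that checks the two properties above over a tiny finite policy space: high-level policies are sets of permissions, the refinement F expands each permission into low-level ACL entries, and = / ~ are set equality at the two levels.

POLICIES = [frozenset({"read_hw"}),
            frozenset({"read_hw", "submit_hw"}),
            frozenset({"submit_hw"})]

def F(policy):
    """Refine a high-level policy into low-level ACL entries (illustrative table)."""
    table = {"read_hw":   {"alice r hw.txt", "bob r hw.txt"},
             "submit_hw": {"alice w dropbox", "bob w dropbox"}}
    out = set()
    for perm in policy:
        out |= table[perm]
    return frozenset(out)

def fully_abstract(policies, F):
    # P = Q iff F(P) ~ F(Q), checked over every pair of policies.
    return all((p == q) == (F(p) == F(q)) for p in policies for q in policies)

def compositional(policies, F):
    # F(P ∪ Q) ~ F(P) ∪ F(Q), checked over every pair of policies.
    return all(F(p | q) == F(p) | F(q) for p in policies for q in policies)

print(fully_abstract(POLICIES, F), compositional(POLICIES, F))  # True True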
Example 1: Policy Language for Java



Goal: restrict actions of Java programs that are
downloaded and executed by a web browser
Expresses constraints as conditions restricting
creation of Java class or invocation of methods
(of class)
Independent of enforcement mechanisms
(Windows and Unix would need different
mechanisms)
29
Example 1: Syntax of the Language

Subjects (objects) are classes and methods
- Class: file, socket
- Method: file.read()
Operations
- Instantiation: creates an instance of a class, e.g., -| socket
- Invocation: executes a method, e.g., |-> file.read
Access constraints
- deny(s op s') when b
- While b is true, subject s cannot perform op on (subject or object) s'
- An empty s means all subjects
30
Example 1: Application Scenarios


A downloaded Java program cannot access the password file:
- deny( |-> file.read) when (file.getFileName() == "/etc/passwd")
The program cannot open a network connection when there are already 100+ connections:
- deny( -| socket) when (network.numberOfConns >= 100)
31
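A minimal Python sketch (my own, not the actual enforcement mechanism of this language) of how a monitor could evaluate deny-rules like the two above: before an operation, every matching rule's condition is checked against the current context, and the operation is refused if any condition holds.

rules = [
    # (operation, target, condition on the current context)
    ("invoke",      "file.read", lambda ctx: ctx["file_name"] == "/etc/passwd"),
    ("instantiate", "socket",    lambda ctx: ctx["num_conns"] >= 100),
]

def permitted(operation, target, ctx):
    """Refuse the operation if any matching deny-rule's condition holds."""
    return not any(op == operation and tgt == target and cond(ctx)
                   for op, tgt, cond in rules)

print(permitted("invoke", "file.read", {"file_name": "/etc/passwd", "num_conns": 3}))  # False
print(permitted("instantiate", "socket", {"file_name": "x", "num_conns": 5}))          # True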
Example 2: DTEL


Domain-type enforcement language (DTEL)
A policy has three parts:
- Assign processes (principals/subjects) to domains
- Assign files (objects) to types
- Specify the rights that domains have over types
The model types all files; access granularity is restricted to type (names)
32
Example 2: DTEL Syntax

Domains:
- d_user (ordinary user processes)
- d_admin (administrator processes)
- d_login (login processes)
- d_daemon (system daemons)
Types:
- t_sysbin (executable system files)
- t_readable (readable files)
- t_writable (writable files)
- t_dte (data used by DTEL)
- t_generic (data generated by user processes)
33
Example 2: DTEL Syntax (Cont’d)

Policy in English: only administrator processes can write to system binaries; others cannot.

domain d_admin = (/usr/bin/sh, /usr/bin/csh, /usr/bin/ksh),
                 (crwxd->t_generic),
                 (crwxd->t_readable, t_writable, t_dte, t_sysbin),
                 (sigtstp->d_daemon);

(The last line says a process in d_admin can suspend a daemon process.)

domain d_user = (/usr/bin/sh, /usr/bin/csh, /usr/bin/ksh),
                (crwxd->t_generic),
                (rxd->t_sysbin),
                (crwd->t_writable),
                (rd->t_readable, t_dte);
34
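A minimal Python sketch (illustrative, encoding only a fragment of the policy above) of domain-type enforcement as a lookup from (domain, type) to the rights the domain holds.

dte_rights = {
    ("d_admin", "t_sysbin"):   set("crwxd"),   # administrators may write system binaries
    ("d_user",  "t_sysbin"):   set("rxd"),     # ordinary users may only read/execute/look them up
    ("d_user",  "t_writable"): set("crwd"),
}

def dte_allowed(domain, file_type, right):
    return right in dte_rights.get((domain, file_type), set())

print(dte_allowed("d_admin", "t_sysbin", "w"))  # True
print(dte_allowed("d_user",  "t_sysbin", "w"))  # False: matches the English policy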
Example 3: X Window Access Policy


The UNIX X11 windowing system
Access to an X11 display is controlled by a list
- The list says which hosts are allowed or disallowed access
  xhost +groucho -chico
- Connections from host groucho are allowed
- Connections from host chico are not allowed
Properties of the syntax
- Allows permissions and prohibitions, that is, positive (+) and negative (-) permissions on accesses
35
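A minimal Python sketch of how an allow/deny host list in the style of "xhost +groucho -chico" could be interpreted; this is a simplification of real xhost semantics (for this sketch, hosts not explicitly allowed are assumed to be denied).

def parse_host_list(args):
    """Split '+host' and '-host' entries into allow and deny sets."""
    allow, deny = set(), set()
    for entry in args.split():
        if entry.startswith("+"):
            allow.add(entry[1:])
        elif entry.startswith("-"):
            deny.add(entry[1:])
    return allow, deny

def host_allowed(host, allow, deny):
    return host in allow and host not in deny

allow, deny = parse_host_list("+groucho -chico")
print(host_allowed("groucho", allow, deny))  # True
print(host_allowed("chico",   allow, deny))  # False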
Example 4: Policy in English

GMU Responsible Use of Computing Policy

http://www.gmu.edu/catalog/0001/genpoli2.html
36
XACML: Specifying Access Control in XML



Defined by an OASIS Technical Committee
XACML is a markup language for specifying access control policies over XML-formatted documents
Example on the next page:
- Taken from http://www.idealliance.org/papers/dx_xmle04/papers/04-01-04/04-01-04.html
37
<Rule RuleId="" Effect="Permit">
  <Description>John can open the door.</Description>
  <Target>
    <Subjects>
      <Subject>
        <SubjectMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
          <AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">John</AttributeValue>
          <SubjectAttributeDesignator
              AttributeId="urn:oasis:names:tc:xacml:1.0:subject:subject-id"
              DataType="http://www.w3.org/2001/XMLSchema#string"/>
        </SubjectMatch>
      </Subject>
    </Subjects>
    <Resources>
      <Resource>
        <ResourceMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:anyURI-equal">
          <AttributeValue DataType="http://www.w3.org/2001/XMLSchema#anyURI">door</AttributeValue>
          <ResourceAttributeDesignator
              AttributeId="urn:oasis:names:tc:xacml:1.0:resource:resource-id"
              DataType="http://www.w3.org/2001/XMLSchema#anyURI"/>
        </ResourceMatch>
      </Resource>
    </Resources>
    <Actions>
      <Action>
        <ActionMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
          <AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">open</AttributeValue>
          <ActionAttributeDesignator
              AttributeId="urn:oasis:names:tc:xacml:1.0:action:action-id"
              DataType="http://www.w3.org/2001/XMLSchema#string"/>
        </ActionMatch>
      </Action>
    </Actions>
  </Target>
</Rule>
38
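A minimal Python sketch (not an XACML engine) of the decision the rule above encodes: the rule's target matches requests whose subject-id is "John", resource-id is "door", and action-id is "open", and its effect is Permit; any other request is simply not applicable to this rule.

def evaluate_rule(request):
    """Return the rule's effect if the request matches its target."""
    target = {"subject-id": "John", "resource-id": "door", "action-id": "open"}
    if all(request.get(k) == v for k, v in target.items()):
        return "Permit"
    return "NotApplicable"

print(evaluate_rule({"subject-id": "John", "resource-id": "door", "action-id": "open"}))  # Permit
print(evaluate_rule({"subject-id": "Mary", "resource-id": "door", "action-id": "open"}))  # NotApplicable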
Precise Enforcement of Policies


After we have a policy, does there always exist a mechanism to enforce it?
If so, can we devise a generic procedure for developing such mechanisms?
[Diagram: the set of states reachable with a mechanism lies inside the set of secure states; "secure" vs. "precise" depends on how tightly the two sets coincide]
39
The A. Jones + R. Lipton Model




A program p is modeled as a function p: I1 × I2 × ... × In → R
Assumption on observability:
- All information available about the inputs I1, ..., In is encoded in the function value p(i1, i2, ..., in)
A protection mechanism:
- Let p: I1 × I2 × ... × In → R be a function. A protection mechanism m for p satisfies, for every input, either
  m(i1, i2, ..., in) = p(i1, i2, ..., in) or
  m(i1, i2, ..., in) ∈ E
- That is, m produces the same output as p or an error.
40
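A minimal Python sketch of the definition above, with an illustrative p and an arbitrarily chosen protected case: the mechanism m either returns exactly what p returns or an element of the error set E.

E = {"error"}

def p(i1, i2):
    return i1 + i2

def m(i1, i2):
    if i2 < 0:              # the protected case, chosen only for illustration
        return "error"      # m(i1, i2) is in E
    return p(i1, i2)        # otherwise m agrees with p

print(m(2, 3))    # 5, same as p
print(m(2, -1))   # 'error'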
A Model of Mechanisms


The objective is to secure a program p that takes inputs i1, i2, ..., in and outputs some r
A protection mechanism m takes the same inputs i1, i2, ..., in and outputs either the same r or some error e
[Diagram: the set of states reachable without a mechanism vs. the set of secure states]
41
The A. Jones + R. Lipton Model Cont.



Definition: A confidentiality policy for p: I1 × I2 × ... × In → R is a function c: I1 × I2 × ... × In → A, where A is a subset of I1 × I2 × ... × In
Definition: A security mechanism m is secure with respect to a confidentiality policy c iff there is a function m': A → R ∪ E satisfying
  m(i1, i2, ..., in) = m'(c(i1, i2, ..., in))
Example: consider a password-checking function auth over a database Db, with output {good, bad}
- auth: U × P × Db → {good, bad}, where Db contains the (u, pwd) pairs that are allowed
- Take the confidentiality policy allow(i1, i2, i3) = (i1, i2). Then there is NO function auth' satisfying
  auth'(allow(i1, i2, i3)) = auth'(i1, i2) = auth(i1, i2, i3)
  because auth's output depends on i3 (the database), which allow hides
42
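A minimal Python sketch of the auth example above: the policy allow reveals only (user, password) and hides the database, yet auth's answer depends on the database, so no function auth' of the policy's output alone can reproduce auth.

def auth(user, pwd, db):
    return "good" if (user, pwd) in db else "bad"

def allow(user, pwd, db):          # confidentiality policy: the database stays hidden
    return (user, pwd)

db1 = {("alice", "secret")}        # a database that contains the pair
db2 = set()                        # a database that does not

print(allow("alice", "secret", db1) == allow("alice", "secret", db2))   # True: same policy output
print(auth("alice", "secret", db1), auth("alice", "secret", db2))       # good bad: different results
# Two inputs with the same allow(...) value give different auth(...) values,
# so no auth' with auth'(allow(u, p, db)) == auth(u, p, db) can exist.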
Precision


Mechanisms for enforcing policies are typically overly restrictive
Let m1, m2 be distinct mechanisms for program p under the same policy
- m1 is as precise as m2 (m1 ≈ m2) if, for all inputs i1, ..., in:
  m2(i1, ..., in) = p(i1, ..., in) ⇒ m1(i1, ..., in) = p(i1, ..., in)
[Diagram: the states reachable under m1 and under m2 both lie within the set of secure states]
43
Combining Mechanisms

m3 = m1 ∪ m2 is defined as:
- For inputs on which m1 or m2 outputs the same value as p, m3 does also; otherwise, m3 returns the same value as m1
Theorem: if m1, m2 are secure, then m3 is secure
- Also, m3 ≈ m1 and m3 ≈ m2
[Diagram: the states reachable under the combined mechanism cover those of m1 and m2, within the set of secure states]
44
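A minimal Python sketch (over a small, made-up finite input set) of the precision relation ≈ and of the combination m3 = m1 ∪ m2 described above: m3 agrees with p wherever either m1 or m2 does, and falls back to m1 elsewhere, so m3 is as precise as both.

INPUTS = [0, 1, 2, 3]

def p(i):  return i * i
def m1(i): return p(i) if i <= 1 else "error"
def m2(i): return p(i) if i >= 2 else "error"

def as_precise_as(ma, mb):
    """ma ≈ mb: whenever mb matches p, ma matches p too."""
    return all(ma(i) == p(i) for i in INPUTS if mb(i) == p(i))

def m3(i):
    # The join m1 ∪ m2: agree with p where either mechanism does, else fall back to m1.
    if m1(i) == p(i) or m2(i) == p(i):
        return p(i)
    return m1(i)

print(as_precise_as(m3, m1), as_precise_as(m3, m2))  # True True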
Existence Theorem

For any program p and security policy c, there exists a precise, secure mechanism m* such that, for all secure mechanisms m associated with p and c, m* ≈ m
- m* = ∪ (i = 1, ..., ∞) mi, the union over all secure mechanisms mi
[Diagram: the states reachable under m* coincide with the set of secure states]
45
Lack of Effective Procedure

Theorem: There is no effective procedure that determines a maximally precise, secure mechanism for an arbitrary policy and program.
- The proof is analogous to that of an undecidability result
- However, it is possible to obtain a maximally precise, secure mechanism for specific cases
46
Key Points





- Policies describe what is (and is not) allowed
- Trust underlies everything
- DAC and MAC (and ORCON)
- Formal languages are needed to specify policies precisely
- Precise enforcement of policies is generally difficult
47
Appendix 1: Fake Windows Patch Is a Windows Killer
(Source: http://www.pcmag.com/article2/0,1895,1853366,00.asp)
From: update@microsoft.com
Subject: What You Need to Know About the Zotob.A Worm.
What You Should Know About Zotob
Published: August 14, 2005 | Updated: August 19, 2005 Severity VirusGreen
Supported Software Affected
Windows All Version
Microsoft Security Advisory 899588
Zotob.A
Zotob.B
Zotob.C
Zotob.D
Zotob.E
Bobax.O
Esbot.A
Rbot.MA
Rbot.MB
Rbot.MC
Zotob is a worm that targets All Windows computers and takes advantage of a security issue that was
addressed by Microsoft Security Bulletin MS05-039. This worm installs malicious software, and then searches
for other computers to infect.
If you have installed the update released with Security Bulletin MS05-039, you are protected from Zotob and
its variants. If you are using any supported version of Windows, you are not at risk.
The attachment is named MS05-039.EXE. It is 21,229 bytes and is compressed with the MEW
program. When the attachment is executed, it first downloads a second Trojan program,
Agent.AII, and executes it. This program downloads additional malware which logs keystrokes
and accesses multiple web sites. It also attempts to modify the settings of security programs on
the user's computer.
48
Appendix 2: True Story about a Back Door
Ken Thompson's 1983 Turing Award lecture to the ACM admitted the existence
of a back door in early Unix versions that may have qualified as the most
fiendishly clever security hack of all time. In this scheme, the C compiler
contained code that would recognize when the `login' command was being
recompiled and insert some code recognizing a password chosen by
Thompson. So the compiled Unix system has a backdoor whereas the source
code is clean.
More amazingly, Thompson also arranged that the compiler would recognize
when it was compiling a version of itself, and insert into the recompiled
compiler the hack codes required to get him the password, and also to
recognize itself and do the whole thing again the next time around!
Consequently, when someone suspected the compiler and attempted to
recompile the compiler from a clean source, he had to use the hacked compiler
to recompile the compiler – which would of course be a hacked version again!
The hack perpetuated itself invisibly, leaving the back door in place and active
but with no trace in the sources.
(See full story at http://www.acm.org/classics/sep95/)
49