
CSCE 548

Secure System Standards

Risk Management

Announcement

• Job Openings:

– Daniel Rusu, Dreamgol, LLC

– drusu9@gmail.com

– 803-727-5634

– www.dreamgol.com


Announcements

• Job openings:

– Peter J. Johnson, Staffing Consultant

– 978.927.7000 (m)

– premagni@verizon.net (e)

– www.linkedin.com/in/peterjjohnson (LinkedIn)

– Positions in Massachusetts, Maryland, Virginia, and Ohio for computer scientists and software engineers with credentials in security, networking, and privacy

– US citizenship required


Announcement

• Job Openings:

– Charleston: system development

– Skills: C, Python, and Java; Linux (specifically Fedora 14); cellular communications; electrical engineering or its basic principles; and MySQL

– Android and iPhone/iPad developers to build follow-on versions of a current geo-location/SMS application

– Contact info upon request


Project

• Requirements available at http://www.cse.sc.edu/~farkas/csce548-2012/csce548-project-requirements.htm

• Useful links:

– OWASP, Open Web Application Security Project, https://www.owasp.org/index.php/Main_Page

– Sample projects


Homework 1

Choose a team member from among your classmates. This selection is for this exercise only.

List the steps of the RMF for the "KillerAppCo's iWare 1.0 Server" example given in your textbook. (3 points)

Carry out a similar RMF analysis on the computing resources owned by your team member. For example, understanding the "business" context may include goals like graduating from USC, making a profit from writing software for a company, etc. Document your RMF activities and findings. (7 points)

BONUS (2 points): Have your partner evaluate your risk management report and comment on it.


Reading

• This lecture:

– McGraw: Chapter 2

– Recommended:

   • Rainbow Series Library, http://www.fas.org/irp/nsa/rainbow.htm

   • Common Criteria, http://www.commoncriteriaportal.org/

• Next lecture:

– Software Development Lifecycle – Dr. J. Vidal


Risk Assessment

[Diagram: Threats, Vulnerabilities, and Consequences together determine RISK.]

Financial Loss

[Chart: dollar amount losses by type. Total loss (2006): $53,494,290. Source: CSI/FBI Computer Crime and Security Survey, Computer Security Institute.]

Security Protection

[Charts: percentage of IT budget spent on security; percentage of organizations using ROI, NPV, or IRR metrics. Source: CSI/FBI Computer Crime and Security Survey, Computer Security Institute.]

Real Cost of Cyber Attack

• Damage to the target may not reflect the real amount of damage

• Other services may rely on the attacked service, causing cascading and escalating damage (see the toy illustration below)

• Need: support for decision makers to

– Evaluate the risk and consequences of cyber attacks

– Support methods to prevent, deter, and mitigate the consequences of attacks
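A toy Python illustration of cascading damage; the service names, dependencies, and hourly costs are invented for the example, not data from the survey. The loss from attacking one service includes every service that transitively depends on it:

    # Outage propagation: a failed service takes down everything that
    # (transitively) depends on it; all names and figures are hypothetical.
    depends_on = {
        "web_store": ["auth", "inventory"],
        "inventory": ["database"],
        "auth":      ["database"],
        "reporting": ["database"],
    }
    hourly_loss = {"web_store": 5000, "inventory": 800, "auth": 300,
                   "reporting": 200, "database": 1000}

    def impacted(failed):
        """Return every service unusable once `failed` goes down."""
        hit, changed = {failed}, True
        while changed:
            changed = False
            for svc, deps in depends_on.items():
                if svc not in hit and any(d in hit for d in deps):
                    hit.add(svc)
                    changed = True
        return hit

    down = impacted("database")
    print(sorted(down), sum(hourly_loss[s] for s in down))
    # Direct loss is $1,000/hour; the cascading total is $7,300/hour.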


System Security Engineering (Traditional View)

[Flowchart: Specify system architecture → Identify threats, vulnerabilities, and attacks → Estimate risk → Prioritize vulnerabilities → Identify and install safeguards, repeating until the risk is acceptably low.]

Risk Management Framework (Business Context)

[Diagram: 1. Understand the business context → 2. Identify the business and technical risks → 3. Synthesize and rank the risks → 4. Define the risk mitigation strategy → 5. Carry out fixes and validate, with measurement and reporting spanning all stages.]

Understand the Business Context

• “Who cares?”

• Identify business goals, priorities, and circumstances, e.g.,

– Increasing revenue

– Meeting service-level agreements

– Reducing development cost

– Generating a high return on investment

• Identify which software risks to consider


Identify Business and Technical Risks

• “Why should the business care?”

• Business risks

– Direct threats

– Indirect threats

• Consequences

– Financial loss

– Loss of reputation

– Violation of customer or regulatory constraints

– Liability

• Tie the technical risks to the business context in a meaningful way


Synthesize and Rank the Risks

• “What should be done first?”

• Prioritization of the identified risks based on business goals

• Allocation of resources

• Risk metrics (a small ranking sketch follows), e.g.:

– Risk likelihood

– Risk impact

– Risk severity

– Number of emerging risks
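A minimal Python sketch of the ranking step; the risk entries and the severity = likelihood * impact scoring are illustrative assumptions, not a method prescribed by the text:

    # Hypothetical risk register: likelihood in [0, 1], impact on a 1-10 scale.
    risks = [
        {"name": "SQL injection in login form", "likelihood": 0.7, "impact": 9},
        {"name": "Unpatched web server",        "likelihood": 0.5, "impact": 7},
        {"name": "Weak password policy",        "likelihood": 0.9, "impact": 4},
    ]

    # Score each risk, then rank highest-severity first so mitigation
    # resources go where they matter most.
    for r in risks:
        r["severity"] = r["likelihood"] * r["impact"]
    for r in sorted(risks, key=lambda r: r["severity"], reverse=True):
        print(f"{r['name']}: severity {r['severity']:.1f}")

In practice the likelihood and impact values would come from the earlier stages, tied to the business goals identified above.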


Define the Risk Mitigation Strategy

• “How do we mitigate the risks?”

• Depends on the available technology and resources

• Constrained by the business context: what the organization can afford, integrate, and understand

• Needs validation techniques


Carry Out Fixes and Validate

• Perform the actions defined in the previous stage

• Measure “completeness” against the risk mitigation strategy

– Progress against risks

– Remaining risks

– Assurance of mechanisms

• Testing


Measuring and Reporting

• Continuous and consistent identification and storage of risk information over time

• Maintain risk information at all stages of risk management

• Establish measurements, e.g.,

– Number of risks, severity of risks, cost of mitigation, etc.


Assets-Threat Model (1)

• Threats compromise assets

• Threats have a probability of occurrence and a severity of effect

• Assets have values

• Assets are vulnerable to threats

[Diagram: Threats acting on Assets.]


Assets-Threat Model (2)

Risk: the expected loss from a threat against an asset

R = V * P * S

• R: risk

• V: value of the asset

• P: probability of occurrence of the threat

• S: severity of the asset's vulnerability to the threat
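A worked example with invented numbers, just to show the arithmetic: for an asset valued at $100,000, a threat with a 10% probability of occurrence, and a vulnerability severity of 0.25,

R = V * P * S = $100,000 * 0.10 * 0.25 = $2,500

i.e., an expected loss of $2,500 from that threat against that asset.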


System-Failure Model

• Estimate the probability of highly undesirable events

• Risk: the likelihood of an undesirable outcome

[Diagram: Threat → System → Undesirable outcome]
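Read as a formula (an illustrative formalization, not one given in the slides), the risk is the chance that a threat occurs and the system then fails to prevent the undesirable outcome:

R = P(threat) * P(undesirable outcome | threat)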


Risk Acceptance

Certification

• How well the system meets the security requirements (technical)

Accreditation

• Management's approval of an automated system (administrative)


NEXT SLIDES ARE RECOMMENDED ONLY


Incident Handling

Computer Security Incident Handling Guide, Recommendations of the National Institute of Standards and Technology, http://csrc.nist.gov/publications/nistpubs/800-61-rev1/SP800-61rev1.pdf

How to Respond?

• Take actions to avoid further loss from the intrusion

– Terminate the intrusion and protect against reoccurrence

– Law enforcement: prosecute

– Enhance defensive security

• Reconstructive methods based on:

– The time period of the intrusion

– Changes made by legitimate users during the affected period

– Regular backups, audit-trail-based detection of affected components, semantic-based recovery, minimal rollback for recovery


Roles and Responsibilities

• User:

– Be vigilant for unusual behavior

– Report incidents

• Manager:

– Awareness training

– Policies and procedures

• System administrator:

– Install safeguards

– Monitor the system

– Respond to incidents, including preservation of evidence


Computer Incident Response Team

• Assists in handling security incidents

– Formal

– Informal

• Incident reporting and dissemination of incident information

• Computer Security Officer

– Coordinates computer security efforts

• Others: law enforcement coordinator, investigative support, media relations, etc.


Incident Response Process 1: Preparation

– Baseline protection

– Planning and guidance

– Roles and responsibilities; training

– Incident response team


Incident Response Process 2: Identification and Assessment

– Symptoms

– Nature of the incident

– Identify the perpetrator, and the origin and extent of the attack

   • Can be done during or after the attack

– Gather evidence

   • Keystroke monitoring, honeynets, system logs, network traffic, etc.

   • Mind the legislation on monitoring!

– Report preliminary findings


Incident Response Process 3: Containment

– Reduce the chance that the incident spreads

– Determine sensitive data

– Terminate suspicious connections, personnel, applications, etc.

– Move critical computing services

– Handle human aspects, e.g., perception management, panic, etc.


Incident Response Process 4: Eradication

– Determine and remove the cause of the incident, if economically feasible

– Improve defenses: software, hardware, middleware, physical security, etc.

– Increase awareness and training

– Perform vulnerability analysis


Incident Response Process 5: Recovery

– Determine the course of action

– Reestablish system functionality

– Reporting and notifications

– Documentation of incident handling and evidence preservation


Follow-Up Procedures

• Incident evaluation:

– Quality of the incident handling (preparation, time to respond, tools used, evaluation of the response, etc.)

– Cost of the incident (monetary cost, disruption, lost data, hardware damage, etc.)

• Prepare a report

• Revise policies and procedures


Security Awareness and Training

• Major weakness: user unawareness

• Organizational effort

• Educational effort

• Customer training

• Federal Trade Commission: program to educate customers about web scams


Building It Secure

• 1960s: US Department of Defense (DoD) recognizes the risk posed by unsecured information systems

• 1970s:

– 1977: DoD Computer Security Initiative

– US Government and private concerns

– National Bureau of Standards (NBS, now NIST)

   • Responsible for standards for the acquisition and use of federal computing systems

   • Federal Information Processing Standards (FIPS PUBs)


NBS

• Two initiatives for security:

– Cryptography standards

   • 1973: invitation for technical proposals for ciphers

   • 1977: Data Encryption Standard

   • 2001: Advanced Encryption Standard (NIST)

– Development and evaluation processes for secure systems

   • Conferences and workshops

   • Involved researchers, constructors, vendors, software developers, and users

   • 1979: Mitre Corporation entrusted to produce an initial set of criteria for evaluating the security of systems handling classified data


National Computer Security Center

• 1981: the National Computer Security Center (NCSC) was established within the NSA

– To provide technical support and reference for government agencies

– To define a set of criteria for the evaluation and assessment of security

– To encourage and perform research in the field of security

– To develop verification and testing tools

– To increase security awareness in both the federal and private sectors

• 1985: Trusted Computer System Evaluation Criteria (TCSEC), a.k.a. the Orange Book


Orange Book

• Orange Book objectives:

– Guide what security features to build into new products

– Provide a measurement to evaluate the security of systems

– Serve as a basis for specifying security requirements

• Security features and assurances

• Trusted Computing Base (TCB): the security components of the system (hardware, software, and firmware) plus the reference monitor


What the Orange Book Supplies

• Users: evaluation metrics to assess the reliability of the security system for protecting classified or sensitive information, whether it is a

– Commercial product

– Internally developed system

• Developers/vendors: a design guide showing the security features to be included in commercial systems

• Designers: a guide for the specification of security requirements


Orange Book

• A set of criteria and requirements

• Three main categories:

– Security policy: the protection level offered by the system

– Accountability: of the users and user operations

– Assurance: of the reliability of the system


Security Policy

• Concerns the definition of the policy regulating the access of users to information

– Discretionary Access Control

– Mandatory Access Control

– Labels: for objects and subjects

– Reuse of objects: basic storage elements must be cleaned before being released to a new user (see the sketch below)
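A minimal Python sketch of the object-reuse requirement; the allocator and its API are hypothetical illustrations of the idea, not anything specified by TCSEC:

    # Hypothetical block allocator that scrubs storage on release, so a
    # newly allocated block can never leak the previous owner's data.
    class StorageAllocator:
        def __init__(self, num_blocks: int, block_size: int):
            self.blocks = [bytearray(block_size) for _ in range(num_blocks)]
            self.free_list = list(range(num_blocks))

        def allocate(self) -> int:
            return self.free_list.pop()  # block was zeroed on release

        def release(self, idx: int) -> None:
            self.blocks[idx][:] = bytes(len(self.blocks[idx]))  # clean first
            self.free_list.append(idx)

    alloc = StorageAllocator(num_blocks=4, block_size=16)
    i = alloc.allocate()
    alloc.blocks[i][:6] = b"secret"
    alloc.release(i)
    assert alloc.blocks[i] == bytearray(16)  # no residue for the next user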


Accountability

• Identification/authentication

• Audit

• Trusted path: ensures no user is accessing the system fraudulently


Assurance

• Reliable hardware/software/firmware components that can be evaluated separately

• Operation reliability

• Development reliability


Operation Reliability

Reliability during system operation:

– System architecture: the TCB is isolated from user processes; the security kernel is isolated from the non-security-critical portions of the TCB

– System integrity: correct operation (use diagnostic software)

– Covert channel analysis

– Trusted facility management: separation of duties

– Trusted recovery: recover security features after TCB failures


Development Reliability

• The system must be reliable during the development process; formal methods are used

– System testing: security features tested and verified

– Design specification and verification: correct design and implementation with respect to the security policy; the TCB's formal specifications are proved

– Configuration management: configuration of the system components and its documentation

– Trusted distribution: no unauthorized modifications


Documentation

• A defined set of documents

• Minimal set:

– Trusted facility manual

– Security features user's guide

– Test documentation

– Design documentation

• Personnel info: operators, users, developers, maintainers


Orange Book Levels

Highest Security

– A1: Verified Protection

– B3: Security Domains

– B2: Structured Protection

– B1: Labeled Security Protection

– C2: Controlled Access Protection

– C1: Discretionary Security Protection

– D: Minimal Protection

No Security


NCSC Rainbow Series

• Orange: Trusted Computer System Evaluation Criteria

• Yellow: Guidance for Applying the Orange Book

• Red: Trusted Network Interpretation

• Lavender: Trusted Database Interpretation


Evaluation Process

• Preliminary technical review (PTR)

– Preliminary technical report: architecture, potential for the target rating

• Vendor assistance phase (VAP)

– Review of the documentation needed for the evaluation process, e.g., the security features user's guide, trusted facility manual, design documentation, and test plan. For B or higher ratings, additional documentation is needed, e.g., covert channel analysis, a formal model, etc.

• Design analysis phase (DAP)

– Initial product assessment report (IPAR): 100-200 pages of detailed information about the hardware and software architecture, security-relevant features, team assessments, etc.

– Reviewed by the Technical Review Board

– Recommendation to the NCSC


Evaluation Process (cont.)

• Formal evaluation phase (FEP)

– Product Bulletin: formal and public announcement

– Final Evaluation Report: information from the IPAR and testing results, additional tests, code review (B2 and up), formal policy model, proofs

– Recommends a rating for the system

– The NCSC decides the final rating

• Rating maintenance phase (RAMP)

– Minor changes and revisions

– Reevaluation

– Rating maintenance plan


European Criteria

• German Information Security Agency: German Green Book (1988)

• British Department of Trade and Industry and Ministry of Defence: several volumes of criteria

• Canada, Australia, France: work on evaluation criteria

• 1991: Information Technology Security Evaluation Criteria (ITSEC)

– For the European Community

– Decoupled features from assurance

– Introduced new functionality requirement classes

– Accommodated commercial security requirements


Common Criteria

• January 1996: Common Criteria

– Joint work with Canada and Europe

– Separates functionality from assurance

– Nine classes of functionality: audit, communications, user data protection, identification and authentication, privacy, protection of trusted functions, resource utilization, establishing user sessions, and trusted path

– Seven classes of assurance: configuration management, delivery and operation, development, guidance documents, life-cycle support, tests, and vulnerability assessment


Common Criteria

• Evaluation Assurance Levels (EAL)

– EAL1: functionally tested

– EAL2: structurally tested

– EAL3: methodically tested and checked

– EAL4: methodically designed, tested, and reviewed

– EAL5: semiformally designed and tested

– EAL6: semiformally verified design and tested

– EAL7: formally verified design and tested


National Information Assurance Partnership (NIAP)

• 1997: National Institute of Standards and Technology (NIST), National Security Agency (NSA), and industry

• Aims to improve the efficiency of evaluation

• Transfers methodologies and techniques to private-sector laboratories

• Functions: developing tests, test methods, and tools for evaluating and improving security products; developing protection profiles and associated tests; establishing a formal and international scheme for the CC


Next Class

• Software Development Lifecycle
