Developing Practical and Meaningful Cyber
Security Metrics for the Enterprise
Matthew Sweeney - SRC, Inc.
Benjamin Pokines – SRC, Inc.
© 2013 SRC
Goal of the Talk
State of enterprise cyber security metrics
• Current practices
• Gaps
Proposed metrics
• Description
• Formulation
• Experiments/results
Analysis and future directions of metrics
• “Ideal” metrics
• Applications
• Future work
Problem Description
Cyber security metrics often focus on compliance and
exposure to risk
• Number of attack vectors
• Duration of exposure
• etc.
Gaps exist in measuring security operations intelligence
• And in turn, business risk & cost
• Improved metrics and metrics reporting would improve cyber
security intelligence
Major issue is outcome, not just compliance
• E.g., down machines, increased threat exposure
Problem Description (cont.)
Per the Verizon 2013 Data Breach Investigations Report (DBIR), cyber
security metrics need to improve in order to detect attacks quickly and
drive business action
Current Cyber Security Practice
[Diagram: host threats, network threats, vulnerabilities, and audit logs are observed by sensors; sensor data is collected into a SIEM for log aggregation & reporting; defenders then react]
Defining Metrics is Difficult
Granularity of metrics matters
Example: A basic metric indicates that 2% of hosts on a network have a
'high' number of vulnerabilities
• This may hold true across multiple days/weeks/months
• However, there is no guarantee that this applies to the same 2% of
hosts over time (as the sketch below illustrates)
The surface area that needs to be protected is small, but it is changing!
[Figure: the network on Day 1, Day 2, and Day 3; key: entire network, hosts w/'normal' vulnerabilities, hosts w/'high' vulnerabilities]
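To make this concrete, here is a minimal, hypothetical sketch (all host IDs and counts are invented) showing how a flat 2% rate can mask churn in which hosts are actually affected:

```python
# Hypothetical data: the aggregate "2% of hosts are 'high'" rate is stable,
# but the underlying set of high-vulnerability hosts changes day to day.
high_vuln_hosts = {
    "Day 1": {"h04", "h17"},
    "Day 2": {"h04", "h29"},
    "Day 3": {"h11", "h33"},
}
TOTAL_HOSTS = 100  # assumed network size

days = list(high_vuln_hosts)
for day in days:
    # Prints 2% for every day, even though the hosts differ.
    print(f"{day}: {len(high_vuln_hosts[day]) / TOTAL_HOSTS:.0%} of hosts are 'high'")

# Overlap between consecutive days shows it is not the same 2% of hosts.
for prev, curr in zip(days, days[1:]):
    common = high_vuln_hosts[prev] & high_vuln_hosts[curr]
    print(f"{prev} -> {curr}: {len(common)} host(s) in common")
```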
SRC Cyber Security Metrics
Goals and Objectives
Approach
Aspects of comprehensive cyber security metrics
Component breakdown
Results & Findings
• Characterization
• Trends
Limitations
Recommendations
Metrics Goals and Objectives
High-level metrics that normalize across sensors to paint
a picture of the overall health of information systems
• Near real time
Think credit score
• Amount of debt : Number of threats detected
• Active lines of credit : Number of assets on the network
• Credit score : Hackability Index score
Facilitate discovery and cyber intelligence as opposed to
strictly monitoring
• Idea is for metrics to develop information and show how things are
changing over time
Designed as an in-house experiment with the commercial
market in mind
• Specifically, management and executive level personnel
Hackability Index Goals
Goal: Real-time adaptation
• Description: Sense and communicate changes and risks before trouble occurs
• Why? Move from defensive to proactive operation

Goal: Create enterprise risk understanding
• Description: Concrete, semantically rich risk assessment representation combining metrics and context
• Why? Systems and humans defending networks need concrete information to act

Goal: Detect, characterize, and attribute behavior
• Description: Analyze data to find misuse, anomalies, and patterns that fit threat actor plans
• Why? Getting to the bottom of threats requires analyzing them in the correct context

Goal: Scale to the entire enterprise over time
• Description: Architect data storage and analysis for 1 million+ machines over 6 years of data
• Why? Finding malicious patterns requires much more than last month's data
Approach
Interviews with teams at the DHS ICE SOC and SRC IT
• Initial effort – develop ‘crib sheet’ of relevant events given triggers
from metrics
Met with internal IT team on multiple occasions
• Glean what information is important to them and what is already
available
Met with SRC executive team
• Determine what, at that level, is useful and actionable
Developed a list of over 60 metrics
Reduced these to ~20-25 metrics grouped into 6 categories
What is Hackability?
Baseline and delta of observed cyber security metrics
• e.g., attack surface; threats, reinfections, persistent infections; outbound traffic trends
Behavior models and patterns
• Internal Vulnerabilities, Infections, and Persistent Threats
• External Communications and 3rd Party Associations
• Internal and External Exposures and Likelihood to Enable Threats (Employee Behavior)
Open-source information
• New vulnerabilities with available exploits
• External Threat Environment and Attack Intel
Hackability Index | Operational Concept
[Diagram: operational concept relating the Hackability Index to business operations]
Aspects of Comprehensive Metrics
Continuously updated
Contextually rich
• Internal cyber defense activity
• External attack activity and user behavior profile
• Connection to business operations and outcome
Deal in atoms of data over time
Can be consistently applied to different time periods of data
(one hour vs. one day vs. one month)
Bonus – metrics that can be fully automated from security
sensor logs
• Log types that are supported under this work (see the normalization sketch below):
− Netscreen (firewall)
− TippingPoint (IPS)
− Secunia (assets/vulnerabilities)
− Symantec (scheduled & manual A/V)
− Cisco IronPort (web proxy)
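As a rough illustration of automating metrics from sensor logs, the sketch below maps heterogeneous sensor records onto one common detection shape so that the same metric code can be applied consistently to any time window (hour, day, or month). The raw field names, record layout, and per-sensor severity scales are assumptions for illustration, not the real formats of the products listed above:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Detection:
    """Common shape every sensor's log records are normalized into."""
    timestamp: datetime   # when the sensor saw the event
    sensor: str           # which sensor family produced it
    asset: str            # host the detection applies to
    severity: int         # severity on the sensor's own scale
    max_severity: int     # top of that sensor's scale, kept for normalization

def from_ips(raw: dict) -> Detection:
    # Hypothetical IPS record layout: {"ts": "...", "dst": "...", "sev": 4}.
    # A real deployment needs one such parser per supported log type.
    return Detection(
        timestamp=datetime.fromisoformat(raw["ts"]),
        sensor="ips",
        asset=raw["dst"],
        severity=int(raw["sev"]),
        max_severity=5,  # assumed scale for this sensor
    )

def in_window(events, start: datetime, end: datetime):
    """Restrict to one window so metrics apply consistently to any period."""
    return [e for e in events if start <= e.timestamp < end]

# Example usage with an invented record:
evt = from_ips({"ts": "2013-06-03T09:15:00", "dst": "h17", "sev": 4})
print(in_window([evt], datetime(2013, 6, 3), datetime(2013, 6, 4)))
```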
Support of Critical Controls
SANS Top 20
• Critical Control 5: Malware Defenses
• Critical Control 14: Maintenance, Monitoring, and Analysis of Security Audit Logs
NIST 800-53
• Security Assessment and Authorization (Management class): CA-7 CONTINUOUS MONITORING
• Incident Response (Operational class): IR-5 INCIDENT MONITORING; IR-6 INCIDENT REPORTING; IR-7 INCIDENT RESPONSE ASSISTANCE
• Risk Assessment (Management class): RA-3 RISK ASSESSMENT
• System and Information Integrity (Operational class): SI-4 INFORMATION SYSTEM MONITORING; SI-5 SECURITY ALERTS, ADVISORIES, AND DIRECTIVES
Hackability Index
The Hackability Index is currently patent pending. It is composed of six sub-indices:

Defense Effectiveness Index
• Proportion of machines on the network that have been compromised to some degree
Technical Debt Index
• Proportion of total detection activity relative to number of assets (as compared with recent past)
Opportunity Risk Index
• Severity of current detections compared to max possible severity that could be seen
New Detections Index
• Number of current threats that are 'new' (haven't been spotted on network in past N days)
Surface Area Index
• Number of network assets that currently have threat detections
Length of Score History Index
• Proportion of time that each sensor has been online and reporting
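The slides name the six components but do not spell out how they roll up into the overall score; a plausible minimal sketch is a weighted average of component points, mirroring the weighted combination shown later for the Opportunity Risk Index. The combination rule and the equal weights below are assumptions for illustration:

```python
# Assumed placeholder weights; the deck does not publish the real ones.
SUB_INDEX_WEIGHTS = {
    "defense_effectiveness": 1.0,
    "technical_debt": 1.0,
    "opportunity_risk": 1.0,
    "new_detections": 1.0,
    "surface_area": 1.0,
    "length_of_history": 1.0,
}

def hackability_index(component_pts: dict) -> float:
    """Weighted average of component scores, each assumed to be 0-10 points."""
    total_weight = sum(SUB_INDEX_WEIGHTS[name] for name in component_pts)
    weighted_sum = sum(SUB_INDEX_WEIGHTS[name] * pts
                       for name, pts in component_pts.items())
    return weighted_sum / total_weight

# Example with invented component scores:
print(hackability_index({
    "defense_effectiveness": 7.5, "technical_debt": 6.0,
    "opportunity_risk": 3.0, "new_detections": 8.0,
    "surface_area": 9.0, "length_of_history": 5.0,
}))  # ~6.4
```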
Hackability Index | Scoring Components

Defense Effectiveness
• Description: Level of reinfections on your network, weighted by severity
• Meaning: A low score could suggest that your tools for mitigating infections need improvement.

Length of History
• Description: History of data available for index computations for each sensor
• Meaning: A low score indicates the immaturity of sensor use and ability to log its history.

New Threats
• Description: Threats detected by your network sensors not seen (recently)
• Meaning: A low score could suggest new or upcoming infection trends (a sketch of this check follows the table).

Opportunity Risk
• Description: Overall severity of threats on your network
• Meaning: A low score indicates that there are many high-severity threats, and these should be addressed.

Surface Area
• Description: Percentage of assets producing detections by your sensors
• Meaning: A low score indicates there are many hosts with activity for your sensors.

Technical Debt
• Description: Overall threat activity to key assets on your network
• Meaning: A low score indicates there has been an increase in threat activity relative to historical values.
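A minimal sketch of the New Threats check referenced above: the fraction of today's detections whose signatures have not been seen on the network in the past N days. The history structure and signature names are invented for illustration:

```python
from datetime import date, timedelta

def new_detection_fraction(history: dict, today: date, n_days: int) -> float:
    """history maps a date to the set of threat signatures seen that day."""
    seen_before = set()
    for d in range(1, n_days + 1):
        seen_before |= history.get(today - timedelta(days=d), set())
    current = history.get(today, set())
    if not current:
        return 0.0
    # Signatures present today that the window has never seen.
    return len(current - seen_before) / len(current)

history = {
    date(2013, 6, 1): {"TrojanA", "WormB"},
    date(2013, 6, 2): {"TrojanA"},
    date(2013, 6, 3): {"TrojanA", "ExploitC"},  # ExploitC is new
}
print(new_detection_fraction(history, date(2013, 6, 3), n_days=7))  # 0.5
```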
Opportunity Risk Index
Opportunity risk is designed to show how the severity of
current threat detections compares to the maximum
possible severity that could be seen

Calculation for each sensor i:

OppRisk_i = \frac{totalSeverity_i}{maxSeverity_i \cdot numDetections_i} \cdot weight_i

And finally, combining all of the sensors together:

OppRiskIndexPts = OppRiskWeight - \frac{\sum_i OppRisk_i}{\sum_i weight_i} \cdot OppRiskWeight
Opportunity Risk Index Example
Firewall (weight = 1.0)
• 24 threat detections: Severity = 1 (9), Severity = 3 (1), Severity = 4 (14)
• Max severity is 5
• OppRisk = \frac{68}{5 \cdot 24} \cdot 1.0 = 0.57

A/V (weight = 2.0)
• 11 threat detections: Severity = 1 (1), Severity = 3 (10)
• Max severity is 3
• OppRisk = \frac{31}{3 \cdot 11} \cdot 2.0 = 0.94 \cdot 2.0 = 1.88

Firewall (weight = 1.0)
• 19 threat detections: Severity = 1 (8), Severity = 2 (10), Severity = 5 (1)
• Max severity is 5
• OppRisk = \frac{33}{5 \cdot 19} \cdot 1.0 = 0.35

Opportunity Risk Index Score (weight = 10)

OppRiskIndexPts = 10 - \frac{0.57 + 1.88 + 0.35}{1.0 + 2.0 + 1.0} \cdot 10 = 10 - 0.70 \cdot 10 = 3.0
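The worked example translates directly to code. This sketch implements the two formulas from the previous slide and reproduces the example's final score of 3.0 (function and variable names are ours; the formulas, detection counts, and weights come from the slides):

```python
def opp_risk(total_severity: int, max_severity: int,
             num_detections: int, weight: float) -> float:
    """Per-sensor score: observed severity over the worst case, times weight."""
    return total_severity / (max_severity * num_detections) * weight

def opp_risk_index_pts(scores, weights, index_weight: float = 10.0) -> float:
    """Subtract the weighted-average sensor risk from a perfect score."""
    return index_weight - sum(scores) / sum(weights) * index_weight

weights = [1.0, 2.0, 1.0]
scores = [
    opp_risk(9*1 + 1*3 + 14*4, 5, 24, 1.0),  # firewall: ~0.57
    opp_risk(1*1 + 10*3, 3, 11, 2.0),        # A/V: ~1.88
    opp_risk(8*1 + 10*2 + 1*5, 5, 19, 1.0),  # firewall: ~0.35
]
print(round(opp_risk_index_pts(scores, weights), 1))  # 3.0
```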
Internal Observations and Conclusions
Spikes in Hackability Index score on weekends, as expected
• As network activity drops one would expect threats reported to follow suit
Executive-level summary helps to explore gaps in defense performance
Although the data is structured, there are often nuances between sensors
that require effort to resolve
Granularity matters – there is a story in user behavior over time
The data models expected patterns of threat and vulnerability scans
Deploying Hackability
Applications
• Industry could use it as a standard form of measurement
• Executives could use it as a tool to determine network health vs. IT expenditures
• Applications in the insurance industry, mergers & acquisitions, takeovers, etc.
Limitations
• Near real time, not instantaneous
• Works only with what sensors report
• Vulnerability sensor data not yet incorporated
• Unknown unknowns
Future Directions
Real-time behavior analysis
Incorporating risk from external indicators
• For example, malware in the wild
Patch and vulnerability sensor data
Predictive analytics
• Using known attack vectors as models to predict when an attack is
imminent or in-process
• Using external indicators combined with observed events to predict
a coming attack
Questions?
Matthew Sweeney
msweeney at srcinc dot com
Benjamin Pokines
bpokines at srcinc dot com