05 Vulnerability assessment

DIGIT TestCentre
Vulnerability assessment service
Gabriel BABIANO
DIGIT.A.3
29/11/2012
Agenda
• Service presentation
• Lessons learned
DIGIT TestCentre
Organizational location: DIGIT.A.3
Physical location: DRB D3 (LUX)
Service manager: Gabriel BABIANO

Performance testing service since 2002 (currently 6 testers)
Vulnerability assessment service since 2011 (currently 3 testers)
DIGIT TestCentre – clients and figures
• Clients
  • European Commission (including Executive Agencies and Services)
  • Other European institutions under agreement (e.g. Court of Justice of the European Union)
• Breakdown per DG
• Around 50 VTs per year
Grounds for vulnerability assessment
Motivation:
• Legal constraints
• Reputation
• Data theft
• Continuity of the service

75% of cyber-attacks are directed at the web application layer (Gartner).
Network security alone does not protect web applications!
Tests in Information Systems life-cycle
Cost versus life-cycle stage
(chart: fix cost plotted against life-cycle stage; annotations: VT, secure coding guidelines)
"Finding and fixing a software problem after delivery is often 100 times more expensive than finding and fixing it during the design and requirements phase" (Barry Boehm)
DIGIT TC Vulnerability service deliverables
• Vulnerability assessment reports (per test/iteration)
  • Filtered potential vulnerabilities (no false positives…)
  • Classification by criticality and prioritization
  • Potential remediation
  • Evolution from previous iterations
• Secure coding guidelines
  • Best practices in secure coding
  • Recommended languages (HTML, Java, ColdFusion)
  • Aligned to threat evolution
  • Both for developers and operational managers
  • 1st draft release due 01/2013
DIGIT VT service tests
• Black Box Vulnerability Test (dynamic analysis)
  • Needs a working application target (closest to PROD)
  • No access to source code required
  • Not specific to coding language(s)
  • Automatic tools + manual testing to supplement the tools
  • Complement to penetration testing and WBVT
• White Box Vulnerability Test (static analysis)
  • Access to buildable source code
  • Automatic tools + manual revision to avoid false positives
  • All recommended languages are supported (Java, CF…)
  • No absolute need for an application target, but it helps a lot
  • Detects more vulnerabilities than black box
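To illustrate the static-analysis side, here is a minimal SAST-style rule in miniature. It is an invented sketch, not one of the TestCentre's actual tools: a single regex that flags SQL statements built by string concatenation, the coding pattern behind most injection findings a white-box scan reports.

```python
import re

# Toy SAST rule (illustrative only): flag lines that concatenate
# variables into SQL string literals, a common injection pattern.
SQL_CONCAT = re.compile(r"""["'].*(SELECT|INSERT|UPDATE|DELETE).*["']\s*\+""", re.IGNORECASE)

def scan_source(source: str) -> list[tuple[int, str]]:
    """Return (line_number, line) pairs that match the rule."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        if SQL_CONCAT.search(line):
            findings.append((lineno, line.strip()))
    return findings

# Hypothetical input: one vulnerable line, one parameterized (safe) line.
sample = '''
query = "SELECT * FROM users WHERE name = '" + user_input + "'"
safe = cursor.execute("SELECT * FROM users WHERE name = ?", (user_input,))
'''

print(scan_source(sample))  # only the concatenated query is flagged
```

Real SAST tools do taint tracking across calls rather than line-level pattern matching, which is why the slide stresses manual revision of their output to weed out false positives.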
DIGIT TestCentre service procedure workflow
Several iterations are normally required.
DIGIT TC Vulnerability service tools
• Static code analysis (SAST)
  • Automatic tools
  • Manual code review: Eclipse
• Dynamic program analysis (DAST)
  • Automatic tools
  • Manual tools: Firefox and plugins (Tamper Data), database tools
Tools evaluation - methodology
Tools evaluation – criteria
Tools evaluation – critical metrics
• Correctness of the results
  • Accuracy
  • Minimal false positives
  • Minimal inconclusive results
  • Minimal duplicates
• Completeness of the results
  • % detected
  • % missed (false negatives)
  • Misnamed findings
• Performance
  • Scan duration
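The correctness and completeness metrics above reduce to set arithmetic once a tool's reported findings are compared against a reference set of known vulnerabilities. The finding names and reference set below are invented for illustration; they are not the TestCentre's actual benchmark.

```python
def evaluate_tool(reported: set[str], known: set[str]) -> dict[str, float]:
    """Score one scanner run against a reference set of known vulnerabilities."""
    true_positives = reported & known
    false_positives = reported - known    # correctness: should be minimal
    false_negatives = known - reported    # completeness: issues the tool missed
    return {
        "detected_pct": 100 * len(true_positives) / len(known),
        "missed_pct": 100 * len(false_negatives) / len(known),
        "false_positive_pct": 100 * len(false_positives) / len(reported),
    }

# Hypothetical run: the tool finds 2 of 4 known issues plus 1 false alarm.
known = {"xss-login", "sqli-search", "csrf-profile", "open-redirect"}
reported = {"xss-login", "sqli-search", "weak-cipher"}

print(evaluate_tool(reported, known))
```

Scan duration is then measured separately over the same benchmark applications, so tools are compared on all three axes at once.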
Tools lists
• Static code analysis (SAST)
  • http://en.wikipedia.org/wiki/List_of_tools_for_static_code_analysis
  • https://www.owasp.org/index.php/Source_Code_Analysis_Tools
• Dynamic program analysis (DAST)
  • http://en.wikipedia.org/wiki/Dynamic_program_analysis
  • Open source DAST tools:
    • WebScarab
    • Nikto / Wikto
    • Open Web Application Security Project (OWASP)
    • Google ratproxy and skipfish
    • W3af
    • Websecurify
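What these DAST tools automate at scale can be shown in miniature. The handler below is a stand-in for a deployed page (it is invented, not taken from any DIGIT application); the probe submits a script payload and checks whether the response reflects it unescaped, the basic test behind a reflected-XSS finding.

```python
import html

def vulnerable_handler(query: str) -> str:
    # Stand-in for a deployed page: echoes user input without encoding.
    return f"<p>Results for {query}</p>"

def hardened_handler(query: str) -> str:
    # Same page with output encoding applied.
    return f"<p>Results for {html.escape(query)}</p>"

def probe_reflected_xss(handler) -> bool:
    """Return True if the handler reflects the payload unescaped."""
    payload = "<script>alert(1)</script>"
    return payload in handler(payload)

print(probe_reflected_xss(vulnerable_handler))  # True: payload reflected as-is
print(probe_reflected_xss(hardened_handler))    # False: payload neutralized
```

A real scanner sends hundreds of such payloads over HTTP against every input of a running target, which is why black-box testing needs a working application close to PROD.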
Costs per test
In-house service:
• Assumption: a complete VT (WB & BB) takes 10 working days on average (15 tests per tester per year)
• Strong investment in licenses the first year
• Costs are similar after the 4th year
• Requires a security-skilled tester with an "industrialized" procedure

Outsourced service:
• No investment required
• Less flexible for the development?
• Quality?
• Iterations?
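The in-house assumptions can be put into a back-of-the-envelope model. Only the 15-tests-per-tester-per-year figure and the 3-tester team size come from the slides; every monetary figure below is a placeholder, since the deck gives none.

```python
def in_house_cost_per_test(year: int,
                           license_cost_year1: float = 100_000.0,   # hypothetical
                           license_renewal: float = 20_000.0,       # hypothetical
                           tester_cost: float = 80_000.0,           # hypothetical
                           testers: int = 3,
                           tests_per_tester: int = 15) -> float:
    """Cost per test in a given year under the (invented) figures above."""
    licenses = license_cost_year1 if year == 1 else license_renewal
    total = licenses + testers * tester_cost
    return total / (testers * tests_per_tester)

for year in (1, 2, 4):
    print(f"year {year}: {in_house_cost_per_test(year):,.0f} per test")
```

Whatever the real figures, the shape matches the slide: the year-1 license investment dominates, then per-test cost flattens, and 3 testers at 15 tests each also matches the "around 50 VTs per year" figure quoted earlier.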
Engineering for attacks
Vulnerability risk areas
Security controls
Security functions
OWASP Top Ten (2010 Edition)
http://www.owasp.org/index.php/Top_10
2011 CWE Top 25 Most Dangerous Software Errors
http://cwe.mitre.org/top25/
Comparison OWASP Top Ten 2010 –
CWE Top 25 2011
http://cwe.mitre.org/top25/
DIGIT TestCentre
Priorities are adapted for every application.
Score = Risk * Impact
Vulnerability assessment
• Assess and secure all parts individually
• The idea is to force an attacker to penetrate several defence layers
• As a general rule, data stored in databases is considered "untrusted"

"In God we trust, for the rest, we test"
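The "untrusted database data" rule amounts to layered defence in a few lines. This is an illustrative sketch (not DIGIT code): a payload is stored with a parameterized query so it cannot break the SQL statement, and when it is read back it is still treated as untrusted and encoded before rendering, so each layer an attacker gets past is followed by another.

```python
import html
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE comments (body TEXT)")

payload = "<script>alert(1)</script>"
# Layer 1: parameterized insert -- the payload cannot alter the SQL statement.
conn.execute("INSERT INTO comments (body) VALUES (?)", (payload,))

# Layer 2: data read back from the database is still treated as untrusted
# and HTML-encoded before rendering.
stored = conn.execute("SELECT body FROM comments").fetchone()[0]
rendered = f"<p>{html.escape(stored)}</p>"

print(rendered)  # the script tag is neutralized on output
```

If an attacker slips a payload past input validation into the database, the output-encoding layer still stops it from executing in a victim's browser.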
Vulnerability remediation priorities
• Recommendations for remediation are found in the report
• Cover high-priority vulnerabilities first, then others when "affordable"
• Begin with risky vulnerabilities that are easy to remediate
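Combining the deck's Score = Risk * Impact formula with the "easy fixes first" rule gives a simple work ordering. The findings, risk/impact values, and effort estimates below are invented for illustration.

```python
# Each finding: (name, risk, impact, remediation_effort) -- values illustrative.
findings = [
    ("Reflected XSS",        3, 3, 1),
    ("Verbose error pages",  1, 1, 1),
    ("SQL injection",        3, 3, 2),
    ("Missing cookie flags", 2, 1, 1),
]

def priority(finding):
    name, risk, impact, effort = finding
    score = risk * impact          # Score = Risk * Impact, per the slide
    return (-score, effort)        # highest score first, then easiest fix first

for name, risk, impact, effort in sorted(findings, key=priority):
    print(f"{name}: score={risk * impact}, effort={effort}")
```

Ties on score are broken by effort, so among equally risky findings the quick wins come first, which matches the slide's advice.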
Vulnerabilities type occurrence in the 1st iteration (%)
http://cwe.mitre.org/top25/
http://projects.webappsec.org/w/page/13246989/Web%20Application%20Security%20Statistics
Improvements in Design and Coding stages
(table: vulnerability counts per group and per iteration; groups covered: Cross-Site Scripting, Injection, Insecure Transmission of credentials/tokens, Password Management, Cookie Security, Path Manipulation, Weak authentication, Open redirect, Logging of credentials, Cross-Site Request Forgery, Header Manipulation, Weak cryptography, File Upload, Forced Browsing, Log Forging, Information disclosure)

Flaws can appear in future iterations.
Security increases in every iteration.
Threats to the VT success
• Tested source code is not the same as PROD
• Testing environment differs from PROD
• Vulnerability testing tools can't cover everything automatically
• Hacking techniques evolve faster than service coverage
• If necessary, penetration tests are conducted by a 3rd party

100% risk- and vulnerability-free cannot be guaranteed, and security is not only a secure source code…
Some references
• Open Web Application Security Project (OWASP): www.owasp.org
• Web Application Security Consortium (WASC): www.webappsec.org
• Common Vulnerability Scoring System (CVSS): http://www.first.org/cvss/
• Common Weakness Enumeration (CWE): http://cwe.mitre.org
• Common Attack Pattern Enumeration and Classification (CAPEC):
http://capec.mitre.org/
• SANS Institute: www.sans.org
Questions?
Thank you!