OWASP Testing Guide v2
Matteo Meucci
OWASP Testing Guide lead
OWASP Italy Chair
matteo.meucci@owasp.org
+393493102234
6th OWASP AppSec Conference
Milan - May 2007
Copyright © 2007 - The OWASP Foundation
Permission is granted to copy, distribute and/or modify this document under the
terms of the Creative Commons Attribution-ShareAlike 2.5 License. To view this
license, visit http://creativecommons.org/licenses/by-sa/2.5/
The OWASP Foundation
http://www.owasp.org/
Agenda:
 The new Testing Guide: goals and deliverables
 The OWASP Testing Framework
 The Testing Methodology: how to test
 Reporting: how to evaluate the risk and write a report
 How the Guide will be useful to the web security industry
 Q&A
Introduction – Matteo Meucci
 OWASP Testing Guide lead 2007, thanks to the AoC (OWASP Autumn of Code)!
 6+ years in Information Security, focusing on Application Security
 OWASP Italy founder and Chair
 Consultant at BT Global Services
OWASP Projects
[Diagram: OWASP tools and documentation projects (Testing Guide, Code Review Guide, Building Guide, Honeycomb) mapped over a model relating Threat Agents, Attacks, Vulnerabilities, Countermeasures, Assets, System Impacts, and Business Impacts.]
What Is the OWASP Testing Guide?
Free and open…
“It's impossible to underestimate the importance of having this guide available in a completely free and open way” – Jeff Williams (OWASP Chair)
OWASP Testing Guide v2: Goals
 Review all the documentation on testing:
 July 14, 2004
 "OWASP Web Application Penetration Checklist", Version 1.1
 December 2004
 "The OWASP Testing Guide", Version 1.0
 Create a completely new project focused on Web Application Penetration
Testing
 Create a reference for application testing and describe the OWASP
methodology
 Our approach in writing this guide
 Open
 Collaborative
 Defined testing methodology
 Consistent
 Repeatable
 High quality
OWASP Testing Guide v2: Action Plan
Start in Oct 2006:
 Collect all old docs
 Brainstorming for the Index and template
 Involve major world experts in this field:
* Vicente Aguilera
* Mauro Bregolin
* Tom Brennan
* Gary Burns
* Luca Carettoni
* Dan Cornell
* Mark Curphey
* Daniel Cuthbert
* Sebastien Deleersnyder
* Stephen DeVries
* Stefano Di Paola
* David Endler
* Giorgio Fedon
* Javier Fernández-Sanguino
* Glyn Geoghegan
* Stan Guzik
* Madhura Halasgikar
* Eoin Keary
* David Litchfield
* Andrea Lombardini
* Ralph M. Los
* Claudio Merloni
* Matteo Meucci
* Marco Morana
* Laura Nunez
* Gunter Ollmann
* Antonio Parata
* Yiannis Pavlosoglou
* Carlo Pelliccioni
* Harinath Pudipeddi
* Alberto Revelli
* Mark Roxberry
* Tom Ryan
* Anush Shetty
* Larry Shields
* Dafydd Stuttard
* Andrew van der Stock
* Ariel Waissbein
* Jeff Williams
OWASP Testing Guide v2: Action Plan (2)
 Nov 2006:
 Write articles using our Wiki model
 Review articles
 Dec 2006:
 Review all the Guide
 Write the Guide in doc format
 Jan 2007:
 OWASP Testing Guide Release Candidate 1: 272 pages, 48 tests
 Feedback and review
 Feb 2007:
 OWASP Testing Guide v2 officially released
SANS Top 20 2007 cites our guide in section "C1. Web Applications"
http://www.sans.org/top20/?ref=1697#c1
"Congratulations on version 2 of the OWASP Testing Guide! It is an impressive and informative document that will greatly benefit the software development community." – Joe Jarzombek, Deputy Director for Software Assurance at the Department of Homeland Security
Testing Guide v2: Index
1. Frontispiece
2. Introduction
3. The OWASP Testing Framework
4. Web Application Penetration Testing
5. Writing Reports: value the real risk
Appendix A: Testing Tools
Appendix B: Suggested Reading
Appendix C: Fuzz Vectors
The OWASP Testing Framework in SDLC
 Principles of Testing: comparing the state of something against a defined and complete set of criteria
 We want security testing not to be a black art
 SDLC phases:
 Define
 Design
 Develop
 Deploy
 Maintenance
Testing Techniques
(SDLC phase: Before SDLC)
 Manual Inspections & Reviews
 Test to ensure that the appropriate policy and standards are in
place for the development team
(SDLC phase: Define & Design)
 Threat Modeling
 Security Requirements Review
 Design and Architecture Review: how the application works
 Create and Review Threat Models: develop realistic threat scenarios
Testing Techniques (cont.)
(SDLC phase: Development)
 Code Review
Code Walkthroughs (high-level)
Code Reviews (White Box Testing): static code reviews validate the code against a set of checklists (CIA triad, OWASP Top 10, ...)
(SDLC phase: Deploy & Maintenance)
 Penetration Testing
 Focus of this guide: Black Box Testing
 The process involves an active analysis of the application for any
weaknesses, technical flaws or vulnerabilities
SDLC & OWASP Guidelines
[Diagram: the OWASP guides mapped onto the SDLC phases of the OWASP Framework.]
Testing paragraph template
Brief Summary
Describe in "natural language" what we want to test. The target audience of this section is non-technical people (e.g. a client executive)
Description of the Issue
Short Description of the Issue: Topic and Explanation
Black Box testing and example
How to test for vulnerabilities:
Result Expected:
...
Gray Box testing and example
How to test for vulnerabilities:
Result Expected:
...
References
Whitepapers
Tools
Black Box vs. Gray Box
 Black Box: the penetration tester does not have any information about the structure of the application, its components and internals
 Gray Box: the penetration tester has partial information about the application internals, e.g. platform vendor, session ID generation algorithm
White box testing, defined as complete knowledge of the application internals, is beyond the scope of the Testing Guide and is covered by the OWASP Code Review Project
Testing Model
We have split the set of tests into 8 sub-categories, for a total of 48 controls:
 Information Gathering
 Business logic testing
 Authentication Testing
 Session Management Testing
 Data Validation Testing
 Denial of Service Testing
 Web Services Testing
 AJAX Testing
In the next slides we will look at a few examples of tests/attacks and at some real-world cases...
Information Gathering
The first phase in a security assessment is of course focused on collecting all available information about the target application.
Using public tools it is possible to make the application leak information: crafted requests elicit responses that reveal the versions and technologies in use (a raw-connection sketch follows the list below).
Available techniques include:
 Raw HTTP Connections (netcat)
 The good old tools: nmap, amap, ...
 Web Spiders
 Search engines (“Google Dorking”)
 SSL fingerprinting
 File extensions handling
 Backups and unreferenced files
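For example, the raw-connection technique can be scripted in a few lines of Python; this is a minimal sketch (not from the Guide), using www.example.com as a placeholder target:

# Minimal banner grab over a raw HTTP connection, in the spirit of netcat.
# Headers such as "Server:" and "X-Powered-By:" often reveal versions.
import socket

def grab_banner(host, port=80):
    request = ("HEAD / HTTP/1.1\r\n"
               "Host: " + host + "\r\n"
               "Connection: close\r\n\r\n")
    with socket.create_connection((host, port), timeout=5) as s:
        s.sendall(request.encode("ascii"))
        response = b""
        while True:
            chunk = s.recv(4096)
            if not chunk:
                break
            response += chunk
    return response.decode("latin-1", "replace")

print(grab_banner("www.example.com"))   # placeholder host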
Business logic testing
In this phase, we look for flaws in the application business logic rather
than in the technical implementation. Areas of testing include:
Rules that express the business policy (such as channels, location, logistics,
prices, and products)
Workflows that are the ordered tasks of passing documents or data from one
participant (a person or a software system) to another
One of the most common findings in this step of the analysis is a flaw in the order of actions that a user must follow: an attacker could perform the actions in a different order to gain some sort of advantage
This step is the most difficult to perform with automated tools, as it
requires the penetration tester to perfectly understand the business
logic that is (or should be) implemented by the application
Business logic testing (cont.)
FlawedPhone (a hypothetical mobile carrier whose email service notifies customers of each incoming email with a free SMS) was soon targeted by a fraud attack:
 The attacker bought a new FlawedPhone SIM card
 The attacker immediately requested to transfer the SIM card to another mobile carrier, which credits 0.05 € for each received SMS message
 When the SIM card was “transferred” to the new provider, the attacker started sending thousands of emails to her FlawedPhone email account
 The attacker had a 6-8 hour window before the email+SMS application had its list updated and stopped delivering messages
 By that time, the attacker had roughly 50-100 € of credit on the card, and proceeded to sell it on eBay
All FlawedPhone systems worked as expected, and there were no bugs in the application code. Still, the logic was flawed.
Authentication testing
Testing the authentication scheme means understanding how the application checks users' identity, and then using that information to try to circumvent the authentication mechanism and access the application without the proper credentials
Tests include the following areas:
• Default or Guessable Accounts
• Brute-force
• Bypassing Authentication
• Directory Traversal / File Include
• Vulnerable “Remember Password” and Password Reset
• Logout and Browser Cache Management
Session management testing
Session management is a critical part of a security test, as every application
has to deal with the fact that HTTP is by its nature a stateless protocol.
Session Management broadly covers all controls on a user from
authentication to leaving the application
Tests include the following areas:
 Analysis of the session management scheme (a first-pass sketch follows this list)
 Cookie and session token manipulation
 Exposed session variables
 Cross Site Request Forgery
 HTTP Exploiting
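As a first pass on the session management scheme, a tester can collect a batch of fresh session cookies and inspect them for obvious structure; a minimal sketch (the URL is a placeholder, and real analysis needs far more samples and proper statistics):

# Collect fresh session cookies from a login page and print them;
# sequential or timestamp-like values point to a predictable scheme.
import urllib.request

def collect_session_ids(url, n=10):
    ids = []
    for _ in range(n):
        with urllib.request.urlopen(url, timeout=5) as resp:
            for key, value in resp.getheaders():
                if key.lower() == "set-cookie":
                    ids.append(value.split(";", 1)[0])
    return ids

for sid in collect_session_ids("http://www.example.com/login"):  # placeholder
    print(sid)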
Example: Cross Site Request Forgery
Test if it is possible to force a user to submit an undesirable command to the
application he/she is currently logged into
 Also known as “Session Riding”
 A quite old type of attack, whose impact has always been
underestimated
 It relies on the fact that browsers automatically send information used
to identify a specific session
 Applications that allow a user to perform some action without requiring
some unpredictable parameter are likely to be vulnerable
 ...That means a lot of applications!
 All it takes is to trick the victim into following a link (e.g.: by visiting an
attacker-controlled site) while he/she is logged into the application
Example: Cross Site Request Forgery (cont.)
 trade.com is an online trading company
 trade.com uses an “über-paranoid triple-factor”™ authentication scheme,
but does not want to bother users with confirmations, since traders need
to act fast!
 The tester finds that a simple GET request such as:
https://trade.com/transfer?eu=90000&to=1234
is enough to execute a transaction
<html>
<title>I am a very evil HTML page... visit me ! :)</title>
<body>
..
<img src="https://trade.com/transfer?eu=90000&to=1234" width="0" height="0">
...
</body>
</html>
(The invisible image triggers the fund transfer without the victim noticing.)
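The countermeasure the tester looks for is exactly that "unpredictable parameter": a token bound to the session that a forged cross-site request cannot guess. A minimal server-side sketch (hypothetical helper names, not trade.com's actual code):

# Hypothetical anti-CSRF helpers: each state-changing request must carry
# a token derived from the session; the attacker's <img> GET cannot know it.
import hmac, hashlib, secrets

SECRET_KEY = secrets.token_bytes(32)   # kept server-side

def issue_token(session_id):
    return hmac.new(SECRET_KEY, session_id.encode(), hashlib.sha256).hexdigest()

def is_valid(session_id, token):
    return hmac.compare_digest(issue_token(session_id), token)

# The transfer endpoint would then reject
# https://trade.com/transfer?eu=90000&to=1234 without a valid &token=...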
Data validation testing
In this phase we test that all input is properly sanitized before being processed by the application, in order to avoid several classes of attacks:
Cross site scripting
Test that the application filters JavaScript code that might be executed by the victim in order to steal his/her cookies
HTTP Methods and XST
Test that the remote web server does not allow the TRACE HTTP method (a minimal check is sketched below)
SQL Injection
Test that the application properly filters SQL code embedded in the user input
Other attacks based on faulty input validation...
 LDAP/XML/SMTP/OS injection
 Buffer overflows
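That TRACE check can be automated in a few lines; a minimal sketch (the host is a placeholder), which sends a TRACE request and looks for our marker header echoed back:

# TRACE probe: if the server echoes the request back with status 200,
# the TRACE method is enabled and XST may be possible.
import http.client

def trace_enabled(host):
    conn = http.client.HTTPConnection(host, timeout=5)
    conn.request("TRACE", "/", headers={"X-Probe": "owasp-test"})
    resp = conn.getresponse()
    body = resp.read().decode("latin-1", "replace")
    conn.close()
    return resp.status == 200 and "X-Probe: owasp-test" in body

print(trace_enabled("www.example.com"))   # placeholder host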
Denial of Service Testing
DoS vulnerabilities in applications allow a malicious user to make certain functionality, or sometimes the entire website, unavailable. These problems are caused by bugs in the application, often triggered by malicious or unexpected user input
 Locking Customer Accounts
 User Specified Object Allocation
 User Input as a Loop Counter (sketched below)
 Writing User Provided Data to Disk
 Failure to Release Resources
 Storing too Much Data in Session
These tests are usually not performed in production environments
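To make one of these concrete, here is a sketch of the "User Input as a Loop Counter" flaw, using a hypothetical report handler (the names are illustrative, not from the Guide):

# Vulnerable: an attacker-supplied count drives server-side work, so a
# single request with rows=10**9 can exhaust CPU and memory.
MAX_ROWS = 500

def build_row(i):
    return "row %d" % i

def report_handler(params):
    rows = int(params.get("rows", "10"))          # attacker-controlled
    return "\n".join(build_row(i) for i in range(rows))

# Fixed: the same handler with an explicit server-side cap.
def safe_report_handler(params):
    rows = min(int(params.get("rows", "10")), MAX_ROWS)
    return "\n".join(build_row(i) for i in range(rows))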
Web Services Testing
The vulnerabilities are similar to other “classical” vulnerabilities, such as SQL injection, information disclosure and leakage, but web services also have unique XML/parser-related vulnerabilities.
WebScarab (available for free at www.owasp.org) provides a plug-in specifically targeted at Web Services. It can be used to craft SOAP messages that contain malicious elements, in order to test how the remote system validates input
Web Services Testing (cont.)
 XML Structural Testing
In this example, we see a snippet of XML code that violates the hierarchical
structure of this language. A Web Service must be able to handle this kind
of exception in a secure way
<?xml version="1.0" encoding="ISO-8859-1"?>
<note id="666">
<to>OWASP
<from>EOIN</from>
<heading>I am Malformed </to>
</heading>
<body>Example of XML Structural Test</body>
</note>
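One way to check that the exception is handled "in a secure way" is to post the malformed message and inspect the fault response for leaked internals; a rough sketch (the endpoint is a placeholder):

# POST the malformed XML and print the start of the response: a generic
# SOAP Fault is fine, a stack trace or parser internals are a finding.
import urllib.request, urllib.error

MALFORMED = ('<?xml version="1.0" encoding="ISO-8859-1"?>'
             '<note id="666"><to>OWASP<from>EOIN</from>'
             '<heading>I am Malformed </to></heading>'
             '<body>Example of XML Structural Test</body></note>')

req = urllib.request.Request(
    "http://somehost/Service/Service.asmx",          # placeholder endpoint
    data=MALFORMED.encode("latin-1"),
    headers={"Content-Type": "text/xml; charset=ISO-8859-1"},
)
try:
    with urllib.request.urlopen(req, timeout=5) as resp:
        body = resp.read().decode("latin-1", "replace")
except urllib.error.HTTPError as e:
    body = e.read().decode("latin-1", "replace")
print(body[:500])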
Web Services Testing (cont.)
XML Large payload
Another possible attack consists of sending a very large payload in an XML message. Such a message might deplete the resources of a DOM parser, which builds the entire document in memory
<Envelope>
<Header>
<wsse:Security>
<Hehehe>I am a Large String (1MB)</Hehehe>
<Hehehe>I am a Large String (1MB)</Hehehe>
<Hehehe>I am a Large String (1MB)</Hehehe>…
<Signature>…</Signature>
</wsse:Security>
</Header>
<Body>
<BuyCopy><ISBN>0098666891726</ISBN></BuyCopy>
</Body></Envelope>
Web Services Testing (cont.)
Naughty SOAP attachments
Binary files, including executables and document types that can contain
malware, can be posted using a web service in several ways
POST /Service/Service.asmx HTTP/1.1
Host: somehost
Content-Type: text/xml; charset=utf-8
Content-Length: length
SOAPAction: http://somehost/service/UploadFile
<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
<soap:Body>
<UploadFile xmlns="http://somehost/service">
<filename>eicar.pdf</filename>
<type>pdf</type>
<chunk>X5O!P%@AP[4\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*</chunk>
<first>true</first>
</UploadFile>
</soap:Body>
</soap:Envelope>
AJAX Testing
AJAX (Asynchronous JavaScript and XML) is a web development technique used to create more interactive web applications. It relies on the XMLHttpRequest object and JavaScript to make asynchronous requests for all communication with the server-side application.
Main security issues:
 AJAX applications have a greater attack surface, because a big share of the application logic is moved to the client side
 AJAX programmers seldom keep an eye on what is executed by the client and what is executed by the server
 Exposed internal functions of the application
 Client access to third-party resources with no built-in security and encoding mechanisms
 Failure to protect authentication information and sessions
 AJAX Bridging
AJAX Testing (cont.)
While in traditional web applications it is very easy to enumerate the points of interaction between clients and servers, when testing AJAX pages things get a bit more complicated, as server-side AJAX endpoints are not as easy or consistent to discover
To enumerate endpoints, two approaches must be combined:
 Look through the HTML and JavaScript, e.g. for XMLHttpRequest objects (a scan is sketched below)
 Use a proxy to monitor traffic
 Tools: OWASP Sprajax or the Firebug add-on for Firefox
Then you can test each endpoint as described before (SQL injection, etc.)
...and don't forget AJAX's potential in prototype hijacking and resident XSS!
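The first approach can be sketched in a few lines; this is a heuristic scan (a hypothetical script, and regexes are not a JavaScript parser):

# Scan saved HTML/JS files for XMLHttpRequest usage and open() calls,
# printing candidate AJAX endpoints with file and line number.
import re, sys, pathlib

PATTERNS = [
    re.compile(r"XMLHttpRequest"),
    re.compile(r"\.open\(\s*['\"](?:GET|POST)['\"]\s*,\s*['\"][^'\"]+['\"]"),
]

for path in sys.argv[1:]:
    text = pathlib.Path(path).read_text(errors="replace")
    for lineno, line in enumerate(text.splitlines(), 1):
        if any(p.search(line) for p in PATTERNS):
            print("%s:%d: %s" % (path, lineno, line.strip()))

# Usage: python scan_endpoints.py page.html app.js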
AJAX Testing (cont.)
With Firebug it is possible to efficiently inspect AJAX applications.
Testing Report: model
The OWASP Risk Rating Methodology
 Estimate the severity of each of these risks to your business
 This is not a universal risk rating system: a vulnerability that is critical to one organization may not be very important to another
 A simple approach, to be tailored for every case
 Standard risk model: Risk = Likelihood * Impact
Step 1: identifying a risk
You'll need to gather information about:
 the vulnerability involved
 the threat agent involved
 the attack being used
 the impact of a successful exploit on your business
Testing Report: likelihood
Step 2: factors for estimating likelihood
Generally, identifying whether the likelihood is low, medium, or high
is sufficient.
Threat Agent Factors:
 Skill level (0-9)
 Motive (0-9)
 Opportunity (0-9)
 Size (0-9)
Vulnerability Factors:
 Ease of discovery (0-9)
 Ease of exploit (0-9)
 Awareness (0-9)
 Intrusion detection (0-9)
Testing Report: impact
Step 3: factors for estimating impact
Technical impact:
 Loss of confidentiality (0-9)
 Loss of integrity (0-9)
 Loss of availability (0-9)
 Loss of accountability (0-9)
Business impact:
 Financial damage (0-9)
 Reputation damage (0-9)
 Non-compliance (0-9)
 Privacy violation (0-9)
Testing Report: value the risk
Step 4: determining the severity of the risk
 In the example above, the likelihood is MEDIUM and the technical impact is HIGH, so from a technical perspective the overall severity is HIGH. But the business impact is actually LOW, so the overall severity is best described as LOW as well.
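A minimal sketch of this arithmetic, assuming the common convention of averaging the 0-9 factors and bucketing the result (LOW below 3, MEDIUM below 6, HIGH otherwise); the factor values are invented for illustration:

# Average the 0-9 factors and bucket into LOW / MEDIUM / HIGH.
def level(score):
    return "LOW" if score < 3 else "MEDIUM" if score < 6 else "HIGH"

def rate(factors):
    score = sum(factors.values()) / len(factors)
    return score, level(score)

likelihood = {"skill": 5, "motive": 4, "opportunity": 7, "size": 3,
              "discovery": 6, "exploit": 5, "awareness": 4, "detection": 2}
business = {"financial": 2, "reputation": 3, "compliance": 1, "privacy": 2}

l_score, l_level = rate(likelihood)   # 4.5 -> MEDIUM
b_score, b_level = rate(business)     # 2.0 -> LOW

# With MEDIUM likelihood but LOW business impact, the overall severity
# is best described as LOW, matching the example above.
print(l_level, b_level)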
Writing Report
I. Executive Summary
II. Technical Management Overview
III. Assessment Findings
IV. Toolbox
How the Guide will help the security industry
Pen-testers:
 A structured approach to the testing activities
 A checklist to be followed
 A learning and training tool
Clients:
 A tool to understand web vulnerabilities and their impact
 A way to check the quality of the penetration tests they buy
More generally, the Guide aims to provide a pen-testing standard that creates a 'common ground' between the pen-testing industry and its clients.
This will raise the overall quality and understanding of this kind of activity, and therefore the general level of security in our infrastructures
What’s next
OWASP Testing Guide next steps:
 Continuously improve the Testing Guide: it’s a living document!
 Start a new project: want to contribute to the next version?
 Improve the client-side testing (see the next great talk by Stefano Di Paola)
 Translate it: the Guide has just been translated into Spanish, thanks to Daniel P.F.!
 Thanks to Alberto Revelli for producing some of the slides discussed at EuSecWest07
Thank you!
http://www.owasp.org
http://www.owasp.org/OWASP_Testing_Project
matteo.meucci@owasp.org