Building An In-House
Application Security
Assessment Team
OWASP
Keith Turpin
The Boeing Company
keith.n.turpin@boeing.com
206-683-9667
11/13/2009
Copyright © The OWASP Foundation
Permission is granted to copy, distribute and/or modify this document
under the terms of the OWASP License.
Copyright © 2009 Boeing. All rights reserved.
The OWASP Foundation
http://www.owasp.org
Contracted Services Considerations
 Some Advantages:
Highly skilled
Established tools, processes and standards
Unbiased
Available as needed
 Some Disadvantages:
Expensive, especially for an extended engagement
Less control and flexibility
Not familiar with company processes and culture
Rotating staff
Planning
 Considerations for establishing an internal
team:
 Time to staff and train the team
 Overlap of external and internal teams
 Development of processes and standards
 Acquiring necessary tools
Service Model
 Define the services your team will provide. This
will be greatly influenced by:
 The team’s size and skills
 The number of applications
you have to support
 The tools available
 The level of executive support
 The funding model
 Who pays for your services
 The team’s role
Development support, pre-deployment testing, or post-deployment auditing and pen testing
Staffing the Team
 Decide how to staff your team and what skills
you need. Possible candidates include:
 Experienced Application Testers
 This is ideal from a skills standpoint, but people in this
category may be harder to find, cost more, and may not be
familiar with your company or fit its culture.
 Experienced Developers
 Developers will have a good understanding of the
technologies, but may not understand security principles.
Their focus is on what an application is intended to do, not
what it can be made to do.
Staffing the Team - continued
 Other IT Security Professionals
 They have a good understanding of security principles, but
may lack specific technical skills. However, some skills may
provide a useful overlap, like experienced OS or network
testers.
 Service and Project Managers
 Building a new team, defining processes and standards,
managing work flow and handling customer relations requires
a set of skills as important as, but distinct from, technical
testing skills.
Selecting Tools
 There are a lot of options when it comes to
tools. What you choose depends on the
services you want to provide, your team’s skills
and your budget.
Commercial vs. Free or Low Cost Tools
 Commercial tools scale to support enterprise use, utilize a
higher degree of automation and come with product support.
They also come with a big price tag.
 Open source and low cost tools allow for more customization,
are free or inexpensive, usually have a supportive user
community, but often require a higher degree of user
knowledge and skill.
Selecting Tools - continued
 Types of Tools
Vulnerability Scanners
 Commercial examples include IBM AppScan, HP WebInspect
and Cenzic Hailstorm
Source Code Analysis Tools
 There are commercial options like Fortify or open source tools
like the OWASP Yasca Project
Client Side Web Proxies
 Options include WebScarab, Burp Suite and Charles Proxy
Other Tools
 These include password crackers, hex editors, text extractors,
browser plug-ins, integrated development environments,
network mapping, network traffic analysis and exploitation
tools
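Commercial source code analyzers like Fortify (and open source tools like OWASP Yasca) do far more than simple pattern matching, but the core idea of flagging risky constructs in source can be illustrated with a toy sketch. The patterns below are invented examples for illustration, not rules from any real analyzer:

```python
import re

# Toy illustration only: real analyzers add data-flow and control-flow
# analysis. These regex patterns are invented examples, not real rules.
RISKY_PATTERNS = {
    "Hard-coded password": re.compile(r"password\s*=\s*[\"'][^\"']+[\"']", re.I),
    "Use of eval": re.compile(r"\beval\("),
}

def scan_source(text):
    """Return (line_number, finding_name) pairs for lines matching a pattern."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for name, pattern in RISKY_PATTERNS.items():
            if pattern.search(line):
                hits.append((lineno, name))
    return hits

sample = 'db_password = "s3cret"\nresult = eval(user_input)\n'
print(scan_source(sample))
```

Pattern-only scanning like this produces exactly the false positives discussed later in this deck, which is why findings must always be validated.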
What to Assess
 Measuring an application’s risk:
The Types of Users
 Privileged Users, employees, suppliers,
customers or the general public
The Sensitivity of the Data
 Intellectual Property, PII or other regulatory requirements
Availability and Integrity Requirements
 The impact to the business if compromised
Technology and Environmental Considerations
 What technologies are used, where is it deployed, …
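Risk factors like these can be combined into a simple score for prioritizing which applications to assess first. The factor names, weights and scale below are invented for this sketch; they are not Boeing's or OWASP's actual scoring model:

```python
# Illustrative only: factor values and weights are assumptions for this sketch.
RISK_FACTORS = {
    "user_population": {"privileged": 1, "employees": 2, "suppliers": 3,
                        "customers": 4, "public": 5},
    "data_sensitivity": {"public": 1, "internal": 2, "pii": 4,
                         "intellectual_property": 5},
    "business_impact": {"low": 1, "medium": 3, "high": 5},
}

def risk_score(user_population, data_sensitivity, business_impact):
    """Sum the factor scores; higher totals suggest earlier, deeper assessment."""
    return (RISK_FACTORS["user_population"][user_population]
            + RISK_FACTORS["data_sensitivity"][data_sensitivity]
            + RISK_FACTORS["business_impact"][business_impact])

print(risk_score("public", "pii", "high"))  # 5 + 4 + 5
```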
Gather Necessary Information
 Before starting an assessment you will need to
gather important information:
 Application contacts
 Server contacts
 The process for getting accounts
 A description of what the application does
 A description or diagram of the system architecture
 The types of interfaces (human and system to system)
 The development platform
 The URL(s) or application installation files
 Is the source code available?
 Is this an in-house, open source or commercial application?
 Are there schedule constraints or test windows?
Assessment Planning Meeting
 Meet with the application development and
support teams:
Get a demonstration of the application
Review the information gathered to support the
assessment
Discuss the testing process and ground rules
 No changes to the code during testing
 Backups of the application servers and databases
 How to address system crashes during testing
 Database corruption issues
 Emails generated by the application
Testing Notifications
 You should have a process to notify affected
parties before the actual testing begins.
Key system contacts
Intrusion detection teams
Other assessors
 Information to include in the notification:
Source IP addresses
Target IP addresses, URL, system name
Testing schedule
Assessment team contacts
Conducting the Assessment
 If you are using automated scanning tools,
beware of false positives and negatives.
Pattern recognition has limitations
Combine various testing methods
 Automated scanning
 Code review
 Manual testing
Learn what your tools do and do not do well
Validate every finding
Keep detailed notes
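Combining testing methods also gives you a cheap cross-check: findings corroborated by more than one method are likely real, while findings seen only by the scanner deserve manual validation. A minimal sketch, assuming each source has already been normalized into (url, finding_name) tuples (a format invented here; real tool outputs vary widely):

```python
# Hedged sketch: real tool outputs vary; assume each source is pre-normalized
# into a set of (url, finding_name) tuples before comparison.
def triage(scanner_findings, manual_findings):
    """Split results into corroborated findings, scanner-only findings that
    need manual validation (potential false positives), and manual-only
    findings the scanner missed (potential false negatives)."""
    corroborated = scanner_findings & manual_findings
    needs_validation = scanner_findings - manual_findings
    missed_by_scanner = manual_findings - scanner_findings
    return corroborated, needs_validation, missed_by_scanner

scan = {("/login", "SQL Injection"), ("/search", "XSS")}
manual = {("/login", "SQL Injection"), ("/admin", "Forced Browsing")}
corro, validate, missed = triage(scan, manual)
```

Set intersection and difference are enough here because the point is the workflow, not the matching logic; in practice, matching findings across tools is itself fuzzy.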
Establish Standards
 Assessments performed by two different people, or by
the same person over time, may present the same
finding very differently
This may result in inconsistent descriptions of the
vulnerability or different recommendations for
remediation
Without standard findings you may also find it
difficult to produce meaningful metrics about
discovered vulnerabilities
Standard Findings
 Opinions about how to standardize
software vulnerabilities are like noses,
everyone has one.
At Boeing we have categorized vulnerabilities
into approximately 70 standard findings like:
 SQL Injection
 Path Traversal
 Session Fixation
 Excessive Authentication Attempts
 Forced Browsing
 System Information Leakage
Data Elements for Standard Findings
 Each finding is made up of the following data
elements:
Name
Control Classification
Severity = Likelihood + Impact
Company Policy References
Industry References
Summary Description
Impact Statement
Detailed Description
Recommendation
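The data elements above map naturally onto a simple record type. A sketch of what such a record might look like; the field names, the 1-5 scales, and the example values are assumptions for illustration, not Boeing's actual schema:

```python
from dataclasses import dataclass, field

# Sketch only: field names and 1-5 scales are invented for illustration.
@dataclass
class StandardFinding:
    name: str
    control_classification: str
    likelihood: int          # assumed scale, e.g. 1 (rare) to 5 (almost certain)
    impact: int              # assumed scale, e.g. 1 (minimal) to 5 (severe)
    company_policy_refs: list = field(default_factory=list)
    industry_refs: list = field(default_factory=list)
    summary: str = ""
    impact_statement: str = ""
    detailed_description: str = ""
    recommendation: str = ""

    @property
    def severity(self):
        # Per the slide: Severity = Likelihood + Impact
        return self.likelihood + self.impact

sqli = StandardFinding("SQL Injection", "Input and output controls",
                       likelihood=4, impact=5, industry_refs=["CWE-89"])
```

Deriving severity from likelihood and impact, rather than storing it, keeps the three values from drifting out of sync as findings are edited.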
Control Classifications
 We group individual vulnerabilities into control
classifications. This helps us determine how
effective we are at implementing control types.
 Our classifications:
Input and output controls
Authentication and password management
Authorization and access management
Sensitive information storage or transmission
System configuration and management
General coding errors
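Once every finding carries a control classification, the metrics fall out of a simple count. A minimal sketch, assuming findings have already been tagged with one of the classifications above:

```python
from collections import Counter

# Sketch: count findings per control classification to see which
# control types fail most often across assessments.
def classification_metrics(findings):
    """findings: iterable of (finding_name, classification) pairs."""
    return Counter(classification for _, classification in findings)

findings = [
    ("SQL Injection", "Input and output controls"),
    ("Path Traversal", "Input and output controls"),
    ("Session Fixation", "Authentication and password management"),
]
print(classification_metrics(findings))
```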
Reporting Findings
 Developing a standardized reporting template
will allow you to deliver a consistent, branded
message. Key elements of our reports include:
Cover Page
Executive Summary
Findings Summary
Detailed Findings
Conclusion
Appendixes
Attachments – (optional)
Reporting Findings - Continued
Cover Page:
 Provides information necessary to identify the
assessment, what was assessed and who the key people
involved were.
Executive Summary:
 A quick summary of the assessment which may include:
– Description of the assessment
– Identification of the application being assessed
– A list of findings by severity
– Testing techniques
Findings Summary:
 A brief description of each finding mapped to control
classifications.
Reporting Findings - Continued
Detailed Findings:
 Includes all the standard findings data elements
Conclusion:
 Summary of assessment results, discussion of next steps and
links to additional resources
Appendixes:
 Information on how severity ratings are determined, description
of control classifications
Attachments:
 Typically these are raw scan files or other supplemental data
files
Managing Corrective Actions
 Once a report is issued you need a closed loop
process to ensure serious issues are addressed.
Considerations include:
Tracking Findings:
 Critical and high findings should be tracked to resolution
 Medium findings are less straightforward
 Low or informational findings may not be value-added
Customer Responses to Findings:
 Implement a technical fix to address the finding
 Implement a process fix to address the finding
 The business formally accepts the risk of not remediating
When to Re-Evaluate An Application
 Depending on the number of applications you
support and the frequency with which they
change you may need to establish re-evaluation
guidelines. Some criteria to consider include:
 Fixes to previously accepted risk
 User population changes
 Data sensitivity changes
 Business’s dependency on the application has increased
 Authentication mechanism has changed
 Authorization management code has changed
 New functionality or user interfaces are being added
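Criteria like these can be encoded as explicit triggers so that re-evaluation is a policy check rather than a judgment call each time. A sketch under the assumption that each application keeps a simple profile of boolean flags (the trigger names mirror the list above but are otherwise invented):

```python
# Sketch: trigger names mirror the re-evaluation criteria above;
# the profile format is an assumption for illustration.
REEVALUATION_TRIGGERS = (
    "previously_accepted_risk_fixed",
    "user_population_changed",
    "data_sensitivity_changed",
    "business_dependency_increased",
    "authentication_changed",
    "authorization_code_changed",
    "new_functionality_added",
)

def needs_reevaluation(app_profile):
    """app_profile: dict of trigger name -> bool; return the triggers that fired."""
    return [t for t in REEVALUATION_TRIGGERS if app_profile.get(t)]

profile = {"authentication_changed": True, "user_population_changed": False}
```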
Application Assessment
Process Flow
 The process begins one of two ways:
 Option 1 – Requested Assessment: The requester completes an
online Request for Service (RFS) form and the assessment team
conducts an intake review.
 Option 2 – Targeted Assessment: An asset is selected for
assessment and the asset owner is notified of assessment
initiation.
 In either case, the customer completes the Pre-Assessment
Questionnaire and an assessment planning meeting takes place
to better understand the target and scope the assessment.
 If the decision is made not to proceed to a technical
assessment, or the assessment is not continued after planning,
an Assessment Complete or Closed Letter is issued and the
process ends.
 Otherwise an assessor is assigned and the assessment is
conducted.
 If there were findings, a Full Assessment Findings Report is
issued and the findings are worked to resolution through the
Corrective Action Process; if there were none, the process ends.
Conclusion
 Building an assessment team from the ground
up takes:
Executive support
A lot of planning
Staffing
The right tools
Training
Standards
Supporting processes
The End
Questions?