Microsoft Security Development Lifecycle (SDL)
Optimization Model
Implementer Resource Guide: Basic to Standardized
Published: November 2008
Abstract
The Microsoft® Security Development Lifecycle (SDL) Optimization Model is designed to facilitate gradual,
consistent, and cost-effective implementation of the SDL by development organizations outside of Microsoft. The
model helps those responsible for integrating security and privacy in their organization’s software development
lifecycle to assess their current state and to gradually move their organizations towards the adoption of the proven
Microsoft program for producing more secure software. The SDL Optimization Model enables development
managers and IT policy makers to assess the state of security in development. They can then create a vision
and road map for reducing customer risk by creating more secure and reliable software in a cost-effective,
consistent, and gradual manner. Although achieving security assurance requires long-term commitment, this guide
outlines a plan for attaining measurable process improvements quickly, with realistic budgets and resources.
This is the third of five resource guides. It explains the key practices for organizations beginning at the Basic level,
identified as those with few or undefined software development security practices. This guide introduces a self-assessment checklist of relevant capabilities and advice for conducting and managing the practices to achieve
these capabilities, and it provides links to relevant resources where additional content can be found. You can use
the information contained in this guide to help you move from the Basic level to the Standardized level. For a full
description of the model, concepts, capabilities, and maturity levels, please see the first guide in this series,
Microsoft Security Development Lifecycle (SDL) Optimization Model: Introduction to the Optimization Model.
For the latest information, more detailed descriptions, and the business benefits of the Microsoft Security
Development Lifecycle, go to http://www.microsoft.com/SDL.
The information contained in this document represents the current view of Microsoft Corporation on the issues discussed as of the
date of publication. Because Microsoft must respond to changing market conditions, it should not be interpreted to be a commitment on
the part of Microsoft, and Microsoft cannot guarantee the accuracy of any information presented after the date of publication.
This document is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED, OR STATUTORY, AS
TO THE INFORMATION IN THIS DOCUMENT OR INFORMATION REFERENCED OR LINKED TO BY THIS DOCUMENT.
Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of
this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means
(electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of
Microsoft Corporation.
Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject
matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this
document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.
© 2008 Microsoft Corporation. All rights reserved. This work is licensed under the Creative Commons Attribution-Non-Commercial
License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc/2.5/ or send a letter to Creative Commons,
543 Howard Street, 5th Floor, San Francisco, California, 94105, USA.
Microsoft, Microsoft Office Word, InfoPath, Visual Studio, Win32, Visual C#, Visual C++, SQL Server, ActiveX, and Windows are
either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries.
All other trademarks are property of their respective owners.
Contents
Resource Guide Overview
Audience
SDL Optimization Levels
Preparing to Implement SDL Requirements
Phased Approach
Implementation services
Implementer Guide—Basic to Standardized
Capability Area: Training, Policy, and Organizational Capabilities
Introduction
Capability: Training
Capability: Bug Tracking
Capability Area: Requirements and Design
Introduction
Capability: Risk Assessment
Capability: Quality Gates
Capability: Threat Modeling
Capability Area: Implementation
Introduction
Capability: Secure Coding Policies
Capability: Cross-Site Scripting (XSS) and SQL Injection Defenses
Capability Area: Verification
Introduction
Capability: Dynamic Analysis and Application Scanning (Web Applications)
Capability: Fuzzing
Capability: Penetration Testing
Capability Area: Release and Response
Introduction
Capability: Final Security Review
Capability: Project Archiving
Capability: Response Planning and Execution
Resource Guide Overview
Audience
This document is designed for development managers and IT decision makers who are responsible for
planning, deploying, and governing security and privacy measures in software development and who want
to implement the practices and concepts of the Microsoft® Security Development Lifecycle (SDL).
SDL Optimization Levels
The SDL Optimization Model defines four optimization, or maturity, levels (Basic, Standardized, Advanced, and
Dynamic) for each of the capability areas described in the following pages. This guide is concerned with helping
organizations move from the Basic level to the Standardized level. The characteristics of these two
optimization levels are described in the following table.
Preparing to Implement SDL Requirements
The Standardized level is an important step on the road to the SDL. At the Standardized level, security
practices and standards are beginning to be introduced into the development lifecycle. Organizations at
this level are able to assess the security and privacy risk of new projects and to select the best candidates
for implementing security and privacy practices into the development lifecycle. The organization has
realized the value of the SDL and has made the decision to embark on the path of greater adoption.
Security and privacy practices are only applied to a few pilot projects. Much of the effort is spent at later
phases of the lifecycle and in security response, where improvements come at a greater cost than those
incurred with the more integrated and proactive practices at the higher optimization levels.
Phased Approach
Microsoft recommends a phased approach to meeting the requirements in each of the SDL capability
areas. The four phases are shown in the following illustration.
In the Assess phase, you determine the current capabilities and resources within your organization.
In the Identify phase, you determine what you need to accomplish and which capabilities you want to
incorporate.
In the Evaluate and Plan phase, you determine what you need to do to implement the capabilities
outlined in the Identify phase.
In the Deploy phase, you execute the plan that you built in the previous phase.
Implementation services
Implementation services for the projects outlined in this document are provided by Microsoft partners
and Microsoft Services. For assistance in implementing the SDL optimization improvements highlighted in
the SDL Optimization Model Implementer Resource Guides, please refer to the SDL Pro Network page on
the SDL Web site, or visit the Microsoft Services Web site.
Implementer Guide—Basic to Standardized
The following section provides detailed information and guidance on fulfilling the requirements of each of
the five capability areas in the Standardized level. The checkpoints in each capability area follow the
requirements discussed in the SDL Optimization Model: Self-Assessment Guide for the Standardized level.
Start with the Self-Assessment Guide, and then use the detailed implementation guidance that follows to
increase maturity in the areas necessary for your organization.
Capability Area: Training, Policy, and Organizational
Capabilities
Introduction
Training, Policy, and Organizational Capabilities is an SDL optimization capability area and the foundation
for implementing many capabilities in the SDL Optimization Model. Ongoing Training, Policy, and
Organizational Capabilities focus on capabilities and practices at an organizational level that cross many
projects and can be implemented in parallel with product release cycles. The main benefits of developing
these capabilities include improved security awareness and skills, increased standardization in security
development practices, internal security metrics for measuring effectiveness, and clearer executive
support for security in development.
Capability: Training
Overview
The average developer or tester may know very little about building secure software. Increasing their
knowledge of the executive commitment to security, common security concerns and pitfalls, and the
resources available to them is critical for enabling developers to create more secure code. Training is
therefore one of the foundational practices of the SDL.
Phase 1: Assess
The Assess phase involves identifying the requirements for training. Determine the proper training
content for your organization by considering:
 Languages, platforms, and technologies employed.
 Development, test environment, and tools.
 Current level of security awareness.
 Availability and knowledge of organizational security standards, including technology choices, code
libraries, and standard mitigations.
 Security threats and known past and current vulnerabilities.
 Number of developers and testers to be trained.
 Appropriate length of the course.
Phase 2: Identify
The next phase involves identifying the resources available for curriculum creation and delivery of a
“Basics of Secure Design, Development, and Test” or equivalent training course. Consider the following:
 Discover existing organizational security standards, tools, libraries, and documentation.
 Identify trainers within the organization or qualified third parties.
 Select an appropriate base curriculum.
 Identify organization-specific additions, modifications, or examples for base curriculum.
A number of resources are available for creating curricula:
 SDL Process Guidance—Pre-SDL Requirements: Security Training.
 A six-part video lecture version of the Microsoft “Basics of Secure Design, Development, and Test”
course is available on the CD-ROM supplement to the book, The Security Development Lifecycle:
SDL: A Process for Developing Demonstrably More Secure Software (ISBN: 9780735622142), by
Michael Howard and Steve Lipner (Microsoft Press, 2006).
 The Open Web Application Security Project (OWASP) also provides presentations and videos on a
variety of application security topics that may be useful in developing or supplementing training
content.
Phase 3: Evaluate and Plan
During the Evaluate and Plan phase, the detailed course curriculum and materials will be assembled and a
delivery mechanism decided upon. While in-person, instructor-led training with a hands-on lab
component is most effective, on-demand video or interactive computer-based training (CBT) may be a
preferable option for organizations with budget constraints or other complications, such as high staffing
volatility with contractors or offshore development centers.
Note
Training is not intended to make all of your engineering staff into security experts. One
of the most important goals of training is to make members of the engineering
organization aware of when they may have a security issue that needs investigation,
whom to ask for help, and what resources are available to them. The initial training contact with a
given staff member should also introduce the organization’s commitment to security and quality
gates. This will be different for every organization, so no security awareness and training course
should be completely “off-the-shelf.” If you elect to use a video or CBT, it is still recommended that
a member of the security expert team make in-person contact with all staff to introduce these
concepts (for example, as a 20-30 minute module added to general orientation and training for
new technical staff).
The general technical content was selected in the Identify phase, but several other practices and
checkpoints in the SDL Optimization Model should be completed before the course materials can be
finalized. The outputs of the Quality Gates and Secure Coding Policies practices in this guide should be a
part of the basic security course material.
Phase 4: Deploy
The goal of the Deploy phase is to deliver the training to the engineering organization. All developers and
testers in the pilot teams should have completed the basic training course.
Checkpoint: Training
Requirement
 Determine training needs and content for developers and testers in the organization; create or
acquire appropriate curriculum.
 Create or acquire “Basics of Secure Design, Development, and Test” or equivalent course material.
 Development and test staff for selected pilot projects successfully complete basic security training.
If you have completed the steps listed above, your organization has met the minimum requirements of
the Standardized level for Training. We recommend that you follow additional best practices for training
addressed in the SDL Process Guidance at Microsoft MSDN.
Capability: Bug Tracking
Overview
In order to assess progress over time and to refine SDL practices, you must know what types of security
bugs are being found and which practices are most productive at uncovering security vulnerabilities. The
ability to effectively identify and track issues that have a security or privacy impact also assists in
enforcing quality gates.
Phase 1: Assess
The Assess phase involves gathering potential requirements for security bug classification. Bugs should be
tagged in three security categories:
 Security Cause: What was the root cause of the vulnerability? The following resources are useful
examples of general software and Web-application specific vulnerability categorizations:
 Web Application Security Consortium (WASC) Web Application Security Statistics project
 OWASP Top 10 2007 (The top 10 security vulnerabilities for 2007)
 The MITRE Corporation’s Common Weakness Enumeration
 The Open Source Vulnerability Database
Note
There is a wide variety of methods for categorizing the causes and types of
security vulnerabilities, but many are too complex to address at the Basic to
Standardized level. You don’t want to present staff members (who may have a
minimum of security training and experience) with a list of 50 options they may not
understand. When starting out, keep your categories broad, base them on the most
common vulnerabilities found in your organization’s code and products, and don’t shy away
from having a lot of bugs in the “Unknown” or “Other” categories.
 Security Effect: What security property can an attacker violate with the vulnerability?
Microsoft recommends categorizing security bugs with the STRIDE (Spoofing,
Tampering, Repudiation, Information Disclosure, Denial of Service, and Elevation of Privilege)
categorization. For more information, see Uncover Security Design Flaws Using the STRIDE
Approach.
 How Found: Identifying which practices are most productive in uncovering security
vulnerabilities helps guide implementation of the SDL. Categories should correspond to the SDL
and other development lifecycle practices, such as:
 Design Review
 Threat Modeling
 Code Analysis
 Code Review
 Functional Quality Assurance Test
 Third-Party Penetration Test
 Final Security Review
 Externally Reported
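The three-part tagging scheme above can be sketched as a simple data model. The category values mirror the lists in this section; the record type, its field names, and the sample "Input Validation" cause label are illustrative only, not part of any particular bug-tracking product:

```python
from dataclasses import dataclass
from enum import Enum

class SecurityEffect(Enum):
    """STRIDE categories: which security property an attacker can violate."""
    SPOOFING = "Spoofing"
    TAMPERING = "Tampering"
    REPUDIATION = "Repudiation"
    INFORMATION_DISCLOSURE = "Information Disclosure"
    DENIAL_OF_SERVICE = "Denial of Service"
    ELEVATION_OF_PRIVILEGE = "Elevation of Privilege"

class HowFound(Enum):
    """Which lifecycle practice uncovered the vulnerability."""
    DESIGN_REVIEW = "Design Review"
    THREAT_MODELING = "Threat Modeling"
    CODE_ANALYSIS = "Code Analysis"
    CODE_REVIEW = "Code Review"
    FUNCTIONAL_QA_TEST = "Functional Quality Assurance Test"
    THIRD_PARTY_PEN_TEST = "Third-Party Penetration Test"
    FINAL_SECURITY_REVIEW = "Final Security Review"
    EXTERNALLY_REPORTED = "Externally Reported"

@dataclass
class SecurityBugTags:
    # Security Cause is a broad, free-form root-cause label; keep the list
    # short and allow "Unknown"/"Other", as the note above recommends.
    cause: str
    effect: SecurityEffect
    how_found: HowFound

bug = SecurityBugTags(
    cause="Input Validation",
    effect=SecurityEffect.ELEVATION_OF_PRIVILEGE,
    how_found=HowFound.CODE_REVIEW,
)
```

In practice these would become custom fields in your existing bug-tracking system rather than standalone code.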
Phase 2: Identify
In the Identify phase, the security expert team should identify the relevant bug-tracking and management
systems, along with the people and processes necessary to add the new security categorizations.
Phase 3: Evaluate and Plan
During the Evaluate and Plan phase, the new categories are added to the system.
Phase 4: Deploy
In the Deploy phase, the engineering organization is trained to categorize new vulnerabilities with the
additional security categorizations. This might be part of the general security basics training, or it might be
done simply by means of an e-mail campaign.
Checkpoint: Bug Tracking
Requirement
Bug databases and tracking software can record and classify vulnerabilities by
security cause, effect, and method of discovery.
If you have completed the step listed above, your organization has met the minimum requirements of the
Standardized level for Bug Tracking. We recommend that you follow additional best practices for bug
tracking addressed in the SDL Process Guidance at Microsoft MSDN.
Capability Area: Requirements and Design
Introduction
In Requirements and Design, new practices are first introduced into the lifecycle of specific products and
projects. As is generally well established in software engineering, the later in the product lifecycle a bug is
found, the more expensive it is to fix; this is perhaps even more true for security vulnerabilities. Insecure
designs, in particular, resemble other nonfunctional requirements, such as scalability. Without careful
attention, mistakes can easily propagate and become extraordinarily costly to fix. Assessing risk, analyzing
security and privacy early in the lifecycle, and identifying proper mitigations can drive the most dramatic
cost savings in a well-optimized SDL practice. Ongoing activity in Requirements and Design focuses on
capabilities and practices that continue to front-load security effort where it can be most effective.
Capability: Risk Assessment
Overview
Scarcity of resources is one of the challenges that organizations face when implementing the SDL. There is
a large amount of new development, the security expert team has limited staff, and there is limited
budget and time to introduce new, security-focused practices and processes. Risk Assessment is one of
the most important practices in moving from Basic to Standardized. It is also the key practice for
situational awareness, aiding in the effective allocation of available resources. And, it is one of the few
practices that should be universally applied at the Standardized level, as it will be a primary input into
selecting pilot projects for SDL implementation.
Note for small organizations or those with limited SDL-eligible products:
Some security managers may find that risk assessment is an intuitive exercise. This is
especially true for small organizations and for those with only a few products that face
serious threats to which they wish to apply the SDL. If the security expert team can
assess the relative risk of all of the projects, without resorting to more formal methods, the
questionnaire can simply reflect the intuitive framework that is already implicitly being used.
Phase 1: Assess
The Assess phase involves gathering the requirements for Risk Assessment. The overall goal should be to
categorize each new project as high, medium, or low risk. The risk score will represent the product of a
best-effort guess at two factors:
 What is the maximum possible impact of a security or privacy vulnerability in this project?
 How likely is a vulnerability to occur?
To assist in developing the risk questionnaire and scoring methodology, it may be helpful to gather the
following data:
 A history of past vulnerabilities and root-cause analyses, if available
 A catalog of the types of private information handled by the software and their relative degree of
sensitivity or their formal classifications
 A catalog of standard roles or privilege levels within the organization’s systems or levels at which
the organization’s products and systems may operate, for example:
 Anonymous user
 Authenticated, low-rights user
 Administrative user
 Customer service representative
 Service account
 User mode or Kernel mode
 Relevant categories indicating exposure to attack, for example:
 Exposed to anonymous users on the public Internet
 Exposed to authenticated users on the public Internet
 Exposed to authenticated users locally
 Exposed to anonymous users on the corporate intranet
 Exposed to authenticated users on the corporate intranet
 Exposed only to administrators and servers on an isolated network
 Other organization-specific security differentiators, including type of application, deployment
method, or extensibility model
At the Assess stage, an estimate should also be made of the appropriate size and degree of detail for the
risk questionnaire, based upon an understanding of the diversity of products and projects and on the
organizational tolerance for additional project management overhead.
For more information, see the SDL Process Guidance—Phase 2: Design Phase: Risk Analysis and Chapter 8
and Chapter 9 of The Security Development Lifecycle.
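As a concrete illustration of the scoring goal described above (the product of maximum impact and likelihood), a minimal categorization might look like the following. The 1-3 scales and the High/Medium/Low thresholds are assumptions for this sketch, not SDL-mandated values:

```python
def risk_category(impact: int, likelihood: int) -> str:
    """Categorize a new project as High, Medium, or Low risk.

    impact and likelihood are best-effort guesses on a 1-3 scale
    (1 = low, 3 = high); the thresholds below are illustrative only
    and should be tuned against historical vulnerability data.
    """
    if not (1 <= impact <= 3 and 1 <= likelihood <= 3):
        raise ValueError("scores must be between 1 and 3")
    score = impact * likelihood  # the product of the two factors
    if score >= 6:
        return "High"
    if score >= 3:
        return "Medium"
    return "Low"
```

Projects landing in the "High" bucket would be the natural candidates for the SDL pilot.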
Phase 2: Identify
The Identify phase involves targeting the appropriate documentation method for creating, auditing, and
enforcing the risk questionnaire. This might be a Microsoft Office Word document, an InfoPath® form, or
a simple Web application, or opportunities might exist to integrate directly into Application Lifecycle
Management (ALM), Enterprise Resource Planning (ERP), or other project management systems that drive
the rest of the product lifecycle.
Phase 3: Evaluate and Plan
During the Evaluate and Plan phase, the detailed questionnaire is designed, and relative weightings for
each question are assigned. The sample Risk Assessment Document, included on the CD-ROM supplement
to The Security Development Lifecycle, is a good starting place, although a much simpler questionnaire
may suffice for most organizations. The sample questionnaire is heavily oriented towards developing
Win32®-based applications and components of the Windows® operating system. For your organization,
emphasize issues that have produced bugs in the past or that can lead to high-severity vulnerabilities as
defined in your quality gates.
Some sample questions for a Web application might include:
 Does the application accept and display user-contributed data? (cross-site scripting [XSS] risk)
 Does the application share data with partners or third parties? (information disclosure, spoofing
risks)
 Does the application allow new user accounts to be created?
 Does the application allow for viewing, adding, or updating of Personally Identifiable Information
(PII)?
 Does the application implement new or custom ways for users to log on?
A sample privacy questionnaire is also available at Appendix C of the SDL Process Guidance.
Organizing the document into large sections with optional follow-up questions is encouraged. Completing
the questionnaire should be easy and should take no more than a few minutes for a team that isn’t taking
significant security risks, although teams doing more complicated and security-hazardous projects
may take somewhat longer.
It may be useful, when calculating final scores, for the security expert group to add fixed handicaps for
certain feature areas or for teams known to have a history of producing vulnerabilities or high-risk code, if
such historical data is available.
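A weighted questionnaire with fixed team handicaps might be scored along these lines. The question names, weights, and handicap value are hypothetical placeholders; substitute your organization's own questions and tune the weights against historical data:

```python
# Hypothetical weights for the sample Web-application questions above;
# real weights should reflect your own quality gates and bug history.
WEIGHTS = {
    "displays_user_contributed_data": 3,  # cross-site scripting (XSS) risk
    "shares_data_with_third_parties": 2,  # information disclosure, spoofing
    "creates_user_accounts": 2,
    "handles_pii": 3,
    "custom_logon": 3,
}

def questionnaire_score(answers: dict, team_handicap: int = 0) -> int:
    """Sum the weights of all 'yes' answers, plus an optional fixed
    handicap for feature areas or teams with a history of high-risk code."""
    score = sum(WEIGHTS[q] for q, yes in answers.items() if yes)
    return score + team_handicap

score = questionnaire_score(
    {
        "displays_user_contributed_data": True,
        "shares_data_with_third_parties": False,
        "creates_user_accounts": False,
        "handles_pii": True,
        "custom_logon": False,
    },
    team_handicap=1,
)
# 3 (user-contributed data) + 3 (PII) + 1 (handicap) = 7
```

The resulting totals feed the broad High/Medium/Low categorization, not a precise ranking.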
Warning
There is no guaranteed risk-assessment methodology that will produce completely
accurate results. Especially at first, these techniques will produce very inexact estimates.
Like any triage process, the intent is not to precisely order features but to help the
security expert group make broad categorizations and allocate limited resources. Once you have
gathered some data, adjust the weighting of the questions until the results correspond to your
common sense understanding of your systems and historical vulnerability data. Always remember
that SDL practices need constant, iterative refinement to adjust to a changing threat environment.
Phase 4: Deploy
The goal of the Deploy phase is to deliver the questionnaire and to get responses from 80 percent of new
projects during the project initiation or requirements lifecycle phase in order to select the pilot SDL
projects.
Note
At more advanced maturity levels, some of this risk assessment may be automated for
existing or legacy projects. If it isn’t possible to develop and require teams to complete
such a questionnaire, early and inexpensive automation may be a second-best
alternative. If source code files can be correlated to projects, for example, the rough number of
hits from a security code scanning tool, or even a “grep” (search tool) for internally sensitive API
usage, might serve as a private risk-assessment methodology, even if the false positive rate is too
high to push such findings down to the engineering teams.
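The grep-style automation described in the note could be sketched as a small script. The list of "sensitive" APIs here is purely illustrative; substitute the calls that have actually produced vulnerabilities in your own codebase:

```python
import re
from pathlib import Path

# Hypothetical sensitive-API list (classic unsafe C string/process calls);
# replace with APIs flagged by your own secure coding policies.
SENSITIVE_APIS = re.compile(r"\b(strcpy|strcat|sprintf|system|exec)\s*\(")

def rough_risk_hits(project_dir: str, extensions=(".c", ".cpp", ".h")) -> int:
    """Count raw occurrences of sensitive API usage across a project.

    A crude, grep-style signal intended only for the security team's
    private triage; the false positive rate is too high to report
    individual findings to engineering teams.
    """
    hits = 0
    for path in Path(project_dir).rglob("*"):
        if path.is_file() and path.suffix in extensions:
            text = path.read_text(errors="ignore")
            hits += len(SENSITIVE_APIS.findall(text))
    return hits
```

Projects with disproportionately high counts relative to their size could be flagged for a closer, human risk assessment.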
Checkpoint: Risk Assessment
Requirement
A risk questionnaire exists, is a mandatory part of the project initiation or
requirements lifecycle phase, and the SDL pilot projects have been identified
based upon this questionnaire.
If you have completed the step listed above, your organization has met the minimum requirements of the
Standardized level for Risk Assessment. We recommend that you follow additional best practices for risk
assessment addressed in the SDL Process Guidance at Microsoft MSDN.
Capability: Quality Gates
Overview
Following the Risk Assessment phase, it should be determined whether a feature or product qualifies for
the SDL according to the SDL policy. At the Standardized level, participation in the SDL is opportunistic
rather than mandated and applies only to the pilot projects selected for their risk ranking or ability to
absorb and perform the new required practices. For these projects, it is important to define the final
security and privacy release criteria so that accurate estimates of security practices can be built into the
project timeline. The quality gates clearly define what types of security and privacy issues are important to
the organization and which bugs must be fixed, which can be mitigated, and which can be left unfixed.
Phase 1: Assess
The Assess phase involves gathering requirements to build the bug ranking guideline. Review the sample
security and privacy quality gates (or bug bars) from the SDL Process Guidance to gain an understanding
of the form and content of the document and to assess how it can be incorporated into or supplement
existing bug categorization and ranking tools in your organization.
 SDL Process Guidance—Appendix M: SDL Privacy Bug Bar (Sample)
 SDL Process Guidance—Appendix N: SDL Security Bug Bar (Sample)
Phase 2: Identify
In the Identify phase, the security expert team should gather the detailed requirements for defining
meaningful risk and bug ranking classifications and categories. This will include input from:
 Executives and senior management.
 Those responsible for creating high-level requirements for projects and products.
 Relevant regulatory or contractual requirements (for example, Health Insurance Portability and
Accountability Act [HIPAA] or Payment Card Industry [PCI] compliance).
 Sample data from past incidents and security vulnerabilities.
For more information, see the Microsoft Privacy Guidelines for Developing Software Products and
Services and the Microsoft Security Response Center Security Bulletin Severity Rating System.
Phase 3: Evaluate and Plan
During the Evaluate and Plan phase, the quality gates are created. These can take the form of a single
document covering both privacy and security issues, or they may be distinct documents. Distinct
documents are recommended if, for example, major product areas in the organization are different
enough to merit distinct security quality gates, but privacy requirements are the same across the
organization.
Phase 4: Deploy
In the Deploy phase, these quality gates are rolled out to the SDL pilot projects as the standard for
categorizing and prioritizing security and privacy vulnerabilities.
Checkpoint: Quality Gates
Requirement
A definition of security and privacy bug classifications exists as release criteria for
the SDL pilot projects.
If you have completed the step listed above, your organization has met the minimum requirements of the
Standardized level for Quality Gates. We recommend that you follow additional best practices for quality
gates addressed in the SDL Process Guidance at Microsoft MSDN.
Capability: Threat Modeling
Overview
Threat Modeling is a formalized process for cataloging the security requirements and design of a system,
evaluating its attack surface, and identifying mitigations and unmitigated threats. It is recommended that
threat models be created for system components that have large attack surfaces, handle sensitive data,
are relied upon by other components to enforce security goals, or have complex interactions with
untrusted systems. Teams will almost always need expert assistance and moderation in the creation of
their first threat model. Threat Modeling serves two purposes:
1. The Threat Modeling process can be an effective way of identifying design vulnerabilities, and fixes
for those vulnerabilities, outside of the narrow categories of functional security requirements
addressed by previous practices in the Requirements and Design phases.
2. The threat model document is a very useful artifact that can be used to guide creation of test
cases, to focus penetration testing and code review, and to communicate the security
specifications of the system to dependent feature teams and future engineers, reviewers, and
auditors.
Phase 1: Assess
Assess the readiness and availability of resources on the security expert team to perform Threat Modeling
for the selected SDL pilot projects.
Phase 2: Identify
For the SDL pilot projects, identify the features to be reviewed using the Risk Assessment Capability. Many
security bugs happen at the interface between components owned by different teams, where the security
requirements and guarantees made by each side of the contract are not clearly defined. Threat Modeling
core system components with many dependencies helps expose these kinds of vulnerabilities and provides
valuable documentation for all of the clients of these key systems.
Phase 3: Evaluate and Plan
Arrange time to meet with the appropriate project managers and development leads for the selected
projects. In planning for Threat Modeling, any available requirements and design documents should be
reviewed by the experts in advance of conducting the Threat Modeling session with the team.
Phase 4: Deploy
A representative from the security expert team conducts the Threat Modeling session with the team. For
more information on building threat models, see:
 Microsoft SDL Threat Modeling Tool
 Reinvigorate your Threat Modeling Process
 Uncover Security Design Flaws Using the STRIDE Approach
 SDL Process Guidance—Phase 2: Design Phase: Establish and Follow Best Practices for Design
 Threat Modeling (ISBN: 9780735619913), by Frank Swiderski and Window Snyder (Microsoft Press, 2004)
 Posts on Threat Modeling from the SDL Blog
 The essentials of Web application threat modeling
 Fending Off Future Attacks by Reducing Attack Surface, an MSDN article on the process for determining the attack surface
 Measuring Relative Attack Surfaces, an in-depth research paper
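To make the STRIDE approach referenced in these resources concrete, the skeleton below sketches one way identified threats might be recorded during a modeling session. The struct fields, element name, and helper function are illustrative assumptions of ours, not part of any SDL tool or template.

```c
#include <stddef.h>

/* The six STRIDE threat categories walked during threat modeling. */
typedef enum {
    SPOOFING,
    TAMPERING,
    REPUDIATION,
    INFORMATION_DISCLOSURE,
    DENIAL_OF_SERVICE,
    ELEVATION_OF_PRIVILEGE
} stride_category;

/* Hypothetical record for one threat found while walking the data-flow
   diagram: the element it applies to, its STRIDE category, and any
   mitigation agreed on in the session. */
typedef struct {
    const char     *element;     /* e.g., "login service" (example name) */
    stride_category category;
    const char     *mitigation;  /* NULL or "" while still unmitigated */
} threat_entry;

/* An unmitigated threat must become either a design change or a tracked bug. */
int threat_is_open(const threat_entry *t)
{
    return t->mitigation == NULL || t->mitigation[0] == '\0';
}
```

Even this minimal structure captures the essential output of a session: a reviewable list of threats, each either mitigated or explicitly open.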
Checkpoint: Threat Modeling
Requirement
The security expert team is capable of identifying projects that would benefit
from Threat Modeling and of providing direct assistance to product teams in
creating threat models.
If you have completed the step listed above, your organization has met the minimum requirements of the
Standardized level for Threat Modeling.
Capability Area: Implementation
Introduction
Implementation focuses on security measures to eliminate and reduce the impact of vulnerabilities in the
construction of software. Ongoing Implementation activity focuses on improving the use and
sophistication of security tools and policies when building software.
Capability: Secure Coding Policies
Overview
Nearly every organization will have a set of guidelines to aid in the consistency, quality, and
maintainability of code. For the SDL, not only should such policies provide guidance about secure
development practices but, as much as possible, they should also be backed by tests or audits to ensure
that they are actually being followed. At the Standardized level, guidelines and policies are introduced.
Mandatory enforcement of more sophisticated practices may come at later stages, but some of the
easy-to-implement practices, like strengthening of compiler defenses, should have few obstacles to
immediate enforcement.
Phase 1: Assess
In the Assess phase, evaluate the set of languages, compilers, and tools that are in scope for secure coding
policies. Do not forget to include languages, like JavaScript, which are embedded in other artifacts or are
executed outside of the standard system context, but which may still present risks to customers and end
users.
Phase 2: Identify
In the Identify phase, note the relevant practices, tools, and checklists for the assessed areas of coverage.
The Standardized level requires:
 Using the latest compiler and linker because important defenses are added by the tools.
 If using Visual C++®, use Visual Studio® 2005 SP1 or later.
 Compiling with appropriate compiler flags.
 Compiling clean at the highest possible warning level.
 Compiling with –GS to detect stack-based buffer overruns.
 Linking with appropriate linker flags: /NXCOMPAT to get NX defenses, /DynamicBase to get ASLR, and /SafeSEH to get exception handler protections.
 For GCC, using at least GCC 4.1 with –fstack-protector or –fstack-protector-all.
Implementing the use of Banned.h, or a subset of it, to prevent use of dangerous APIs in C and C++ code is a
low-cost step for new code. Alternatively, a simple, free code scanner, such as RATS, can be utilized as part
of the development code review process to manually identify dangerous API usage. For .NET code, use
FxCop.
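The intent behind banning an API can be illustrated with a single function. Banned.h steers Windows code toward strcpy_s or StringCchCopy; the sketch below uses the portable snprintf instead so that it compiles anywhere, and the wrapper's name and return codes are our own invention, not from any SDL sample.

```c
#include <stdio.h>
#include <string.h>

/* strcpy(dst, src) is on the banned list because it performs no bounds
   checking and silently overflows dst. A bounded replacement must take
   the destination size and report truncation. */
int copy_string_checked(char *dst, size_t dst_size, const char *src)
{
    if (dst == NULL || src == NULL || dst_size == 0)
        return -1;                      /* invalid arguments */
    int n = snprintf(dst, dst_size, "%s", src);
    if (n < 0 || (size_t)n >= dst_size)
        return -2;                      /* src did not fit; dst truncated */
    return 0;                           /* copied and NUL-terminated */
}
```

A scanner such as RATS flags the banned call sites during review; a bounded wrapper like this (or the strsafe.h functions on Windows) is the usual remediation.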
For more details, see the following resources:
 Protecting Your Code with Visual C++ Defenses
 GCC extension for protecting applications from stack-smashing attacks
 Buffer overflow protection
 Banned.h, included on the CD-ROM supplement to The Security Development Lifecycle, Howard and Lipner, 2006
 FxCop
 Secure Coding Guidelines for the Java Programming Language, version 2.0
 Writing Secure Code, Second Edition (ISBN: 9780735617223), by Michael Howard and David LeBlanc, Appendix D: “A Developer’s Security Checklist,” pp. 731-735 (Microsoft Press, 2002)
Phase 3: Evaluate and Plan
In the Evaluate and Plan phase, decide which of the practices are most appropriate and how many can be
productively applied and accepted by the development organization. In order to minimize the perceived
burden of additional processes, it may be useful to attempt to integrate tool usage into existing practices
in the development process, such as adding a security code scanner to existing code review tools.
Phase 4: Deploy
Guidelines and policies are rolled out to the development organization and should be included as part of
the Training curriculum. It may also be helpful, though not required at the Standardized level, to
encourage or require developers to read books that provide a comprehensive view of secure
development best practices, such as Writing Secure Code, Second Edition.
Checkpoint: Secure Coding Policies
Requirement
The SDL pilot teams and high-risk projects in native code utilize appropriate
compiler defenses.
Banned.h is used to exclude banned APIs from new code or security code scanners
are used in conjunction with code review to identify dangerous API usage.
If you have completed the steps listed above, your organization has met the minimum requirements of
the Standardized level for Secure Coding Policies. We recommend that you follow additional best
practices for secure coding policies addressed in the SDL Process Guidance at Microsoft MSDN.
Capability: Cross-Site Scripting (XSS) and SQL Injection Defenses
Overview
Cross-Site Scripting (XSS) and SQL Injection are two of the most common and severe classes of Web
application vulnerabilities. A variety of standard libraries and tools exist to identify and eliminate these
vulnerability classes, and they should be applied, as relevant, to Web applications.
Phase 1: Assess
The Assess phase begins by determining the major platforms, languages, and frameworks to target. Many
of the tools available will also vary by platform and development environment, so it is important to assess
both what is in use and how much variation exists across each development group’s tool chains.
Phase 2: Identify
In the Identify phase, the set of tools available and appropriate for the target bug classes and
development platforms are investigated, and a few are selected for piloting or competitive analysis.
A variety of tools suitable for both native and managed code development on the Windows platform and
with Visual Studio are available at the Microsoft SDL Tools Repository.
The following is a list of useful free tools provided by Microsoft and other vendors. This list is not intended
to be comprehensive, as there are numerous commercial code analysis tools in the market.
In the source code analysis space, Microsoft provides the following tools:
 Microsoft Source Code Analyzer for SQL Injection.
 XSS Detect [Beta] is a static-code analysis tool that helps identify the cross-site scripting security
vulnerabilities found within Web applications.
Passive Web application proxy scanners can be useful in identifying areas where XSS and other
vulnerabilities may be present. Two free and open source tools in this space are:
 RATS—Rough Auditing Tool for Security
 ProxMon from iSEC Partners
At the code and framework level, several libraries and filters exist that can be integrated directly into the
application:
 Microsoft Anti-Cross Site Scripting Library V1.5 for .NET applications
 AntiXSS for Java from Gotham Digital Science
 OWASP Reform Library
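The libraries above all build on the same core idea as this deliberately minimal sketch: encode untrusted data for the context into which it is emitted. This toy encoder handles only HTML element content; real libraries such as the Anti-Cross Site Scripting Library also cover attribute, URL, and JavaScript contexts, so treat it as an illustration rather than a replacement.

```c
#include <string.h>

/* Encode untrusted text for insertion into HTML element content by
   escaping the characters that can change the parsing context.
   Returns the number of bytes written (excluding the terminating NUL),
   or -1 if dst is too small. */
int html_encode(const char *src, char *dst, size_t dst_size)
{
    size_t out = 0;
    if (dst_size == 0)
        return -1;
    for (; *src != '\0'; src++) {
        char single[2] = { *src, '\0' };
        const char *rep;
        switch (*src) {
        case '&':  rep = "&amp;";  break;
        case '<':  rep = "&lt;";   break;
        case '>':  rep = "&gt;";   break;
        case '"':  rep = "&quot;"; break;
        case '\'': rep = "&#39;";  break;
        default:   rep = single;   break;
        }
        size_t len = strlen(rep);
        if (out + len + 1 > dst_size)
            return -1;                 /* not enough room, incl. NUL */
        memcpy(dst + out, rep, len);
        out += len;
    }
    dst[out] = '\0';
    return (int)out;
}
```

With encoding applied at output and parameterized queries applied at the data layer, injected strings such as `<script>` remain inert data instead of executable markup or SQL.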
Phase 3: Evaluate and Plan
During the Evaluate and Plan phase, the security expert team tests the tools against the environment.
Tools should be evaluated for ease of integration into the environment, for whether they would have
prevented known vulnerabilities, and for their rate of false positives.
Phase 4: Deploy
The Deploy phase involves handing the tools off to selected members or teams in the development
organization and collecting feedback as to the effectiveness of the tools.
Checkpoint: XSS and SQL Injection Defenses
Requirement
The security expert team is working with the SDL pilot teams to implement XSS
and SQL Injection defenses for Web applications, specifically input validation,
output encoding, and the use of parameterized queries.
If you have completed the step listed above, your organization has met the minimum requirements of the
Standardized level for XSS and SQL Injection Defenses. We recommend that you follow additional best
practices for XSS and SQL Injection Defenses addressed in the SDL Process Guidance at Microsoft MSDN.
Capability Area: Verification
Introduction
Verification is focused on security practices to discover weaknesses and to verify security once software
construction is functionally complete. This phase is important because it helps to discover and eliminate
vulnerabilities that might have been introduced into the code at an earlier stage. Ongoing Verification
activity focuses on deeper and more sophisticated use of the tools and a more comprehensive code
review.
Capability: Dynamic Analysis and Application Scanning (Web
Applications)
Overview
Web applications have become a target of choice for attackers, and they tend to suffer from a variety
of common vulnerabilities. WhiteHat Security noted in a 2008 report that 70 percent of the Web sites it
examined suffered from XSS vulnerabilities. Attackers have effective, automated tools to scan for such
vulnerabilities, so defenders must utilize similar tools to find and eliminate these kinds of easily
discoverable weaknesses at the verification stage.
Phase 1: Assess
Most available scanners will work generically on all Web applications, regardless of the underlying
technology platform. However, there may be some edge cases or framework-specific attacks that one tool
or another can cover more deeply, and some scanners may offer better compatibility and coverage for
heavily AJAX-enabled applications. These basic characteristics of the applications to be scanned should be
assessed to help with tool selection. Also assess which classes of vulnerabilities you expect the tools to
assist in discovering.
Phase 2: Identify
Identify tools to evaluate. Tools in this area vary from fully automated scanners to managed services, to
semi-manual browser toolbars that can integrate into the normal functional testing process. Most
offerings in this area are commercial products, but free or open source tools to investigate include:
 Scrawlr, a vulnerability scanner co-developed by Hewlett-Packard and Microsoft for identifying SQL
Injection vulnerabilities in Web sites.
 Grendel-Scan by David Byrne.
Multiple tools may be beneficial to increase coverage, and certain tools may only offer specialized testing
for one class of vulnerability and should be used in conjunction with a more general scanner. Some tools,
such as Nikto, can be used to scan for Web server configuration vulnerabilities, but they do not target the
discovery of vulnerabilities in custom applications. These should always be used in conjunction with an
application-specific scanner.
Phase 3: Evaluate and Plan
In the Evaluate and Plan phase, decide which of the identified tools are most appropriate and how many
can be productively applied and accepted by the development organization. It will be helpful if previous
versions of the application with known vulnerabilities can be tested, or if vulnerabilities can be
deliberately introduced, to evaluate the relative effectiveness of each tool. At the Standardized level, one
of the major challenges of this kind of tool is a high false-positive rate. Consider the ability of the test and
development organizations to consume and act on the output of the tool, and avoid tools that require
large amounts of effort to comb through mountains of unimportant or illusory bugs.
Phase 4: Deploy
During the Deploy phase, roll out the tool to the test organization, or have the security expert group apply
it to selected pilot projects. Gather data on the tool’s effectiveness, and tune it with the goal of making it
an acceptable part of mandatory practices.
Checkpoint: Dynamic Analysis and Application Scanning (Web Applications)
Requirement
The security expert team is working with the SDL pilot teams to evaluate and
deploy dynamic scanning tools for Web applications.
If you have completed the step listed above, your organization has met the minimum requirements of the
Standardized level for Dynamic Analysis. We recommend that you follow additional best practices for
dynamic analysis addressed in the SDL Process Guidance at Microsoft MSDN.
Capability: Fuzzing
Overview
Complex parsers for file formats and custom network protocols are a frequent cause of high-severity
vulnerabilities, due to buffer overrun, integer overflow, and related issues possible in languages like C and
C++. Identifying these issues with traditional testing and code review is a time-consuming and error-prone
process. Fuzz testing, the automated creation and execution of many test cases created by targeted and
random inputs, has proven itself as a cost-effective method for identifying security vulnerabilities. Widely
employed by attackers, fuzzing should be proactively utilized by defenders in verifying their software.
For more information, see the following resources:
 Fuzz testing
 Fuzz Testing at Microsoft and the Triage Process
 Fuzzing: Brute Force Vulnerability Discovery (ISBN: 0321446119), by Michael Sutton, Adam Greene,
and Pedram Amini (Addison Wesley Professional, 2007)
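At its core, mutation fuzzing is only a few lines of code; the tools listed in the Identify phase add input generation grammars, execution harnesses, and crash triage on top. The sketch below, a toy of our own rather than any particular tool's algorithm, flips randomly chosen bits of a known-good sample to produce malformed test cases.

```c
#include <stdlib.h>

/* Produce a malformed test case by flipping `flips` randomly chosen bits
   of a valid sample input. Seeding makes a run reproducible, which is
   essential for triaging any crash a mutated input provokes. */
void mutate_buffer(unsigned char *buf, size_t len, unsigned seed, size_t flips)
{
    if (len == 0)
        return;
    srand(seed);
    for (size_t i = 0; i < flips; i++)
        buf[(size_t)rand() % len] ^= (unsigned char)(1u << (rand() % 8));
}
```

Each mutated buffer is then fed to the parser under test, for example written to a temporary file and opened, while the harness watches for crashes or assertion failures.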
Phase 1: Assess
The Assess phase involves identifying what kinds of parsers will be required to be fuzzed. At the
Standardized level, fuzzing is required for all new file parsers written in C or C++ that accept data across a
trust boundary. Depending on the history of vulnerabilities, it may be desirable to extend this mandate to
cover all such parsers in legacy code, in addition to network protocol parsers exposed to unauthenticated
data.
Phase 2: Identify
In the Identify phase, the security expert team should identify candidate fuzzers. A wide variety of free
and commercial tools are available to satisfy many requirements and styles:
 Fuzzing Software, a list of (mostly free) tools from the book Fuzzing: Brute Force Vulnerability
Discovery by Sutton, Greene, and Amini, including FileFuzz.
 Peach 2 is a free, easy-to-use, extensible fuzzing platform. Peach is capable of fuzzing just about
anything you can imagine, including network-based services, RPC, COM/DCOM, SQL Stored
Procedures, and file formats.
 File Fuzzers, Fuzzbox, Windows IPC Fuzzing Tools, and Forensic Fuzzing Tools are free fuzz testing
libraries from iSEC Partners.
 Defensics, commercial blackbox, negative testing tools for developers from Codenomicon.
Phase 3: Evaluate and Plan
During the Evaluate and Plan phase, the security expert team evaluates a few candidate fuzzing
frameworks and selects one to use.
Phase 4: Deploy
In the Deploy phase, the selected tool is deployed against eligible parsers. At the Standardized level, the
security expert team may do this, or it may train the relevant teams in the testing organization to do so.
Bugs identified through fuzzing, and the remediation of those bugs, should be tracked and verified.
Checkpoint: Fuzzing
Requirement
Custom file format parsers implemented in native C or C++ code have been fuzzed
for the SDL pilot projects.
If you have completed the step listed above, your organization has met the minimum requirements of the
Standardized level for Fuzzing. We recommend that you follow additional best practices for fuzzing
addressed in the SDL Process Guidance at Microsoft MSDN.
Capability: Penetration Testing
Overview
An outside perspective can help clarify the current state of security maturity and the types of
vulnerabilities that other processes are not preventing. Pilot products with the greatest risk exposure for
the organization should undergo penetration testing by outside experts.
For more information, refer to the following resources:
 Penetration Testing in MSDN Magazine
 Building Security In: Software Penetration Testing in IEEE Security & Privacy
 The Art of Software Security Assessment (ISBN 0-321-44442-6), by Mark Dowd, John McDonald, and
Justin Schuh. See Chapter 4: “Application Review Process,” especially pp. 111-164, “Code-Auditing
Strategies,” “Code-Auditing Techniques,” “Code Auditor’s Toolbox,” and “Case Study: OpenSSH”
(Pearson, 2006)
Phase 1: Assess
The Assess phase involves identifying which software modules to focus on for penetration testing efforts,
developing a budget for outside penetration testing, and determining the necessary skills and criteria for
potential vendors.
Phase 2: Identify
In the Identify phase, targets are selected for penetration tests, and vendors are selected to respond to
the Request for Proposal (RFP). Members of the Microsoft SDL Pro Network program can be used as a
starting point in identifying appropriate vendors.
Phase 3: Evaluate and Plan
During the Evaluate and Plan phase, an RFP describing the testing to be performed is issued, vendor
responses are reviewed, and a vendor selection is made. Plan on having a dedicated environment for
testing that resembles production operating conditions as closely as possible.
Phase 4: Deploy
Execute the penetration test, and perform recommended remediation.
Checkpoint: Penetration Testing
Requirement
Penetration testing by third parties, as appropriate, is completed.
If you have completed the step listed above, your organization has met the minimum requirements of the
Standardized level for Penetration Testing. We recommend that you follow additional best practices for
penetration testing addressed in the SDL Process Guidance at Microsoft MSDN.
Capability Area: Release and Response
Introduction
This capability area focuses on security practices performed for final security assurance before release. It
also helps you to prepare for and execute responses in the event that security vulnerabilities are
discovered in production software. The Final Security Review (FSR) is conducted to verify that all of the
relevant SDL requirements have been satisfied, facilitating more effective governance of security
assurance. After release of the software, response activities are critical to minimize risk to customers and
to remediate vulnerabilities in a less costly and more orderly manner.
Capability: Final Security Review
Overview
The Final Security Review (FSR) is the last chance to determine if a product is ready for release to
customers. The FSR is performed by the central security expert group. It is not primarily an activity
designed to discover bugs; rather, it is a governance activity to identify whether discovered bugs have
been properly managed and whether security procedures have been followed.
For more information, refer to the following resources:
 SDL Process Guidance—Phase 5: Release Phase: Final Security Review and Privacy Review
 The Security Development Lifecycle, by Howard and Lipner, Chapter 14: “Stage 9—The Final
Security Review” and Chapter 16: “Stage 11—Product Release”
Phase 1: Assess
In the Assess phase, determine what resources are available to perform the FSR. Ask yourself: How much
time will the central security expert team have available, and how many projects can reasonably be
covered? How much time in the product development lifecycle can be devoted to an FSR? For teams that
have been diligently following the SDL practices, the FSR should take no more than a day; however,
expect that some teams will have left some practices incomplete or bugs unfixed, and they will have to
spend time resolving those.
Also, verify that you have determined which quality gates must be enforced for software to be released to
customers.
Phase 2: Identify
Next, identify which features or projects are eligible for an FSR. It is important to do this as early as
possible so that appropriate time can be built into the release cycle. Occasionally, a project may need to
be delayed to complete unfinished security work, but this should definitely not be the norm in the SDL.
After the resource budget has been assessed and the design requirement reviews have been completed,
it should be possible to pick FSR candidates. Teams that had high-risk scores or a history of security
vulnerabilities but that have not reported any security bugs during the rest of the development lifecycle
are also prime candidates for an FSR.
Phase 3: Evaluate and Plan
During the Evaluate and Plan phase, teams are notified of their eligibility, and the FSR is scheduled.
Phase 4: Deploy
The FSR is conducted prior to release of the software, likely concurrent with functional regression testing.
The central security expert team meets with the development team and assesses their execution of the
required SDL practices. At the Standardized level, it is required that quality gates for bugs released to
production be enforced and that exceptions to these gates be formally reviewed and approved. Bugs
prioritized such that they do not fall under mandates should also be reviewed to ensure that they have
been properly rated according to the quality gates. The FSR should also assess how well teams have
complied with the other SDL mandates, such as fuzzing, secure coding policies, and other current security
practices.
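The gate enforcement described above boils down to a check like the following. The thresholds, field names, and exception rule are placeholders we invented for illustration; the real criteria come from your own quality gates.

```c
/* Open security bugs at FSR time, bucketed by bug-bar severity. */
typedef struct {
    int critical;
    int important;
    int moderate;
    int low;
} open_bug_counts;

/* Hypothetical release gate: no Critical or Important security bug may
   ship unless it is covered by a formally reviewed and approved
   exception. Moderate and Low bugs are reviewed but do not block. */
int fsr_gate_passes(const open_bug_counts *bugs, int approved_exceptions)
{
    int blocking = bugs->critical + bugs->important;
    return blocking <= approved_exceptions;
}
```

Expressing the gate this plainly makes exceptions auditable: every release either satisfies the rule or carries an explicit, approved waiver.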
Checkpoint: Final Security Review
Requirement
The security expert team can use risk analysis and results from earlier SDL
practices to identify candidate projects for Final Security Review.
Quality gates exist for the number and severity of bugs released to production.
The security expert team verifies that products adhere to internal policies and
meet relevant external regulatory requirements for privacy and security.
The security expert team reviews and approves all of the requested exceptions to
security policies and quality gates.
If you have completed the steps listed above, your organization has met the minimum requirements of
the Standardized level for the Final Security Review. We recommend that you follow additional best
practices for the Final Security Review addressed in the SDL Process Guidance at Microsoft MSDN.
Capability: Project Archiving
Overview
To facilitate debugging of security vulnerabilities reported to you, you are strongly advised to upload
debugging symbols to a central, internal site that can be easily accessed by developers at a later stage.
Debuggers use symbols to turn addresses and numbers into human-readable function names and variable
names. This debug symbol requirement applies to all publicly released binaries.
Phase 1: Assess
The Assess phase involves identifying platforms and technologies where symbol archiving is relevant.
Phase 2: Identify
In the Identify phase, the specific projects on these platforms that ship public binaries are selected.
Phase 3: Evaluate and Plan
During the Evaluate and Plan phase, a central location (symbol server or set of archive directories) is
identified and created to store debug symbols for all versions of publicly shipped binaries.
Phase 4: Deploy
Symbols are archived at every release and used by production support teams to assist in researching
security issues.
Checkpoint: Project Archiving
Requirement
Debug symbols are archived in a central location for all publicly shipped binaries.
If you have completed the step listed above, your organization has met the minimum requirements of the
Standardized level for Project Archiving. We recommend that you follow additional best practices for
project archiving addressed in the SDL Process Guidance at Microsoft MSDN.
Capability: Response Planning and Execution
Overview
Even a perfectly executed SDL cannot prevent all vulnerabilities. Some may be missed, and vulnerabilities
may emerge as new classes of attack are discovered. For an organization just beginning to implement an
SDL process, an orderly response plan is a key competency, not only for externally reported bugs, but also
for those discovered internally in existing products and systems through increased security awareness and
review.
For more information, refer to the following resources:
 SDL Process Guidance—Phase 5: Release Phase: Response Planning
 SDL Process Guidance—Post-SDL Requirement: Response
 The Security Development Lifecycle, by Howard and Lipner, Chapter 15: “Stage 10—Security
Response Planning” and Chapter 17: “Stage 12—Security Response Execution”
 Responding to Security Incidents (The Microsoft Security Response Center [MSRC])
 FIRST (Forum of Incident Response and Security Teams)
 IT-ISAC (Information Technology: Information Sharing and Analysis Center)
Phase 1: Assess
The Assess phase involves identifying what the response process must encompass. The general
characteristics of your application will also shape the response plan. Rolling out security updates will
typically be much simpler for Web applications or software-as-a-service (SaaS) offerings than it will be for
“shrink-wrapped” products. Consider the following questions when setting goals for incident response
and determining whom to involve and how:
 Do mechanisms exist to automatically notify customers of updates or to automatically apply
patches?
 What is the population of vulnerable users for each product, and how long will it take them to
upgrade?
 Do you ship redistributable components (such as libraries or COM objects) that may be a part of
other products?
 What third-party code or binary objects are included in your products that may need servicing?
 Do you ship ActiveX® controls that are not site-locked?
 Will there be potential compatibility impacts for partner or third-party products?
 Will there be compatibility impacts for older versions of your own products?
 What obligations do you have to provide security updates for older versions of your products?
 Are there special characteristics of your products (for example, certified configurations) that may
prevent customers from being able to patch or upgrade?
Phase 2: Identify
In the Identify phase, components that may require security servicing are cataloged. Ask yourself: What
are the target response times for each, and how will they be serviced? What third-party components may
need updating? What kinds of bugs require a special release, and which can wait until the next regular
release cycle?
Phase 3: Evaluate and Plan
During the Evaluate and Plan phase, a security incident response plan is created. For the Standardized
phase, there are two major parts to this planning. The first is to identify the owners of modules which may
require servicing. For legacy code, this division may be fairly broad. For new projects, part of the release
process should be to record an incident response owner for every component and the means to identify
that component from customer reports, by URL path, binary module, or otherwise, as appropriate for
your products. Contact information that is available around the clock, 365 days a year, should be recorded
for the development and test staff and for the relevant operations team. An emergency response will
require all three: operational personnel to deploy emergency remediation, notify affected customers, or
shut down affected functionality; the development team to do root-cause analysis and develop a fix; and
the test team to verify the fix.
Second, the security expert team or operations should be prepared to monitor and triage mail sent to
public disclosure addresses. You may want to consider publishing a responsible disclosure policy,
especially for online services. See the Microsoft Online Services Security Researcher Acknowledgements
FAQ for an example policy.
Phase 4: Deploy
To deploy the incident response plan, a public contact point for security issues is publicized to collect
notices from the public or security research community. Posting a notice on your company’s Web site is a
good start. Common e-mail address choices for receiving responsible disclosures of vulnerabilities (when a
specific person cannot be otherwise identified) include secure@yourcompanyname.com and
security@yourcompanyname.com. Even if you choose to publicize a different contact address, mail to
these addresses should be monitored.
Checkpoint: Response Planning and Execution
Requirement
New code and projects have recorded contacts for incident response, and a
security response first responder contact point is made available to clients and
the general public.
If you have completed the step listed above, your organization has met the minimum requirements of the
Standardized level for Response Planning and Execution. We recommend that you follow additional best
practices for response planning and execution addressed in the SDL Process Guidance at Microsoft MSDN.