[Project Name]
Testing Strategy

Document Overview
Prepared By:   <Author's Name>
Prepared For:  <Department Name>
Date Created:  <Month Day, Year>
Last Updated:  <Month Day, Year>

RASCI Alignment
(R)esponsible: <names>
(A)uthority:   <names>
(S)upport:     <names>
(C)onsult:     <names>
(I)nform:      <names>
Revision Log

Revision   Date         Initials    Description of Revision
1.0        <MM/DD/YY>   <initial>   Initial Draft
1.1        <MM/DD/YY>   <initial>   <description of revision>
This document is a copy, effective only on the downloaded date. The official, updated template always resides on the PM101 site.
Table of Contents

1) OVERVIEW
   1.1) TEST OBJECTIVES AND SCOPE
   1.2) ASSUMPTIONS
2) TESTING TIMELINE
3) TEST PHASES
   3.1) UNIT TESTING
4) TEST MANAGEMENT
   4.1) ROLES AND RESPONSIBILITIES
      4.1.1) Test Lead
      4.1.2) Testers
      4.1.3) Functional Team
      4.1.4) Development Team/Technical Team
      4.1.5) SMEs and Super Users
   4.2) TEST REPORTS
5) ENTRY AND EXIT CRITERIA
6) TEST ENVIRONMENT AND TOOLS
   6.1) TOOLS
7) DEFECT MANAGEMENT PROCESS
   7.1) DEFECT TRACKING WORKFLOW
   7.2) SEVERITY/PRIORITY
   7.3) REPORTING DEFECTS
   7.4) REVIEWING AND ASSIGNING DEFECTS
   7.5) FIXING DEFECTS
   7.6) RETESTING
   7.7) ANALYZING DEFECT DATA
8) DOCUMENT SIGN-OFF
1) Overview
The Testing Strategy defines and communicates the approach to testing in support of <Project
Name>. The Testing Team will ensure that the approach is implemented and adhered to by
the entire project team.
1.1) Test Objectives and Scope
Test Objectives:
List what you are testing in this section.
Example: To migrate the Microsoft NT Server 4.0 operating system environment to the Windows Server 2003 environment, component by component, keeping the access control list and Exchange permissions intact.

The scope of Testing:
• What are the functionalities or features to be tested? List detailed items below.
• Example: client computer hardware with a supported configuration or a standard configuration
• Forms, reports, interfaces, conversions, and enhancements are some other items to consider.
• Security and workflow are other items to consider.

Out of scope for Testing:
• What are the functionalities or features NOT to be tested? List detailed items below.
• Example: client computer hardware that does not meet the standard configuration.
1.2) Assumptions
The Testing Strategy presented in this document is based on the following assumptions:
• List all the assumptions or conditions of testing. Below are examples you can use or consider.
• <Vendor> will participate in test planning, script development, test execution, and test reporting to help facilitate knowledge transfer.
• <Vendor> resources for Integration Testing and User Acceptance Testing will be available full time.
• Configuration will be frozen during Integration Testing and User Acceptance Testing.
• The legacy system environments will be made available in time for Integration Testing.
• <Vendor> will provide selected production data for testing, as identified jointly by <vendor> and the University of Chicago.
• <Vendor> will provide licenses for any testing tools recommended and subsequently used.
• All deliverables within each testing phase will have an approval and sign-off process.
• The required number of stable testing environments and testing tools will be available as needed throughout the testing phases.
• Appropriate business and IT resources will be assigned as required for all phases of testing.
• Proper actions will be taken to resolve or offer alternate solutions to defects/exceptions in a timely manner.
• Funding is available to purchase additional test software licenses, if necessary.
• The creation and execution of all test cases will be performed by the functional and development teams.
• All business process documentation will be created and made available for the preparation and execution of Integration Testing.
• Standards and training in the use of tools and testing methodologies will be made available to all functional teams.
• A standard transport/migration and build strategy will be followed.
• Testing will be conducted by resources knowledgeable in <application> transactions.
• The testing resources will be trained in the use of test tools.
2) Testing Timeline
What Testing Phases are you going to implement, and what are their timelines? List that information in this section.
Example: We'll execute Unit and Integration Testing for Project A.
Unit Testing will start on 1/1/2011 and end on 2/28/2011.
Integration Testing will start on 3/15/2011 and end on 4/30/2011.
Performance Testing will be eliminated because the server is hosted by the vendor, who monitors and guarantees its performance under the contract.
3) Test Phases
3.1) Unit Testing
Use this section to elaborate on each testing phase you'll run if there is additional information you want to share with the project stakeholders. Also, if there are certain conditions that need to be present or met before the test can begin, list them here as well.
Add an additional header for each testing phase you'll run for your project.
For an overview of each testing phase, take a look at PM101-Testing Overview.
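To make the unit-level expectations concrete, here is a minimal sketch of what an automated unit test script can look like, using Python's built-in unittest module. The function normalize_acl is a hypothetical placeholder, not part of this template or any specific project; substitute the actual units your developers deliver.

    import unittest

    def normalize_acl(entries):
        # Hypothetical unit under test: de-duplicates and sorts ACL entries.
        return sorted(set(entries))

    class NormalizeAclTest(unittest.TestCase):
        def test_removes_duplicates(self):
            self.assertEqual(normalize_acl(["bob", "alice", "bob"]), ["alice", "bob"])

        def test_handles_empty_input(self):
            self.assertEqual(normalize_acl([]), [])

    if __name__ == "__main__":
        unittest.main()

Each test asserts one expected result, which makes a failure easy to map back to a test script step and a defect report.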
4) Test Management
This section outlines the roles and responsibilities for the management and execution of testing, as well as the reports that will be produced to help manage the testing process, if applicable.
4.1) Roles and Responsibilities
4.1.1) Test Lead
The Test Lead, <Name of Test Lead>, is responsible for the management of all testing phases and cycles, including planning, overseeing execution, managing defects, and reporting test status.
For an overview of each testing phase, take a look at PM101-Testing Overview.
4.1.2) Testers
Testers are responsible for creating, executing, and updating test scripts and for reporting issues/defects. They report their testing status to the Test Lead.
Testers for <Test Phase> for <Project> are <List names of testers>.
Example: Testers for Unit Testing for Project A are John Doe from Network
Group, Jane Doe from Email Server Group, and Jason Smith from Security Group.
4.1.3) Functional Team
The Functional Team reviews test scenarios or scripts, as time permits, to ensure that the complex functionalities and features are being tested correctly. They also assist testers by answering questions about business specifications and processes when clarification of the functions/features is needed, and they collaborate with testers to review the results of complex test scripts and provide feedback as SMEs.
The Functional Team for <Test Phase> for <Project> is <List names of BA>.
Example: The Functional Team for Project A is Johnny Doe from Social Science
Department, Janet Doe from IT Project Deliverable, and Jason Smithy from Booth
School.
4.1.4) Development Team/Technical Team
The Development Team is responsible for Unit Testing all development objects or configuration changes/updates. As defects are identified, development resources are assigned to fix the root cause, unit test the fix, and implement the solution.
The Development Team also drives Performance Testing and Load Testing. In
addition, the team manages and ensures the availability of the Testing
environments to support all phases of testing.
The Development Team for <Test Phase> for <Project> is <List names of
programmer/technical resource>.
Example: Developers for Project A are John Doe from Network Group, Jane Doe
from Email Server Group, and Jason Smith from Security Group.
4.1.5) SMEs and Super Users
Subject Matter Experts (SMEs) from the Business Units, typically those who have not been involved in any testing phase, can participate in User Acceptance Testing to ensure that the solution meets the needs of the business, to confirm there are no "show stoppers" that would prevent the system from going live, and to provide another set of eyes to validate the system.
If you have decided that UAT will be executed for your project, check with the business sponsors to identify a group who will participate in UAT. Testers will need to provide test scenarios for SMEs to check the system.
UAT will be conducted by <Department A>. (It will be hard to identify individuals at the beginning, so identify the group and add only the group name in this section.)
4.2) Test Reports
Status reports will be used to track the progress of test completion. These reports will include:
• Test Defect Report: This report shows the list of defects and their statuses. Use this template if you track defects manually; any automated tool will provide a view showing the number of open and closed defects.
• Test Status Report: This report indicates the test phase, test script status, priority, executed vs. planned, and the tester responsible for execution or re-testing, as well as defect tracking. Utilize the PM101-Master Test Plan template with defect numbers added in the comments section for cross-checking.
5) Entry and Exit Criteria
List Entry and Exit Criteria for each test phase in this section. Take a look at the 'entry and exit criteria sample document' for a reference.
Example:
Entry Criteria for Unit Testing:
• Build is ready.
• Master/Sample Data is loaded.
Exit Criteria for Unit Testing:
• All the scripts are executed and the results documented.
• All the major functions are working correctly and data is accurately processed.
• All the defects found during the testing are entered in a defect management tool.
• All critical, high, and medium defects are resolved.
• Master Test Plan is updated, as needed.
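If you prefer to track these gates mechanically rather than by hand, a phase exit decision reduces to an all-criteria-met check. The sketch below is illustrative only; it assumes each criterion is recorded as a named boolean, with names taken from the example above.

    # Illustrative sketch: a test phase may exit only when every exit criterion is met.
    exit_criteria = {
        "all scripts executed and results documented": True,
        "major functions working and data accurately processed": True,
        "all defects entered in the defect management tool": True,
        "all critical/high/medium defects resolved": False,
        "Master Test Plan updated": True,
    }

    def can_exit_phase(criteria):
        return all(criteria.values())

    if not can_exit_phase(exit_criteria):
        unmet = [name for name, met in exit_criteria.items() if not met]
        print("Cannot exit Unit Testing; unmet criteria:", unmet)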
6) Test Environment and Tools
List the dedicated test environments here.
Example: Unit Testing for Project A will be tested against projectaunit.uchicago.edu.
Integration Testing for Project A will be tested in its own environment at
projectaint.uchicago.edu.
6.1) Tools
List any tools that will be used for your testing.
Examples:
• Test Data Generation: Reference data will be loaded using <tool used>.
• Test Scripts will be executed automatically using <tool used>.
• Performance (Stress and Volume/Load) Testing will be conducted using <tool used>.
• Defect Tracking: Defects will be tracked using <tool used>.
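Where no commercial <tool used> is available, test data generation can also be scripted. The sketch below is a hypothetical example using only Python's standard library; the file name and columns are invented for illustration.

    import csv
    import random

    # Hypothetical reference data: account records to load into the test environment.
    with open("reference_accounts.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["account_id", "department", "active"])
        for i in range(100):
            writer.writerow([f"ACCT{i:04d}",
                             random.choice(["Booth", "Social Science", "IT"]),
                             random.choice([True, False])])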
7) Defect Management Process
The goal of a defect management process is to minimize the resolution time for problems by logging, tracking, and expediting problems as they occur; keeping stakeholders current on resolution status; exploring all factors that can lower mean time to resolution (MTTR); and maintaining a high level of overall customer satisfaction.
This is true whether the defect management process supports a development environment or a production environment. The following section describes the standard process to be used during Testing.
Feel free to update any section of the Defect Management process to document the agreed process for your project.
7.1) Defect Tracking Workflow
Report Defect → Review/Assign New Defects → Fix Defects → Retest → Analyze Defect Data
When a deviation from expected results is encountered during execution of a test, a
defect is reported.
A Technical Lead will review, prioritize and assign a new defect to a developer or
configuration resource for resolution.
The developer or configuration resource identifies the cause of the defect, modifies the
code or configuration and performs unit testing of the change.
The testing continues through the standard levels (unit, integration, etc.).
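One way to keep this workflow consistent, whether you use a tool or a spreadsheet, is to treat defect statuses as a small state machine that only permits the transitions described above. The sketch below is a Python illustration, not a prescribed tool; the status names simply mirror the workflow steps.

    from enum import Enum

    class Status(Enum):
        NEW = "Reported"
        ASSIGNED = "Reviewed/Assigned"
        FIXED = "Fixed"
        RETEST = "In Retest"
        CLOSED = "Closed"

    # Allowed transitions mirror the workflow; a failed retest reopens the defect.
    TRANSITIONS = {
        Status.NEW: {Status.ASSIGNED},
        Status.ASSIGNED: {Status.FIXED},
        Status.FIXED: {Status.RETEST},
        Status.RETEST: {Status.CLOSED, Status.ASSIGNED},
        Status.CLOSED: set(),
    }

    def move(current, new):
        if new not in TRANSITIONS[current]:
            raise ValueError(f"Illegal transition: {current.name} -> {new.name}")
        return new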
7.2) Severity/Priority
Use the following definitions to determine the severity of a defect identified
during testing:
Re-define each severity level and its Service Level Agreement (SLA) according to your project. Listed below are common definitions and SLAs.
Critical (Response SLA: 3 hours; Resolution SLA: 6 hours)
• A functional failure that causes the system to crash or has been defined as critical to the overall operation of the system.
• An immediate fix is needed; test execution cannot continue for any tester.

High (Response SLA: 4 hours; Resolution SLA: 1 day)
• Issue would have an adverse impact on the functionality of the item under examination.
• Test scenario cannot be satisfactorily tested and results may be inconclusive.
• Scenario execution is stopped. Fix required prior to deployment.

Medium (Response SLA: as time permits; Resolution SLA: as time permits)
• Testing may continue at the discretion of the tester.
• Fix required prior to deployment.

Low (Response SLA: as time permits; Resolution SLA: as time permits)
• No impact to the functionality; cosmetic change.
• A workaround exists.
• A fix need not be provided prior to deployment, but should be discussed by the project team to decide on the scheduled delivery of the proposed fix.
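If your tracking tool does not compute SLA clocks for you, the table above can be expressed as a simple lookup that turns a defect's severity and report time into response and resolution deadlines. This is a sketch using the common defaults from the table; re-define the values for your project.

    from datetime import datetime, timedelta

    # (response SLA, resolution SLA); None means "as time permits".
    SLA = {
        "Critical": (timedelta(hours=3), timedelta(hours=6)),
        "High":     (timedelta(hours=4), timedelta(days=1)),
        "Medium":   (None, None),
        "Low":      (None, None),
    }

    def deadlines(severity, reported_at):
        response, resolution = SLA[severity]
        return (reported_at + response if response else None,
                reported_at + resolution if resolution else None)

    # Example: a Critical defect reported at 9:00 requires a response by 12:00
    # and a resolution by 15:00 the same day.
    print(deadlines("Critical", datetime(2011, 1, 5, 9, 0)))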
7.3) Reporting Defects
When the actual results or behavior are not as expected, the defect should be logged in <tool you decided on>. If defects are tracked manually, change the verbiage to: "When the actual results or behavior are not as expected, the defect should be reported to <Test Lead or the person tracking defects>."
Defects should be recorded with the following information:
• Defect #
• Defect severity (Critical, High, Medium, Low)
• Opened by (originator)
• Test Phase (Unit, Integration)
• Test Environment/Build/Patch information
• Defect title or short description
• Detailed problem or defect description
  o Test scenario/case/procedure/step identifier
  o Test user information
  o What the expected result is
  o What the actual result is
If no automated system is available for tracking defects for your project, use the PM101-Issue-Risk template.
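For manual tracking, the record above maps naturally onto a small data structure, which also makes it easy to export to a spreadsheet or the PM101-Issue-Risk template. The field names below simply mirror this list; they are illustrative, not a prescribed schema.

    from dataclasses import dataclass

    @dataclass
    class Defect:
        defect_id: int
        severity: str          # Critical, High, Medium, Low
        opened_by: str         # originator
        test_phase: str        # Unit, Integration, ...
        environment: str       # test environment/build/patch information
        title: str             # short description
        description: str       # detailed problem description
        scenario_id: str = ""  # test scenario/case/procedure/step identifier
        test_user: str = ""
        expected_result: str = ""
        actual_result: str = ""

    d = Defect(1, "High", "jdoe", "Unit", "projectaunit build 42",
               "Login fails", "Login with valid credentials returns an error page")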
7.4) Reviewing and Assigning Defects
After the defect is logged in <tool> or reported to <person>, the defect will be reviewed
by <Test Lead> or <Defect Manager>. The <Test Lead> or <Defect Manager> will validate
the priority and assign the defect to the appropriate Development or Functional team
member. Critical defects (show stoppers) will be immediately forwarded to a
Developer/Configuration resource for resolution.
7.5) Fixing Defects
The Configuration resource or Developer identifies the cause of the problem,
incorporates the fix and executes unit testing to confirm the problem is resolved. Once
unit testing is completed, the following minimal information is recorded, and the defect is reassigned to the Tester (Originator):
• Date fixed
• Fixed by
• Description of fix
• Classification of resolution (configuration change, program change, data error, other)
7.6) Retesting
After a fix is provided by developers, the tester should retest the conditions that caused the defect to ensure the functionality is working. Depending on the defect and its impact across applications, the tester must consider how much regression testing to perform to ensure that fixes have not broken previously tested functionality.
When retest has validated that the problem is fixed, the defect is closed with the following information:
• Date closed
• Verified by
• Comments (allowing for recording info such as what steps were taken to retest and/or what additional defects were opened as a result of this fix)
If during retest it is determined that the problem is not resolved, the defect will be
reopened and returned to the Developer or Configuration resource. If the problem is
resolved, but other deviations occur, a NEW defect will be opened.
Anyone can open a defect, but only the Originator, Test Lead, or the person managing defects should close the defect.
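This closure rule is straightforward to enforce in a tracking tool or script by comparing the would-be closer against the defect's originator and the designated roles. A minimal sketch, assuming hypothetical role strings:

    # Anyone can open a defect; only the originator, the Test Lead,
    # or the person managing defects may close it.
    def can_close(originator, user, user_role):
        return user == originator or user_role in {"Test Lead", "Defect Manager"}

    assert can_close("jdoe", "jdoe", "Tester")        # originator may close
    assert not can_close("jdoe", "jsmith", "Tester")  # another tester may not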
7.7) Analyzing Defect Data
Analyze the open defects to decide if you can move out of the test phase and/or deploy
the product into production.
Re-prioritize the defects as you near the end of testing each phase and ensure there’s a
plan documented as to how each will be handled/fixed. If there are workarounds for
the open defects, they should be documented prior to completion of all test phases.
8) Document Sign-Off
Phase: Build
This Testing Strategy document [Version x.x] has been reviewed and found to be consistent with the specifications and/or documented project requirements. The signature below documents acceptance of this document and/or work product by the signing authority.
Organization: University of Chicago________________
Contractor________________
Approved by:
Signature: ___________________________________________________________________
Name: ______________________________________________________________________
Title:
Date:
Organization: University of Chicago________________
Contractor________________
Approved by:
Signature: ___________________________________________________________________
Name: ______________________________________________________________________
Title:
Date:
Organization: University of Chicago________________
Contractor________________
Approved by:
Signature: ___________________________________________________________________
Name: ______________________________________________________________________
Title:
Date: