LOGO

Project Name
Software Master Test Plan
<Product Version>

Document’s Title: <Document Title>
Author: <Author Name>
Created: <Creation date>
Last Modified: <Modification date>

Table of revisions:

Version | Date | Description | By

Approvals:

Role | Date | Signature
Table of Contents

1. Introduction
2. Project scope
3. Testing Approach/Strategy
4. Test environment
5. Open Issues, Risks and Assumptions
6. Stakeholders
1. Introduction
<State the purpose of the Plan, possibly identifying the level of the plan (master etc.). This is
essentially the executive summary part of the plan. Identify the scope of the plan in relation to the
Software Project plan that it relates to. Other items may include resource and budget constraints,
the scope of the testing effort, how testing relates to other evaluation activities (analysis and
reviews), and possibly the process to be used for change control and for communication and
coordination of key activities. As this is the “Executive Summary”, keep information brief and to the point>
1.1 Objective
<This section should specify the objectives of the new product version and answer the
question of what the new content provides>
1.2 References
<This section will provide links to relevant documents and requirements, such as functional
specs, design documents, and instructions for required installation.>
Document | Link | Comments
MRD | |
Tech spec | |
Func spec | |
… | |
1.3 Glossary
<This section will provide a list of the acronyms and terms used in the document, with their
descriptions.>
Term | Description
Requirement | What product management has listed in the PRD
Feature | The product capability that contributes to solving the problem / fulfilling the requirement
User story | A user-centric description of a unit of work that contributes to a feature
Design spike | Time to investigate and prototype
1.4 Schedule
<Should be based on realistic and validated estimates. If the estimates for the development of
the application are inaccurate, the entire project plan will slip, and testing is part of the
overall project plan>
Milestone / Sprint | Dates
Kick-off | 9/27/2010
Sprint 0 | 9/27/2010 - 10/15/2010
Retrospective | 10/14/2010
Sprint 1 | 10/18/2010 - 10/29/2010
Retrospective | 10/28/2010
Sprint 2 | 11/1/2010 - 11/12/2010
Sprint 3 | 11/15/2010 - 11/26/2010
Sprint 4 | 11/29/2010 - 12/10/2010
Sprint 5 | 12/13/2010 - 12/30/2010
Holidays | 12/31/2010 - 1/9/2011
Final sprint (TBD) | 1/10/2011 - 1/28/2011
Feature freeze | 1/14/2011
Code freeze | date TBD
RTM | 1/31/2011
1.5 Milestones

Task / Milestone | Due Date | Expectation
<task> | ◊ | ◊
<milestone> | ◊ | ◊
RTM | ◊ | ◊
<Beta> | ◊ | ◊
<UI freeze> | ◊ | ◊
<Code Freeze> | ◊ | ◊
2. Project scope
2.1 Project Overview
<Briefly describe the main project parts/domains affected by the current release version; list the
main directions and give a high-level overview of the content>
2.2 Scoped Items
<Scope of features to be tested as part of this release (domain-wise)>
2.3 Out of Scope
<This is a listing of what is NOT to be tested, from both the user’s viewpoint of what the system
does and a configuration management/version control view. This is not a technical description
of the software, but a USER’S view of the project’s functions. Identify WHY the features are not
to be tested; there can be any number of reasons:
- Not to be included in this release of the software.
- Low risk; has been used before and is considered stable.
- Will be released but not tested or documented as a functional part of the release of
this version of the software>
3. Testing Approach/Strategy
3.1 Process Overview
<Briefly describe the process to be followed during this release: its main roles, the delivery
cycles and their duration, a high-level breakdown of activities within the cycles (planning,
reviews, execution), the definition of done for cycles, reporting, tracking tools (for test
execution/results and defects), the defects workflow, RC validation, the product
implementation process, etc.
If automation is going to be used, its main approach should be described as well.
Communication channels and types are part of this section - describe the main meetings (e.g.
cross-functional sync, stand-ups, different sync-ups, tech design review, backlog grooming, etc.),
peers and techniques to be used during the release development>
3.2 Requirements
<A short description of the nature of the requirements, their types (e.g. product backlog, sprint
backlog), the defined way of documenting and tracking them, the requirements owners, and
QA’s role in dealing with them>
3.3 Entry and Acceptance Criteria
<This section describes the entry criteria for starting to follow the plan (special environment
delivery, resource readiness, approvals, etc.) as well as the exit criteria (pass/fail) for the
scoped work of the plan (quality criteria for the general release, acceptance test results
(requirements), timeline fit, etc.).
The quality criteria may be divided into lower-level items such as functional, non-functional
(security, performance/load, usability, etc.), upgrade, etc.>
3.4 Testing types
<List of testing types to be applied for the entire release. Different product domains may use
different types - this should be clearly specified in the current section>
Functionality
<These tests ensure the functional quality of the software. Here you will list the test cases
that QA will perform in order to ensure the application meets the requirements and
expectations as described in the functional spec docs. The specification level is product
domains or a product initiative>
UI
<Cross-product standards to be followed>
What to test | How/where to test
Screen resolutions | 1024x768, 1280x1024
Spell check |
Consistency |
Accuracy
<These tests ensure that the application works accurately and supplies the user with correct
information. A sketch of automating such a check follows the table below.
Example:
- Test the calculation of different goals in the dashboard: number of Vusers, transaction
response time, hits per second>
Calculation | Expected results
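
Where feasible, such accuracy checks can be automated by recomputing the goal from raw data. A minimal pytest-style sketch, assuming a hypothetical `client` fixture whose `get_raw_hit_timestamps` and `get_dashboard_hits_per_second` calls stand in for the product's real API (none of these names come from the product):

def hits_per_second(timestamps):
    """Average hits per second over the sampled window."""
    window = max(timestamps) - min(timestamps)
    return len(timestamps) / window if window else float(len(timestamps))

def test_dashboard_hits_per_second(client):
    # Recompute the goal from raw data and compare with the dashboard value.
    raw = client.get_raw_hit_timestamps(load_test_id=42)
    shown = client.get_dashboard_hits_per_second(load_test_id=42)
    assert abs(shown - hits_per_second(raw)) < 0.01   # rounding tolerance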
Concurrency
<Multi-user testing will attempt to prove that it is possible for an acceptable number of users
to work with the system at the same time. A sketch of one such case follows the table below.
Examples:
1. Save two projects with the same name at the same time (from different instances)>
Test case | Dependency | Tested operation
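
A minimal sketch of the duplicate-name example, assuming two independent API sessions (`client_a`, `client_b`) and a hypothetical `save_project` call; the real product API will differ:

import threading

def test_concurrent_save_same_name(client_a, client_b):
    results = []
    barrier = threading.Barrier(2)

    def save(client):
        barrier.wait()                    # release both saves at the same moment
        try:
            client.save_project(name="ProjectX")
            results.append("saved")
        except Exception as exc:          # expected: a clear duplicate-name error
            results.append(type(exc).__name__)

    threads = [threading.Thread(target=save, args=(c,)) for c in (client_a, client_b)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Exactly one save should win; the other must fail gracefully, not corrupt data.
    assert results.count("saved") == 1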
I18N
<This section describes the I18N focus of testing the entire product. It combines all
specific types of test cases (functional, UI, accuracy, ...) that should be tested from an I18N
point of view in this release. A sketch of an I18N input check follows the table below>
Focus area | Description
Use of I18N characters in all possible input fields | 1. Filters 2. Load test name 3. Script name 4. Etc.
UI correctness | …
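
A minimal sketch of such an input check, assuming a hypothetical `client.create_load_test` call; extend the sample strings per the scoped languages in section 4.1:

import pytest

I18N_SAMPLES = [
    "Ünïcödé-tëst",   # Latin with diacritics
    "Próba pełna",    # Polish
    "負荷テスト",       # Japanese
]

@pytest.mark.parametrize("name", I18N_SAMPLES)
def test_i18n_load_test_name(client, name):
    load_test = client.create_load_test(name=name)
    # The name must round-trip unchanged: no mojibake, no truncation.
    assert load_test.name == name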
Security
<This section describes product testing from a security point of view, e.g. SQL injections,
illegal actions, etc. A sketch of a malicious-input check follows the table below>
Focus area | Description
Filters | 1. Insert illegal characters into the filter fields 2. Try to inject malicious input
… |
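
A minimal sketch of the malicious-input case, assuming a hypothetical `client.apply_filter` call that returns an HTTP-style response; the exact rejection behavior to assert on should come from the security requirements:

import pytest

INJECTION_PAYLOADS = [
    "' OR '1'='1",                  # classic SQL injection
    "1; DROP TABLE projects--",     # stacked SQL statement
    "<script>alert(1)</script>",    # stored XSS attempt
    "../../etc/passwd",             # path traversal
]

@pytest.mark.parametrize("payload", INJECTION_PAYLOADS)
def test_filter_treats_payload_as_literal(client, payload):
    response = client.apply_filter(field="name", value=payload)
    # The payload must be handled as a literal string: no server error,
    # and no unescaped echo of active markup back to the page.
    assert response.status_code in (200, 400)
    assert "<script>alert(1)</script>" not in response.text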
Integrations
<This section describes the integration of product domains (if applicable) with each
other>
Domain | Integration Operations | Expected
… | |
Usability
<This section describes generic product usability testing, e.g. a usable UI and
understandable messages provided by the application>
Object | Description
Error messages | 1. Understandable 2. Reasonable 3. Informative
… |
Performance
<These test types ensure that the system provides acceptable response times.
Tests should check the influence of viewers/users on the application performance. A timing
sketch follows the table below.
Example:
Copy projects that contain different scales of data (scripts, monitors, load tests)>
Scale | Configured scenario | Response time
Small | |
Medium | |
Large | |
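
A minimal timing sketch for the copy-project example, assuming hypothetical `get_fixture_project` and `copy_project` calls; the budgets are placeholders to be replaced by the agreed exit criteria from section 3.3:

import time
import pytest

# Placeholder budgets, in seconds - not measured values.
BUDGET_SECONDS = {"small": 5, "medium": 30, "large": 120}

@pytest.mark.parametrize("scale", ["small", "medium", "large"])
def test_copy_project_response_time(client, scale):
    project = client.get_fixture_project(scale=scale)  # scripts, monitors, load tests
    start = time.monotonic()
    client.copy_project(project.id, new_name=project.name + "-copy")
    elapsed = time.monotonic() - start
    assert elapsed <= BUDGET_SECONDS[scale]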
Stability
<This section ensures stability over time and stability under stress for the entire system.
For stability over time, define 3 test cases to be tested in the IRONSKY
environment; for stability under stress, define the scenarios to be
tested. A soak-loop sketch follows the stress table below.>
Stability over time (test cases that will be tested in the IRONSKY environment):
Test cases | Description | Number of users involved
Stability under stress:
Stressed resource | Tested scenario
Memory |
CPU |
Disk space |
Network |
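
A minimal soak-loop sketch for stability over time, assuming a hypothetical `client.run_scenario` call that performs one iteration of the tested scenario; the duration and allowed slowdown are placeholders:

import time

def soak(client, hours=8.0, max_slowdown=2.0):
    """Repeat the tested scenario until the deadline; fail fast on errors or drift."""
    deadline = time.monotonic() + hours * 3600
    baseline = None
    while time.monotonic() < deadline:
        start = time.monotonic()
        client.run_scenario()                 # one iteration; must not raise
        elapsed = time.monotonic() - start
        if baseline is None:
            baseline = elapsed
        # Growing response times under constant load usually mean a leak or exhaustion.
        assert elapsed <= baseline * max_slowdown, "response time degraded over time"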
Backward Compatibility
<This section describes feature backward compatibility with older versions for the
entire product>
Object/domain | Compatibility with version | Expected
Project SLA goals & statuses | 1. After migration from 9.0 2. After migration from 8.1 FP4 | 1. Should exist 2. Should not exist
… | |
Documentation
<This section ensures the entire product’s features are documented and fully explained in
the Help: user/admin/installation guide, online help, and readme file. Specify the locations
that will be tested>
Feature name | Location
Other
<Specify any other testing types that are applicable to your particular project>
3.5 Error Types
<This section describes the types of errors to be verified during testing as part of the
error-handling testing type. Choose from the list below and/or add anything you have in addition
that is missing from the list (a boundary-value sketch follows the list):
- Non-compliance with the functional requirements;
- Boundary-point handling errors;
- Initialization errors;
- Initial and subsequent state-related errors;
- Input data handling errors;
- Errors caused by interference;
- Errors in handling on-the-fly modifications;
- Race condition handling errors;
- Errors invoked by negative tests;
- Errors in error-situation handling;
- Component interaction errors;
- Interface errors;
- Memory leaks>
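
A minimal sketch for the boundary-point error type, assuming a hypothetical `client.set_vuser_count` call with an illustrative 1-10000 valid range; substitute the real limits from the functional spec:

import pytest

@pytest.mark.parametrize("count, valid", [
    (0, False), (1, True), (2, True),             # lower boundary
    (9999, True), (10000, True), (10001, False),  # upper boundary
])
def test_vuser_count_boundaries(client, count, valid):
    if valid:
        client.set_vuser_count(count)             # must be accepted silently
    else:
        with pytest.raises(ValueError):           # must be rejected with a clear error
            client.set_vuser_count(count)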
3.6 Technical aspects
<This section lists technical aspects we should pay attention to, such as:
What are the changes (at the product domain level)?>
3.7 COST
<This section provides some direction for Customer Oriented testing:
Which customers are affected the most by the new product content?
Which customer schemas are best to use during testing, and how are customers going to
use the newly delivered content?>
3.8 Deliveries
<This section provides the planned Dev deliveries and the QA testing plan>
Dev Planned delivery

Functionalities to be delivered (domain-wise) | Milestone | Build date
QA Planned delivery
[P1 = High, P2 = Medium, P3 = Low, P4 = Nice to have]

Timeline | Scope (domain-wise) | Type of test (sanity, full, E2E, pair, etc.) | Priority | Duration | Start/end date
Cycle x | | | | |
Cycle x+1 | | | | |
FF | | | | |
CF | | | | |
RC | | | | |
4. Test environment
<Specify the environment that needs to be tested; make sure this information is considered in
the planned environment certification plans>
4.1 Tested Software
<Specify the planned topologies, operating systems, languages, and databases that will be
tested.
List of OS: Win 2000 Server, Win 2000 Advanced Server, Win XP, Win 2003 Enterprise.
List of languages: English-US, English-UK, French, German, Polish, Swedish, Kanji.
DB type: Oracle, MS-SQL>
Topology | OS | Servers | DB type | Hosts | Language
4.2 Installations and Applications
<Specify additional installations and applications that will be in use,
e.g. SunOne configured over SSL>
5. Open Issues, Risks and Assumptions
5.1 Open Issues
<Highlight open items, specifying details, priority and owner>
5.2 Risk Analysis
<Specify the risks that may cause a delay in the project’s testing plans. Items to consider
are resource availability, required configurations or installations, etc. A scoring sketch follows
the risk table below>
Risk types:

Environment | Product | Process | People
Hardware | New features | Quality and time impediments | Scalability of resources
Network | Code refactoring | Lack of information | Personal issues
Software | Compatibility and integration | |
ID | Risk | Type | Implications | Probability (1-5) | Severity (1-5) | Total Risk | Mitigation (Alternative) plan
1 | | | | | | |
2 | | | | | | |
3 | | | | | | |
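
A minimal sketch of the scoring behind the Total Risk column, assuming Total Risk is simply Probability x Severity (both on the 1-5 scale above); the values are illustrative only:

# (id, probability, severity) - illustrative values only
RISKS = [(1, 4, 5), (2, 2, 3), (3, 1, 4)]

for risk_id, probability, severity in RISKS:
    total = probability * severity   # ranges 1..25; mitigate the highest scores first
    print(f"risk {risk_id}: total risk = {total}")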
5.3 Assumptions
<List of assumptions considered for the current release version. Some examples:
- Functional designs are done upfront
- Functional designs are granular enough to produce business requirements, a functional
backlog and a technical backlog (instead of an FS)
- Each product backlog item has a high-level tech design and estimate with it
- Each use case has a minimal acceptance path and a “good to have” path
- Better QA & Dev synergies>
6. Stakeholders
<List of stakeholders with title and organization>
Name | Title | Org
James Brown | QA manager | QA