Comprehensive Test Plan - The University of Texas at Austin

Project Name
Document Version 2.0
Prepared by Jane Doe, ITS
Last Edited February 9, 2016
[Project lifecycle phases: Plan, Requirements, Solution Analysis, Design, Build, Test, Train/Deploy, Maintenance]
TO THE DOCUMENT OWNER: This template is provided as a guideline and resource. The structure and instructions
give detail about what might go into a completed document. Only you and your team, however, know what will
best fit the needs of your specific effort, and you are encouraged to adapt the template as appropriate to meet
those needs.
Comprehensive Test Plan
The Comprehensive Test Plan describes the overall test approach for the project. This includes a list of all
types of testing to be performed during the project and a matrix of which tests will be performed in
which phases, the tools that will be used for testing, the methods that will be used for documenting test
results, and the personnel who will be involved in testing. As the test strategy document, the
Comprehensive Test Plan does not include specific test cases or test criteria; these are documented in the
Functional Test Plan and Non-Functional Test Plan. However, the Comprehensive Test Plan does define
the criteria that must be met before each type of testing may begin and the criteria which must be
passed for each type of testing to be considered complete.
The audiences for the Comprehensive Test Plan include:
 The project team -- Reviews for completeness and feasibility
 Other groups which will be involved in testing -- Provide input on and acceptance of high-level
plan
 Customer steering committee -- Provides sign-off on high-level plan and agreement to plans for
communicating test results
Executive Summary
Describe the high-level approach to testing for the project. Include a high-level description of the test
objectives, test scope, and approach to testing adopted by the project team. Identify documents related
to this test plan (for example, a Project Charter or Requirements and Traceability Workbook). Include
any references to other plans, documents or items that contain information relevant to this project.
Summarize the approach, process, assumptions, dependencies and risks. Also discuss how the
Communication Plan will address the audience of the testing results.
Scope of Test
What is being tested? Does testing affect multiple products or platforms? Provide a general overview
that can be further described with Features to be Tested and Features not to be Tested.
Features to be Tested
Features are capabilities of the system from the user’s viewpoint. This section is a high-level view of the
features to be tested and should not be a technical breakdown of the system. Include references to the
requirements documentation and/or project phase specifications.
Features not to be Tested
Stating what is not to be tested can sometimes be just as important as stating what is to be tested. It
removes ambiguity so that other project stakeholders are clear on what to expect from the test phases.
State clear reasons why each feature is not being tested. There could be any number of reasons, and all
should be given, along with any mitigating factors.
Testing Strategy
Different projects will require different types of testing. Choose the appropriate types of testing for your
project, as outlined in the following sections. For each type of testing, describe the test scenarios, scripts,
or use cases, as well as the expected outcome. Use case and test case mapping is expected to be
documented in the Requirements and Traceability Workbook, so a reference to that workbook and its use
is expected in this section. Please make note in this paragraph of any types of testing that will not be
done for this project and the reason(s) for the decision(s). Larger testing types will likely require
independent testing documents; pointers to those documents in the sections below are appropriate. Will
rapid-test techniques be used to get early assessments of software stability? Remember to consider
Lessons Learned reviews from previous projects.
Unit Testing
Unit Testing will be done by the developer responsible for that unit of code. Describe how unit testing will
be done. Exit Criteria should be included, and where they are tracked should be documented. Document
where test scripts and execution results are stored.
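As an illustration only, the sketch below shows the shape a unit test might take in Python with pytest; the calculate_discount function and its 10% discount rule are hypothetical stand-ins for a real unit of code, not part of this template.

# unit_test_sketch.py -- illustrative only; calculate_discount and its discount
# rule are hypothetical stand-ins for the unit of code under test.
import pytest

def calculate_discount(quantity, unit_price):
    """Return the discount amount: 10% off orders of 100 units or more."""
    if quantity < 0 or unit_price < 0:
        raise ValueError("quantity and unit_price must be non-negative")
    return round(quantity * unit_price * 0.10, 2) if quantity >= 100 else 0.0

def test_discount_applied_for_bulk_order():
    assert calculate_discount(100, 2.00) == pytest.approx(20.00)

def test_no_discount_for_small_order():
    assert calculate_discount(5, 2.00) == 0.0

def test_negative_quantity_rejected():
    with pytest.raises(ValueError):
        calculate_discount(-1, 2.00)

Tests like these would be run with pytest and the results filed in the location named above.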
Unit Test Phase Entry Criteria
 100% of unit tests have been peer-reviewed
 Software to be unit tested has been checked into configuration management system
 All planned functionality and bug fixes have been implemented
 Source code for software to be unit tested has been peer-reviewed
Unit Test Phase Exit Criteria
 100% of unit tests are executed
 95% of unit tests pass
 Unit tests and unit tested software have been checked into configuration management system
 Unit tested software is available for next test phase
 Less than n outstanding Minor or Trivial severity issues
 Less than n outstanding Major severity issues, all with action plans
 Zero outstanding Critical or Blocker severity issues
 Code documentation is complete; designs are updated as necessary
Functional Testing
There are different types of tests that are used to validate that software performs the intended function
with the anticipated features. These types of tests include Functional/Component testing, System and
Integration testing, Recovery and Error Handling testing, Regression test sets, and Accessibility testing.
While the sections below document the entry and exit criteria and can document the approach taken, it
is anticipated that details of functional testing will be described in a separate Functional Test Plan for the
project. If that is the case, please make note of that here.
Functional/Component Testing
This is to verify that the product meets its functional/feature requirements. Describe the functional
testing for the project or point to the appropriate Functional/Component test plan. Documentation
should include Entry and Exit Criteria, and specific environment needs, tools, assumptions and
dependencies, as well as any necessary staging based on functionality delivery dates. (This section is not
designed to be a complete Functional Test Plan. It is anticipated that there will be separate documents
depending on the overall size of the project.) Document where test scripts and execution results are
stored.
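For illustration, a minimal functional test sketch in Python with pytest, exercising a feature through its public entry point; search_catalog, the sample data, and the REQ-xxx identifiers are hypothetical placeholders used to show how cases can be tied back to the Requirements and Traceability Workbook.

# functional_test_sketch.py -- illustrative only; the feature, data, and
# requirement IDs below are hypothetical placeholders.
import pytest

CATALOG = ["intro to testing", "advanced testing", "project planning"]

def search_catalog(term):
    """Hypothetical feature: case-insensitive substring search over the catalog."""
    return [title for title in CATALOG if term.lower() in title]

@pytest.mark.parametrize("requirement_id, term, expected_count", [
    ("REQ-001", "testing", 2),
    ("REQ-002", "PLANNING", 1),
    ("REQ-003", "no match", 0),
])
def test_search_catalog(requirement_id, term, expected_count):
    # requirement_id maps each case back to the Requirements and Traceability Workbook
    assert len(search_catalog(term)) == expected_count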
Functional/Component Test Phase Entry Criteria
 Functional/Component test documentation/scripts have been peer-reviewed
 Software to be tested has been checked into configuration management system
 Test data completed
 Test environment completed
Functional/Component Test Phase Exit Criteria
 100% of functional/component tests are executed
 n% of functional/component tests pass
 Test Exit Report has been approved
 Tested software has been checked into configuration management system
 Tested software is available for next test phase
 Less than n outstanding Minor or Trivial severity issues
 Less than n outstanding Major severity issues, all with action plans
 Zero outstanding Critical or Blocker severity issues
System and Integration Testing
System and Integration Testing pulls in all the functional components of a project, integrating them into the
complete solution, including testing of all interfacing systems. Aspects of concurrency, as appropriate,
should be included in this test. Describe the type of system and integration testing that is needed and
how it will be done. This should include testing to assure that all functions work under all combinations
with all necessary software versions. Please don’t forget data migration, if applicable, and browser
testing. Documentation should include Entry and Exit Criteria, and specific environment needs, tools,
assumptions and dependencies. Document where test scripts and execution results are stored. (This
section is not designed to be a complete System and Integration Test Plan. It is anticipated that there will be
separate documents depending on the overall size of the project.)
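As one possible approach (a sketch, not a prescribed implementation), integration points can first be exercised against stubs before the real interfacing systems are available; the directory lookup and enrollment workflow below are hypothetical.

# integration_test_sketch.py -- illustrative only; the directory client and the
# enrollment workflow are hypothetical stand-ins for interfacing systems.
from unittest.mock import Mock

def enroll_student(eid, directory_client, enrollment_store):
    """Hypothetical workflow: look up a person in one system, record them in another."""
    record = directory_client.lookup(eid)
    if record is None:
        raise LookupError(f"unknown EID {eid}")
    enrollment_store.append(record["name"])
    return True

def test_enrollment_integrates_both_systems():
    directory_client = Mock()
    directory_client.lookup.return_value = {"eid": "jd123", "name": "Jane Doe"}
    enrollment_store = []

    assert enroll_student("jd123", directory_client, enrollment_store) is True
    directory_client.lookup.assert_called_once_with("jd123")
    assert enrollment_store == ["Jane Doe"]

The same test would later be pointed at the real interfaces in the integration environment, with sign-off recorded per the exit criteria below.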
System and Integration Test Phase Entry Criteria
 Test documentation/scripts have been peer-reviewed
 Software to be tested has been checked into configuration management system
 Test data completed
 Test environment completed
System and Integration Test Phase Exit Criteria
 100% of the tests are executed
 n% of the tests pass
 System and Integration Test Report has been approved
 System and Integration tested software has been checked into configuration management
system
 System and Integration tested software is available for next test phase
 Interfacing systems have signed off on the test
 Less than n outstanding Minor or Trivial severity issues
 Less than n outstanding Major severity issues, all with action plans
 Zero outstanding Critical or Blocker severity issues
Recovery and Error Handling Testing
This is to confirm that the system recovers from hardware and/or software malfunctions without losing
data or control, or that it follows the defined error handling requirements. Describe recovery and error
handling testing for the project. This is often included in the functional testing effort; if not, repeat the
functional entry and exit criteria here. Documentation should include Entry and Exit Criteria, and specific
environment needs, tools, assumptions and dependencies. Document where test scripts and execution
results are stored.
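A minimal sketch of what an error handling test can look like, assuming a hypothetical loader with a simple retry rule; the function and its behavior are placeholders, not requirements from this template.

# error_handling_test_sketch.py -- illustrative only; load_record and its retry
# behavior are hypothetical stand-ins for the system's error handling requirements.
import pytest

def load_record(read_func, retries=2):
    """Hypothetical loader: retry transient read failures, then fail cleanly."""
    last_error = None
    for _ in range(retries + 1):
        try:
            return read_func()
        except IOError as exc:
            last_error = exc
    raise RuntimeError("record unavailable after retries") from last_error

def test_recovers_from_a_transient_failure():
    attempts = iter([IOError("disk glitch"), {"id": 1}])
    def flaky_read():
        item = next(attempts)
        if isinstance(item, Exception):
            raise item
        return item
    assert load_record(flaky_read) == {"id": 1}

def test_fails_cleanly_when_the_resource_never_recovers():
    def always_fails():
        raise IOError("disk offline")
    with pytest.raises(RuntimeError):
        load_record(always_fails)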
Regression Testing
Regression testing is the selective retesting of a system or component to verify that modifications have
not caused unintended effects and that the system or component still works as specified. What is the
regression test approach? Will this be automated or manual? Will there be dedicated resources? Will
tests be added as functionality is added? How often will the regression set be run? Document where test
scripts and execution results are stored. This type of testing is required for enhancements to existing
systems; it may not be required for introduction of a new application.
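One way to manage a regression set, shown as a sketch only, is to tag the standing tests with a pytest marker and run that subset on a schedule; the marker name and the normalize_eid example are local, hypothetical choices.

# regression_sketch.py -- illustrative only; the "regression" marker and the
# normalize_eid behavior are hypothetical local choices.
import pytest

def normalize_eid(raw):
    """Hypothetical existing behavior that must not change across releases."""
    return raw.strip().lower()

@pytest.mark.regression
def test_eid_normalization_unchanged():
    assert normalize_eid("  JD123 ") == "jd123"

def test_new_behavior_not_yet_in_the_regression_set():
    assert normalize_eid("ABC") == "abc"

The marker would be registered under [pytest] markers in pytest.ini, and the set run manually or from a scheduled job with pytest -m regression.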
Accessibility Testing
A system is considered accessible if users of non-standard but supported tools, such as screen readers,
can make use of the system’s services. University web pages must conform to the university web
accessibility policy (http://www.utexas.edu/what-starts-here/web-guidelines/accessibility). Describe
approach, including tools being used. Document where test scripts and execution results are stored.
Document which pages will display the link to the accessibility guidelines.
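Automated checks cover only a small slice of accessibility; as a sketch of that slice, the example below scans rendered HTML for images with no alt text using the Python standard library. It is an assumption-laden illustration and does not replace the review required by the university policy above.

# accessibility_check_sketch.py -- illustrative only; a spot check for images
# missing alt text, not a substitute for full accessibility testing.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects the src of every img tag that has no alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.missing.append(attributes.get("src", "<unknown>"))

def find_images_missing_alt(html):
    checker = MissingAltChecker()
    checker.feed(html)
    return checker.missing

def test_sample_page_flags_image_without_alt_text():
    sample = '<img src="logo.png" alt="University logo"><img src="chart.png">'
    assert find_images_missing_alt(sample) == ["chart.png"]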
Non-Functional Testing
There are different types of tests that are used to validate that software behaves in an expected and
acceptable manner in addition to performing the function that was intended. These types of tests
include Performance testing, Load and Stress testing, Security testing, User Acceptance testing, and
other testing as deemed appropriate for a specific project. While the sections below can document the
approach taken, it is anticipated that details of non-functional testing will be described in a separate
Non-Functional Test Plan for the project. If that is the case, please make note of that here.
Performance Testing
List the performance requirements for the project with how each requirement will be validated. Consider
these aspects when planning: network delay, scalability, data rendering, client side processing, database
transaction processing, capacity, and speed. Documentation should include Entry and Exit Criteria, and
specific environment needs, tools, assumptions and dependencies. Document where test scripts and
execution results are stored. (This section is not designed to be a complete Performance Test Plan. It is
anticipated that there will be separate documents depending on the overall size of the project.)
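As a sketch of how a single performance requirement might be validated automatically, the example below times a stand-in operation against a placeholder budget; the 0.5-second threshold and the operation itself are hypothetical, and a real plan would measure the actual transaction in the performance environment.

# performance_check_sketch.py -- illustrative only; the operation and the
# 0.5-second budget are hypothetical placeholders for real requirements.
import time

RESPONSE_TIME_BUDGET_SECONDS = 0.5  # hypothetical threshold, not a real requirement

def operation_under_test():
    """Stand-in for the transaction being measured (e.g., a search or page render)."""
    return sum(range(100_000))

def test_operation_meets_response_time_budget():
    start = time.perf_counter()
    operation_under_test()
    elapsed = time.perf_counter() - start
    assert elapsed < RESPONSE_TIME_BUDGET_SECONDS, f"took {elapsed:.3f}s"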
Performance Test Phase Entry Criteria
 Performance test documentation/scripts have been peer-reviewed
 Software to be performance tested has been checked into configuration management
system
 Test environment completed
Performance Test Phase Exit Criteria
 100% of performance tests are executed
 Performance Test Report has been approved by Steering Committee and Project Manager
 All Major, Critical and Blocker severity issues have action plans in place
Load and Stress Testing
This is to verify that the product performs acceptably under load conditions and attempts to break the
product by stressing its resources. Describe the load and stress tests for the project. Documentation should
include Entry and Exit Criteria, and specific environment needs, tools, assumptions and dependencies.
Document where test scripts and execution results are stored. (This section is not designed to be a
complete Load and Stress Test Plan. It is anticipated that there will be separate documents depending on
the overall size of the project.)
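A dedicated load tool is normally used for this work; purely as a sketch of the idea, the example below drives a stand-in request concurrently and reports simple percentile timings. The concurrency level, request counts, and simulated request are all hypothetical placeholders.

# load_test_sketch.py -- illustrative only; the simulated request and the load
# levels are hypothetical placeholders, and a real load tool would normally be used.
import time
from concurrent.futures import ThreadPoolExecutor

def simulated_request(_):
    """Stand-in for one client request against the system under load."""
    start = time.perf_counter()
    sum(range(10_000))  # placeholder work
    return time.perf_counter() - start

def run_load(concurrent_users=20, requests_per_user=10):
    total = concurrent_users * requests_per_user
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        timings = sorted(pool.map(simulated_request, range(total)))
    p95 = timings[int(len(timings) * 0.95) - 1]
    print(f"requests: {len(timings)}, p95: {p95:.4f}s, max: {timings[-1]:.4f}s")

if __name__ == "__main__":
    run_load()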
Load Test Phase Entry Criteria
 Load test documentation/scripts have been peer-reviewed
 Software to be load tested has been checked into configuration management system
 Test environment completed
Load Test Phase Exit Criteria
 100% of load tests are executed
 Load Test Report has been approved by Steering Committee and Project Manager
 All Major, Critical and Blocker severity issues have action plans in place
Security Testing
Security testing should consider the following aspects: authentication, authorization, data security, SQL
injection attacks, confidentiality, data integrity, and security regulations. Describe approach, including
involvement of the ISO, who will perform the tests, in what environment, etc. Document where test
scripts and execution results are stored.
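As a small sketch of one item on that list, the test below checks that a lookup built on parameterized queries treats a classic SQL injection string as plain data; the users table and find_user lookup are hypothetical, and real security testing would be scoped with the ISO.

# security_test_sketch.py -- illustrative only; the users table and find_user
# lookup are hypothetical, shown to demonstrate a parameterized-query check.
import sqlite3

def find_user(conn, username):
    """Parameterized lookup; the placeholder keeps user input out of the SQL text."""
    cursor = conn.execute("SELECT name FROM users WHERE name = ?", (username,))
    return cursor.fetchall()

def test_injection_string_is_treated_as_a_literal_value():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice'), ('bob')")
    assert find_user(conn, "' OR '1'='1") == []      # payload returns no rows
    assert find_user(conn, "alice") == [("alice",)]  # normal lookup still works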
User Acceptance Testing
User Acceptance testing is usually the last test phase. Successful completion of the test with the resolution
of the issues signifies customer acceptance of the application. Identify who will write the test scripts.
Describe how the testing activity will take place. Document where test scripts and execution results are
stored. (This section is not designed to be a complete User Acceptance Test Plan. It is anticipated that there
will be separate documents depending on the overall size of the project.)
Acceptance Test Phase Entry Criteria
 Acceptance test documentation/scripts have been peer-reviewed
 Acceptance Testers have been trained (if doing true UAT)
 Software to be acceptance tested has been checked into configuration management system
(this could include documentation and user manuals, etc.)
 Test data completed
 Test environment completed
Acceptance Test Phase Exit Criteria
 100% of acceptance tests are executed
 n% of acceptance tests pass
 User needs 100% validated
 Acceptance Test Report has been approved
 Acceptance tested software has been checked into configuration management system
 Customer has formally approved acceptance of the software into the live environment
 Less than n outstanding Minor or Trivial severity issues
 Less than n outstanding Major severity issues, all with action plans
 Zero outstanding Critical or Blocker severity issues
Other Testing
For each project consider Usability testing, Installation testing, Compatibility testing, Documentation
testing, Disaster Recovery testing, and Failure testing. For example, Documentation testing may be
appropriate if there is user documentation or configuration information that needs to be verified.
For each of the applicable types of testing, the entire test approach can be documented in this plan or
can be detailed in the Non-Functional test plan.
Usability Testing: Subjective test of the ease of use of an interface by its target users, as well as testing
those users’ ability to learn to use the interface quickly and effectively. Usability testing includes topics
such as consistency of look, feel, and tone across user interface pages. This testing includes evaluating
user navigation through the system if appropriate. Documentation should include which interfaces are in
and out of scope for this test. For example, existing interfaces may be out of scope when new functions
are added.
Installation Testing: Applicable depending on the deployment method; it also requires appropriate
documentation testing at the same time.
Compatibility Testing: List the supported browsers and the appropriate supported platforms, including
mobile devices. Refer to the supported browsers list (http://www.utexas.edu/what-starts-here/web-guidelines/browsers).
Documentation Testing: Any user documentation needs to be reviewed and tested where appropriate.
Does the user documentation describe steps and sequences accurately and clearly?
Disaster Recovery Testing: Is this application critical? If so, describe how Disaster Recovery Testing will be
done. Also document how test results will be published.
Failure Testing: Is the application critical? Should it continue to function even when specific internal
components fail? If that is the case, describe how testing will be done.
Test Data
Plan and document your test data needs, including sources of data, data creation approach and
expected results. Is this something that you can manage within the test team or is it something that will
require external assistance? What about automatic test data generators? If you are using data sourced
from a current live system, then there may be confidentiality aspects that need to be addressed. Will the
test data need to be reset for every cycle of testing? If so, who will do this and how long will it take? If
scripts have to be run to do this, they could involve lengthy runs that impact your progress.
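If synthetic data is an option, a small generator avoids the confidentiality issues of live data and makes resets repeatable; the sketch below is illustrative only, and the field names, name lists, and row count are hypothetical placeholders.

# test_data_sketch.py -- illustrative only; generates synthetic, non-confidential
# records, seeded so that every test cycle can be reset to the same data.
import csv
import random
import uuid

FIRST_NAMES = ["Alex", "Jordan", "Sam", "Taylor", "Morgan"]
LAST_NAMES = ["Smith", "Garcia", "Nguyen", "Johnson", "Lee"]

def make_record():
    first, last = random.choice(FIRST_NAMES), random.choice(LAST_NAMES)
    return {
        "id": str(uuid.uuid4()),
        "name": f"{first} {last}",
        "email": f"{first}.{last}@example.com".lower(),
    }

def write_test_data(path="test_data.csv", rows=100, seed=42):
    random.seed(seed)
    with open(path, "w", newline="") as handle:
        writer = csv.DictWriter(handle, fieldnames=["id", "name", "email"])
        writer.writeheader()
        writer.writerows(make_record() for _ in range(rows))

if __name__ == "__main__":
    write_test_data()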
Hardware/Environment Requirements
Specify both the necessary and desired properties of the testing environment – including hardware,
configuration requirements, communications and system software, third-party software, mode of usage
(standalone or clustered), and other software. Identify any other testing needs, and potential sources to
fulfill them, for items not currently available.
Tools
List automation tools and the bug tracking tool. Will any of the testing be done remotely, requiring
special tools?
Dependencies
Identify significant constraints on testing, such as test-item availability, testing-resource availability, and
deadlines. How will personnel availability be managed – e.g., through Project Server? Also identify any
personnel training needs. Identify dependencies – test tools, schedule commitments with code
deliverables, etc.
Risk Assumptions
Identify high-risk assumptions for the test plan. Specify contingency plans for each. Some thoughts about
risks include third party products and services, new versions of interfacing software, ability to obtain and
learn how to use new tools and technologies, multi-site development and test, high risk components,
poor documentation, changing requirements, and government regulations.
Test Schedule
The Test Schedule should be a part of the project plan, and should include resources, prerequisites, and
start/completion dates for each activity. Include a link to the project plan and a brief overview of the test
schedule here. Discuss phases when certain types of tests are being done – for example, when each type
of testing is scheduled.
Test Reporting
Describe the reports that will be used to document the execution and results of each test phase or
explain where these results will be documented. For each of the phases of testing executed above, the
project manager should approve an end of phase report with notification going to all interested parties.
These reports should be itemized in the Communication Plan.
Additionally, at the end of all testing, a Test Closure Memo should be submitted for review and approval
by the customer steering committee. This will summarize the end of phase reports previously submitted,
plus additional information as appropriate. The Test Closure Memo template should be used and should
be itemized in the Communication Plan.
Control Procedures
How will bugs be reported? How will change requests to software be handled – e.g., who has sign-off?
How will additional testing requirements be tracked? This should include tester responsibilities for bug
reporting, developer responsibilities for bug fixing, and defect severity definitions.
Appendices
Include any relevant appendices.
Revision History
Identify document changes.
Version | Date | Updater Name | Description
V1      |      |              | Initial draft completed.
Signatures
Formal written signoff is preferred for larger, more complex projects.
Name | Role | Signature | Date