Test Case Template

[INSERT PROJECT NAME]
TEST CASE
[Insert Test Case Name]
Current Version: <current version number>
Date Last Updated: <last update date>
Last Updated By: <update author name>
Author: <author name>
Date Created: <creation date>
Approved By: <approving body>
Approval Date: <date of approval>
REVISION HISTORY

REVISION NUMBER    DATE                 COMMENT
1.0                September 4, 2007    Original DoIT PMO Document
Table of Contents

REVISION HISTORY
TABLE OF CONTENTS
INSTRUCTIONS
1   TEST CASE PROFILE
2   TEST CASE SETUP
    2.1   TESTING ENVIRONMENT
    2.2   RESOURCES REQUIRED
    2.3   PREREQUISITE TESTS
    2.4   REGRESSION TESTING
3   TEST RUN LOG
4   TEST RUN RESULTS
    4.1   RESULTS SUMMARY
    4.2   DISCREPANCY LOG
INSTRUCTIONS:
The red text provides instructions on how to use this template. As the template is filled
out, delete the red text, or change it to black if any of it is to remain for background or
explanatory purposes.
1 TEST CASE PROFILE
Test Case ID:
Test Case Name:
Test Case Objectives: [Examples of test objectives to enter here are:
•  To verify the user interface as specified in the design document
•  To test usability as stated in requirement ID ……..
•  To verify installation instructions
•  To test the ability to perform at specified capacity
•  To test exception handling under the following conditions…….]
Description: [Enter a brief description of the test to be performed.]
Requirements Addressed: [Reference the specific requirements this test case addresses. Also register this test case in the Requirements Traceability Matrix.]
Success/Fail Criteria: [Describe the success/fail criteria in terms of the results expected upon running this test case.]
2 TEST CASE SETUP
2.1 TESTING ENVIRONMENT
[•  Refer to the related Test Plan for a description of the overall test environment (hardware platforms, software applications, etc.).
•  List any special additions or changes to the environment (e.g., special hardware, special facilities, special software) that must be implemented in order to run this test case.
•  List special conditions or states that must exist in the environment for this test case (e.g., database initialized to a minimum set of test data).]
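Where such environment conditions are scripted, they are sometimes expressed as setup code. The following is a minimal, illustrative sketch (assuming pytest and SQLite purely for the example; nothing here is required by this template):

    # Hypothetical sketch: the special environment condition named above (a
    # database initialized to a minimum set of test data) expressed as a pytest
    # fixture. The schema and data are placeholders.
    import sqlite3
    import pytest

    @pytest.fixture
    def minimal_test_database(tmp_path):
        """Create a throwaway database holding the minimum data this test case needs."""
        db = sqlite3.connect(str(tmp_path / "testcase.db"))
        db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
        db.execute("INSERT INTO users (name) VALUES ('test user')")
        db.commit()
        yield db
        db.close()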
2.2 RESOURCES REQUIRED
[Refer to the related Test Plan for a description of the overall resources required for
the tests. List any special additions or changes to the environment that must be made
in order to run this test case.]
2.3 PREREQUISITE TESTS
[Identify the test cases that must have been executed successfully before this test case
may be executed.]
2.4 REGRESSION TESTING
[Regression testing ensures that a new system build has not regressed, i.e., lost
functions that were present in the previous build. Identify additional testing to be
done when the system has been rebuilt to fix an error discovered by prior execution of
this test case. These tests are in addition to the regression testing identified in the
related test plan. Along with the Prerequisite Tests, this ensures that the fix and
rebuild of the system did not introduce problems elsewhere in the system or cause a
loss of functionality.]
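If the project's test cases are automated, the regression set identified here can be selected mechanically. The following is a minimal, illustrative sketch (assuming pytest; the marker name and test functions are hypothetical and not part of this template):

    # Hypothetical pytest sketch: tagging automated checks so the regression set
    # identified in this section can be re-selected after the system is rebuilt.
    import pytest

    @pytest.mark.regression
    def test_previously_passing_function_still_works():
        assert True  # placeholder for a check that passed in the previous build

    @pytest.mark.regression
    def test_fix_did_not_break_related_feature():
        assert True  # placeholder for an additional check run after the fix

    # After the rebuild, the regression subset could be re-run with:
    #   pytest -m regression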
3 TEST RUN LOG
[Instructions to test developer: List the steps for executing this test case. A table
consisting of a block of steps is provided for each series of steps, ending with a
description of expected results. Add or remove rows in the table to accommodate the
number of steps specified. If interim results are to be recorded, followed by a continuation
of the test case steps, use the second block provided in the template below. Add or
remove blocks of steps as needed.]
Instructions to tester: Record the status upon completion of each step (e.g., “OK” or “NG”
with comments). Opposite “Actual Results”, enter “OK” if the actual results met the expected
results, or note discrepancies as appropriate.
Test Run Log

Test Case ID:                                  Test Case Name:

Test Steps   Tester Action                                     Status (tester)
[ID.1]       [Describe a step to be performed by tester]
[ID.2]       [Describe a step to be performed by tester]
[ID.3]       [Describe a step to be performed by tester]
[ID.n]       [Describe a step to be performed by tester]
Expected Results:   [Describe response or results expected at this point in the test]
Actual Results (tester):

Test Steps   Tester Action                                     Status (tester)
[ID.n+1]     [Describe a step to be performed by tester]
[ID.n+2]     [Describe a step to be performed by tester]
[ID.n+3]     [Describe a step to be performed by tester]
[ID.m]       [Describe a step to be performed by tester]
Expected Results:   [Describe response or results expected at this point in the test]
Actual Results (tester):
Test Steps   Tester Action                                     Status (tester)
[ID.m+1]     [Describe a step to be performed by tester]
[ID.m+2]     [Describe a step to be performed by tester]
[ID.m+3]     [Describe a step to be performed by tester]
[ID.m+n]     [Describe a step to be performed by tester]
Expected Results:   [Describe response or results expected at this point in the test]
Actual Results (tester):
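Where the steps above are also scripted, one block of the log can map onto an automated check. The following is a minimal, illustrative sketch (Python; the step IDs, actions, helper function, and expected result are hypothetical):

    # Hypothetical sketch: one block of Test Run Log steps expressed as an
    # automated check. Step IDs, actions, and the expected result are placeholders.
    def run_step(step_id: str, action: str) -> str:
        """Perform a tester action and return its status ("OK" or "NG")."""
        print(f"{step_id}: {action}")
        return "OK"  # placeholder outcome; a real harness would report "NG" on failure

    def test_block_of_steps():
        statuses = [
            run_step("ID.1", "Describe a step to be performed by tester"),
            run_step("ID.2", "Describe a step to be performed by tester"),
        ]
        # Expected Results: [response or results expected at this point in the test]
        assert all(s == "OK" for s in statuses), "Actual results did not match expected results"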
4 TEST RUN RESULTS
4.1 RESULTS SUMMARY
Instructions to tester: Record whether this test case passed or failed. Summarize the
actual results relative to the expected results.
Test Run Results Summary

Pass/Fail Status:
Summary of Actual and Expected Test Results:
Additional Comments (Optional):
4.2 DISCREPANCY LOG
Instructions to tester: Log discrepancies between expected and actual test results. For
discrepancies that warrant initiating a Problem Report, record a brief statement of the
problem under “Disposition”, and enter the problem report in the Project’s problem
tracking system.
Discrepancy Log

Problem ID    Description                                      Disposition
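Teams that keep the Discrepancy Log electronically sometimes append entries with the same three columns to a simple file. The following is a minimal, illustrative sketch (Python; the file name and example values are hypothetical):

    # Hypothetical sketch: appending an entry with the same columns as the
    # Discrepancy Log (Problem ID, Description, Disposition).
    import csv

    def log_discrepancy(problem_id: str, description: str, disposition: str,
                        path: str = "discrepancy_log.csv") -> None:
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow([problem_id, description, disposition])

    # A matching Problem Report would still be entered in the project's problem
    # tracking system.
    log_discrepancy("PR-001",
                    "Actual results differed from expected results at step ID.2",
                    "Problem Report initiated; retest after fix")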