Test Approach - NextGen Michigan

University of Michigan
NextGen Michigan Program
<<Project Name>> Test Approach
& Program Testing Methodology
Confidential and Preliminary. For use within University of Michigan only.
Table of Contents
Introduction
  Overview
Test Planning
  Testing Types
    Testing Types performed by <<Insert Project Name>>
  Developing a Testing Schedule
    <<Insert Project Name>> Testing Schedule
Test Preparation
  Testing Structure Overview
  Test Scenarios, Conditions & Expected Results Deliverable
    Test Scenarios & Execution Status
    Test Conditions
    Test Scripts
  Using Requirements & Use Cases to Write Test Scenarios & Conditions
  Test Environments & Data
  <<Insert Project Name>> Test Preparation Activities
Test Execution
  Pass Strategy
  <<Insert Project Name>> Pass Strategy
  Test Incident Management
    Test Incident Information
    Test Incident Priority
    Test Incident Status
    Test Incident Reviews
  <<Insert Project Name>> Test Incident Tracking
Test Monitoring
  Test Tools
  Progress Reporting & Metrics
  <<Insert Project Name>> Test Monitoring
Appendix: EUS Testing Process Flow
Introduction
Overview
The purpose of this deliverable is to define the approach for the <<insert project name>> project
to plan, prepare, execute and monitor testing activities. This document includes both the
recommended methodology for the Program and the specific approach the project is
utilizing. To complete this document, projects should fill out the blue forms in each of the
sections after reviewing the corresponding methodology. The information the project should fill
in is indicated in blue italicized text.
This approach is organized into four sections, which describe the specific activities that should
occur during each stage of the Test Phase. The four stages of the test phase are described below:
1. Test Planning: Test Planning is the initial stage of Testing in which the plans for the rest
of the phase are created. A Test Approach (similar to this document) is developed to
serve as a master blueprint to facilitate the testing effort. A master testing timeline is also
developed to establish the schedule for testing.
2. Test Preparation: The Test Preparation stage primarily consists of writing detailed test
conditions and expected results. In addition, the Test Environment is established and test
data is populated during the Test Preparation stage.
3. Test Execution: Once all of the test preparation is complete, the tests are executed. The
actual results will be verified against the expected results. Problems identified during
test execution will be captured as test incidents. The test incidents will be fixed and
corresponding test conditions will be executed again to validate the fix.
4. Test Monitoring: Throughout each part of the Test Phase specific activities will have to
be completed to manage testing. These activities include reporting testing progress and
managing test tools.
Each of these stages of the Test Phase is described in further detail in the corresponding sections
below.
Test Planning
This section describes the processes and activities for the Test Planning stage.
Testing Types
The following table describes the typical types of testing performed, how they are performed
and the exit criteria to complete each type of testing.
Type of Testing: Unit
Description & How Performed: Technical Team Members compare each component of a solution
to the design document to ensure the component meets the design.
Exit Criteria:
 All components have been tested against design documents.

Type of Testing: Functional
Description & How Performed: Quality Assurance (Testers or BSAs) uses test conditions and
scripts to verify that a solution has met its functional requirements.
Exit Criteria:
 All test conditions have been successfully executed.
 All identified Critical and High test incidents have been corrected and retested.
 Any unresolved test incidents have been signed off by the Project Owner.
 Any unresolved test incidents have been logged as known errors.

Type of Testing: Load/Performance
Description & How Performed: Quality Assurance (Tester, sometimes the technical team) uses
test conditions and scripts to ensure load/performance technical requirements are met.
Exit Criteria:
 All test conditions have been successfully executed.
 All identified Critical, High and Medium test incidents have been corrected and retested.
 Any unresolved test incidents have been signed off by the Project Owner.
 Any unresolved test incidents have been logged as known errors.

Type of Testing: Regression
Description & How Performed: Quality Assurance (Tester, sometimes BSA) uses reusable test
plans (if they exist) to ensure that the new or enhanced solution satisfies the functional,
technical, and business requirements of prior releases.
Exit Criteria:
 All test conditions have been successfully executed.
 All identified Critical, High and Medium test incidents have been corrected and retested.
 Any unresolved test incidents have been signed off by the Project Owner.
 Any unresolved test incidents have been logged as known errors.

Type of Testing: Integration
Description & How Performed: Quality Assurance (Tester, sometimes BSA) uses test conditions
and scripts to ensure that a solution works with other solutions to perform an end-to-end
business process that meets a business requirement.
Exit Criteria:
 All test conditions have been successfully executed.
 All identified Critical, High and Medium test incidents have been corrected and retested.
 Any unresolved test incidents have been signed off by the Project Owner.
 Any unresolved test incidents have been logged as known errors.

Type of Testing: User Acceptance
Description & How Performed: A subset of the user community performs day-to-day activities
or specific business processes to ensure the solution meets their needs and expectations.
Exit Criteria:
 All test conditions have been successfully executed.
 All identified Critical, High and Medium test incidents have been corrected and retested.
 Any unresolved test incidents have been signed off by the Project Owner.
 Any unresolved test incidents have been logged as known errors.
 User has approved.
Testing Types performed by <<Insert Project Name>>
The following table outlines the types of testing the <<Insert Project Name>> project will be performing.
Type of Testing: Unit
Performing? Yes/No
If No, rationale/approach to mitigate risk:
Owner/Lead: Lead for Unit Test

Type of Testing: Functional
Performing? Yes/No
If No, rationale/approach to mitigate risk:
Owner/Lead: Lead for Functional Test

Type of Testing: Load/Performance
Performing? Yes/No
If No, rationale/approach to mitigate risk:
Owner/Lead: Lead for Load/Performance Test

Type of Testing: Regression
Performing? Yes/No
If No, rationale/approach to mitigate risk:
Owner/Lead: Lead for Regression Test

Type of Testing: Integration
Performing? Yes/No
If No, rationale/approach to mitigate risk:
Owner/Lead: Lead for Integration Test

Type of Testing: User Acceptance
Performing? Yes/No
If No, rationale/approach to mitigate risk:
Owner/Lead: Lead for User Acceptance Test
Developing a Testing Schedule
The project testing schedule should be developed during the Test Planning stage.
To organize and plan the testing schedule, we recommend first outlining the high-level timeline
for each test type, as indicated in the sample timeline below. It is important to note that the
schedule for Integration Testing and User Acceptance Testing will need to be developed in
coordination with all of the projects that make up each service.
[Sample timeline: Functional Testing, Load/Performance, Regression Testing, Integration
Testing and User Acceptance Testing phases spanning 4/2 through 6/4]
Once the timeline for each test type is defined, we recommend organizing each type into
specific testing cycles to make it easier to manage. A Test Cycle represents a high-level grouping
of Scenarios and Conditions. Often cycles are organized by functional areas or specific business
processes. The diagram below is a sample of how functional testing can be broken up into
cycles. Each testing cycle should be an activity in the Project Workplan, with the specific
scenario and condition execution managed in the Test Scenarios, Conditions & Expected Results
deliverable.
[Sample: Functional Testing broken into Cycle 1 through Cycle 4, spanning 4/16 through 5/7]
<<Insert Project Name>> Testing Schedule
The overall testing schedule for the <<Insert Project Name>> project is depicted below:
<<insert graphic of overall testing timeline>>
The functional testing schedule for the <<Insert Project Name>> project is depicted below:
<<insert graphic of functional testing timeline>>
Test Preparation
This section describes the processes, deliverables and recommended activities for the Test
Preparation stage.
Testing Structure Overview
To simplify planning and execution, tests are typically organized into a hierarchical structure.
The suggested testing structure for the NextGen Program is outlined in the following table:
Name: Scenario
Description: High-level area being tested.
Example: Migrate Existing User Data to New EUC Machine

Name: Condition
Description: Specific functionality being tested.
Example: Capture User State Data for Windows

Name: Script
Description: Detailed steps to test functionality.
Example:
1. Attach USB drive.
2. Run EUC Migrate Tool from the attached USB hard drive.
3. Etc.
Typically there is a hierarchical relationship between Test Scenarios, Test Conditions and Test
Scripts. Each Test Scenario typically has numerous conditions, and each condition can have
numerous scripts. Test Scripts in many cases may not be required if the Test Condition has
adequate detail to execute the test.
[Diagram: one Test Scenario containing multiple Test Conditions, each Test Condition
containing multiple Test Scripts]
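As an illustration only, the hierarchy above can be sketched in code. The class and field names below are assumptions for this sketch, not part of the Program templates; the model assumes a condition's script is optional, as described above.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestScript:
    steps: List[str]  # detailed, ordered actions for one condition

@dataclass
class TestCondition:
    name: str                            # specific functionality being tested
    expected_result: str
    script: Optional[TestScript] = None  # omitted when the condition has adequate detail

@dataclass
class TestScenario:
    name: str                            # high-level area being tested
    conditions: List[TestCondition] = field(default_factory=list)

# Example mirroring the table above
scenario = TestScenario("Migrate Existing User Data to New EUC Machine")
scenario.conditions.append(
    TestCondition(
        name="Capture User State Data for Windows",
        expected_result="User state data is captured to the USB drive",
        script=TestScript(steps=[
            "Attach USB drive",
            "Run EUC Migrate Tool from the attached USB hard drive",
        ]),
    )
)
print(len(scenario.conditions))  # 1
```

A scenario can hold many such conditions, and each condition at most one script in this sketch; the one-to-many relationship is what makes the roll-up of execution status per scenario straightforward.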
Test Scenarios, Conditions & Expected Results Deliverable
The Test Scenarios, Conditions & Expected Results is the primary deliverable of the Test
Preparation and Execution Stage. The Test Scenarios, Conditions & Expected Results template
can be found on CTools in the following directory:
NextGen Program Resources / Templates / Program Deliverable Templates / Build-Test Phase
Templates / Test Sub-Phase Deliverable Templates
The Test Scenarios, Conditions & Expected Results template is separated into three primary
tabs, Test Scenarios & Execution Status, Test Conditions, and Test Scripts. An overview of each
tab and how it can be utilized is discussed below.
Test Scenarios & Execution Status
The Test Scenarios & Execution Status tab is where each of the Test Scenarios is documented.
This tab calculates the Test execution and pass rates based on information in the other tabs. Test
Scenarios typically map to Business Requirements or Use Cases, and should be traced to the
appropriate document if applicable.
Test Scenarios & Conditions: Test Scenarios & Execution Status
[Template excerpt (Project Name: EUC): columns for Scenario ID, Business Requirement ID or
Use Case, Test Scenario, # of Test Conditions, and % Executed / % Passed for Pass 1 and Pass 2]
Test Conditions
The Test Conditions tab is where each of the Test Conditions is documented, with the detailed
procedure and expected result. This tab also identifies which scenario a condition is associated
with. This tab is used heavily during test execution, and is where testers update their status for
each pass.
Test Scenarios & Conditions: Test Conditions & Expected Results
[Template excerpt (Project Name: EUC): columns for Scenario ID, Test Scenario, Functional
Req. ID / Use Case, Test Condition ID, Test Condition, Test Script #, Test Procedure, Expected
Result, then Date / Tester / Status for Pass 1 and Pass 2, Test Incident #, and Notes]
-
Test Scripts
If a test script is required for a specific test condition, it is documented in this tab. This tab is
used heavily during test execution, and is where testers update their status for each pass.
Test Scenarios & Conditions: Test Scripts
[Template excerpt (Project Name: EUC): columns for Test Con. ID, Test Script ID, Step #,
Action / Description, Expected Result, then Date / Tester / Status for Pass 1 and Pass 2, Test
Incident #, and Notes]
The template is intended to provide a starting place and to give projects a tool to help them be
successful. The template should be customized to fit the needs of each project.
The template is intended to provide a starting place and to give projects a tool to help them be
successful. The template should be customized to fit the needs of each project.
Using Requirements & Use Cases to Write Test Scenarios & Conditions
A primary input to use when writing test scenarios and conditions is the information already
documented in the RTM and Use Cases. The following table outlines specific information in the
Requirements Traceability Document & Solution Architecture Blueprint (Use Cases) and how
it could map to Test Scenarios, Conditions and Scripts. Generally it is recommended to
define high-level scenarios first, and then write associated conditions and scripts based on each
Scenario.
Name: Scenario
Value in RTM: Business Requirement
Value in Use Case: Use Case Name or Alternative Flow

Name: Condition
Value in RTM: Functional Requirement
Value in Use Case: Use Case Step (box)

Name: Script
Value in RTM: Detailed steps to test the functional requirement (if not detailed enough in the
test condition)
Value in Use Case: Detailed steps to complete the use case step (box)
Test Environments & Data
The specific test environments and data utilized for each type of testing should be identified as
part of the Test Planning stage and set up/mobilized for testing during the Test Preparation
stage.
<<Insert Project Name>> Test Preparation Activities
File Location of Test Scenarios, Conditions & Expected Results Document:
<<please insert link to Test Scenarios, Conditions & Expected Results Document>>
Test Environments and Data
Type of Testing: Unit
  Test Environment: List test environment, if applicable
  Test Data: List test data source, if applicable

Type of Testing: Functional
  Test Environment: List test environment, if applicable
  Test Data: List test data source, if applicable

Type of Testing: Load/Performance
  Test Environment: List test environment, if applicable
  Test Data: List test data source, if applicable

Type of Testing: Regression
  Test Environment: List test environment, if applicable
  Test Data: List test data source, if applicable

Type of Testing: Integration
  Test Environment: List test environment, if applicable
  Test Data: List test data source, if applicable

Type of Testing: User Acceptance
  Test Environment: List test environment, if applicable
  Test Data: List test data source, if applicable
Test Execution
This section describes the processes, deliverables and recommended activities for the Test
Execution stage.
Pass Strategy
To ensure that all test conditions are thoroughly tested, a three-pass strategy is typically utilized
for testing. This strategy ensures that all test incidents are resolved and retested. It also verifies
that the fixes did not introduce further test incidents in other parts of the service.
The three-pass approach is described as follows:
 Pass 1. Execute the test cycle, finding as many test incidents as possible and ensuring that
these are logged and assigned to the fix team. It is not essential that the entire cycle be
completed if major test incidents are discovered.
 Pass 2. Re-execute the entire test cycle, emphasizing testing the test incidents fixed from
Pass 1 (if required) and determining if the Pass 1 fixes caused any more test incidents.
 Pass 3. Re-execute the entire test cycle, emphasizing testing the test incidents fixed from
Pass 2 (if required). No further test incidents should be found in this pass, but if they are, a
fourth pass may be necessary.
Based on the test incidents found and changes made during the test phase, test conditions may
be executed in several more passes. Pass strategy, as a concept, is only relevant for testing types
where formal test cycles are planned and executed.
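The pass strategy amounts to a simple retest loop: re-run the cycle until a pass finds no new incidents. The sketch below is illustrative only; `run_cycle` is a hypothetical stand-in for executing one full test cycle and is not part of the Program methodology.

```python
def execute_with_passes(run_cycle, max_passes=4):
    """Re-run the test cycle until a pass finds no new test incidents.

    run_cycle: hypothetical callable taking the pass number and returning
               the count of new test incidents found in that pass.
    Returns the number of the pass in which the cycle came up clean
    (or max_passes if incidents were still being found).
    """
    for pass_number in range(1, max_passes + 1):
        new_incidents = run_cycle(pass_number)
        if new_incidents == 0:
            return pass_number  # cycle is clean; no further passes needed
    return max_passes

# Example: incidents found per pass dwindle to zero by Pass 3
results = {1: 5, 2: 2, 3: 0}
print(execute_with_passes(lambda p: results.get(p, 0)))  # 3
```

The `max_passes` cap mirrors the note above that a fourth pass may be necessary if Pass 3 still surfaces incidents.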
<<Insert Project Name>> Pass Strategy
The table below outlines the pass strategy for the <<insert project name>> project.
Type of Testing: Functional
  # of Planned Passes: List number of planned passes.
  Description of each Pass: Describe each pass if approach varies from information above.

Type of Testing: Load/Performance
  # of Planned Passes: List number of planned passes.
  Description of each Pass: Describe each pass if approach varies from information above.

Type of Testing: Regression
  # of Planned Passes: List number of planned passes.
  Description of each Pass: Describe each pass if approach varies from information above.

Type of Testing: Integration
  # of Planned Passes: List number of planned passes.
  Description of each Pass: Describe each pass if approach varies from information above.
Test Incident Management
Test Incident Management tracks and manages the discovery, resolution, and retest of test
incidents identified during test execution. Specifically, this process involves recording,
reviewing, and prioritizing test incidents; assigning test incidents to Technical Team Members
for fixing; and assigning testers to retest fixes.
The following diagram outlines the test execution and test incident management process:
[Process flow: Test Condition Execution & Defect Management]
1. The Test Lead assigns a test condition to a Tester.
2. The Tester executes the test condition and updates the status with the result.
3. If the condition fails, a Test Incident (TI) is created in the testing tool; the Tester opens the
TI, updates it with a description, and reports the test condition status to the Test Lead.
4. The Test/Tech Lead reviews and prioritizes the defect and assigns it to a team member to fix.
5. The fix is completed by the team member, then reviewed and migrated (if applicable) to the
test environment.
6. The team member updates the test incident resolution and reassigns the test incident to the
tester for retest.
Test Incidents are typically logged and managed in a test incident tracking tool. For more
information on tools the Program recommends, please refer to the Test Tools portion of this
document.
Test Incident Information
When a test incident is created, specific information is collected regarding the incident to
determine the best course of action. The following table describes the typical information
collected for test incidents, and assumes a testing tool is being utilized.
Field: ID
Description: TI reference number; a unique identifier. The system automatically assigns it upon
first save.

Field: Priority*
Description: Priority level for incident resolution.
Options: Critical, High, Medium, Low

Field: Status
Description: Status of the TI activity.
Options: Open – New; Open – Assigned; Open – Ready for Migration; Open – Ready for
Re-test; Re-open; Deferred; Duplicate; Cancelled; Closed

Field: Created by
Description: Name of the person who created the TI.

Field: Date
Description: Date when the TI was created.

Field: Closed Date
Description: Date the entire TI is resolved.

Field: Test Incident Title*
Description: Brief description of the problem or incident.

Field: Incident Description
Description: Detailed description of the problem or incident. Precede by date and uniqname
(02/26/02-firstlast). Place the most recent entry on top.

Field: Test Type*
Description: At a high level, the type of test being executed; used to group the testing incidents.

Field: Test Cycle #
Description: The test cycle # that was being executed, from the SIT test plan.

Field: Test Condition #
Description: The test condition # that was being executed, from the SIT test plan.

Field: Test Script #
Description: Test script number from the reusable test plan, if applicable.

Field: Assigned to
Description: Name of the person currently responsible for the TI. An email is sent to the person
when this field changes.
Options: ITS name dropdown

Field: Assigned to date
Description: Date the person in the Assigned To field was changed.

Field: Date Ready to Retest
Description: System generated, based on the Status field being changed to "Open – Ready for
Re-test".

Field: Resolution Description
Description: Detailed description of the resolution. Precede by date and uniqname
(11/12/04-firstlast). Place the most recent entry on top.

Field: Activity Log
Description: Brief description of the activity taken. Precede by date and uniqname
(11/12/04-firstlast). Place the most recent entry on top.

Field: Attachments
Description: Related attachments.
Test Incident Priority
Each Test incident is typically classified by priority to ensure the appropriate level of attention
and turnaround time. The priority level is measured by the level of potential user, customer or
operational risk. The table below describes each of the priority levels:
Priority: Critical
Definition: Fatal error prevents testing from continuing without a fix. Test incident significantly
impacts the business's ability to use the solution and/or its ability to operate the solution. Must
be fixed before go-live.

Priority: High
Definition: Testing can continue with a difficult, non-sustainable workaround. Test incident
significantly impacts the business's ability to use the solution and/or its ability to operate the
solution. Must be fixed before go-live.

Priority: Medium
Definition: Testing can continue with a relatively straightforward workaround. Test incident has
a minor impact on the business's ability to use the solution and/or its ability to operate the
solution. Business and IT must jointly determine if a fix is required before go-live.

Priority: Low
Definition: Does not impact the business's ability to use the solution and/or its ability to operate
the solution. Does not require a workaround. Does not require a fix prior to go-live. Normally
classified as a "cosmetic" test incident.
Test Incident Status
Each Test incident is typically assigned a status based on its progress in the test incident
lifecycle. The following table describes each of the status definitions:
Status: Open – New
Definition: Tester discovers the test incident.

Status: Open – Assigned
Definition: Test incident is assigned.

Status: Open – Ready for Migration
Definition: Fix is ready for migration (if applicable).

Status: Open – Ready for Re-test
Definition: Fix was completed and migrated; test incident is ready for re-test.

Status: Re-open
Definition: Test incident occurs again after re-test.

Status: Deferred
Definition: Test incident is not in the critical path; recommend the fix be deferred until after
go-live.

Status: Duplicate
Definition: Test incident is a duplicate of a previously uncovered issue.

Status: Cancelled
Definition: Test incident is not valid; raised due to a test condition error.

Status: Closed
Definition: Fix has been tested successfully.
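To illustrate the lifecycle above, the allowed moves between statuses can be captured in a small transition map. The specific transitions below are an assumption for illustration only; they are not a rule stated by the Program methodology.

```python
# Hypothetical sketch of the test incident status lifecycle described above.
# Which transitions are legal is an assumption, not a Program rule.
TRANSITIONS = {
    "Open - New": {"Open - Assigned", "Duplicate", "Cancelled"},
    "Open - Assigned": {"Open - Ready for Migration", "Deferred", "Cancelled"},
    "Open - Ready for Migration": {"Open - Ready for Re-test"},
    "Open - Ready for Re-test": {"Closed", "Re-open"},
    "Re-open": {"Open - Assigned"},
    # Terminal statuses in this sketch:
    "Deferred": set(), "Duplicate": set(), "Cancelled": set(), "Closed": set(),
}

def can_move(current: str, new: str) -> bool:
    """Return True if the status change is allowed by the sketch above."""
    return new in TRANSITIONS.get(current, set())

print(can_move("Open - Ready for Re-test", "Re-open"))  # True: incident recurred after re-test
print(can_move("Closed", "Open - New"))                 # False: Closed is terminal here
```

A testing tool that validates status changes this way prevents incidents from skipping review or retest steps; a recurring incident re-enters the loop through Re-open rather than starting a new record.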
Test Incident Reviews
During the Test Execution stage, daily test incident review meetings are typically held by the
test lead of each team. During this meeting new test incidents are reviewed and assigned to the
appropriate Technical Team Member for fixing. Test incident status is also reviewed for all open
test incidents. Having this meeting daily allows for frequent communication between the
testing and development teams and brings any critical issues to the surface in a timely manner.
<<Insert Project Name>> Test Incident Tracking
Location of Test Incidents
<<please insert link to document tracking test incidents or indicate tool being utilized>>
Test Incident Reviews
<<Please describe the process to review test incidents for your project. If any meetings are to be leveraged,
please indicate them here with how frequently they occur.>>
Test Monitoring
This section describes the processes and tools that are used to monitor testing during
preparation and execution.
Test Tools
SharePoint will be available for projects to utilize to simplify test execution and incident
management, and to provide enhanced reporting across the Program. SharePoint can be utilized
to document/store test conditions and scenarios (in a similar format to the template described
above) and as a test incident tracking tool.
When a project completes its first draft of the Test Scenarios, Conditions & Expected Results
document, it can work with the Program Office to have the document migrated into a
SharePoint list. Utilizing SharePoint for test conditions in addition to tracking test incidents
simplifies the process to log test incidents, as a test incident will automatically be logged in
SharePoint any time a condition or script fails.
For more information on utilizing SharePoint for your project, please reference the training
materials located on the site:
https://collaborate.adsroot.itcs.umich.edu/mais/projects/nextgentesting/default.aspx
Progress Reporting & Metrics
Progress reporting is typically done throughout the Test phase to provide testing status. This
information assists leadership by providing data on how testing is proceeding compared to
plan, and how effective the development and testing phases are at fixing errors. Metrics are
used to manage test incidents, identify trends, report status, and improve processes. These
metrics aid in communicating the progress of the test and the status of dependencies.
Utilizing SharePoint as a testing tool allows for real-time reporting of testing progress. The
following table lists some of the metrics that will be available to report on through SharePoint.

Metric: Percent Tests Executed
Description: The percentage of test conditions executed to date, by cycle and at a summary level.
Calculation: Test Conditions Executed / Total Test Conditions (by cycle and overall)

Metric: Pass Rate
Description: The percentage of tests that have passed out of the tests that have been executed.
Calculation: Passed Test Conditions / Total Test Conditions Executed

Metric: Test Incident Rate
Description: The percentage of test incidents compared to the tests that have been executed.
Calculation: Test Incidents / Test Conditions Executed

Metric: Test Incident Average Age
Description: The average number of days a test incident has been open; informs on how quickly
test incidents are being fixed.
Calculation: Sum of all Test Incident Ages / # of Test Incidents
<<Insert Project Name>> Test Monitoring
Test Tools
<<please note any tools being utilized to perform and/or manage testing>>
Progress Reporting & Metrics
<<Please note any testing reports or metrics that will be created during the test phase (if differs from
approach above)>>
Appendix: EUS Testing Process Flow
NextGen Michigan Program – EUS Testing Process Flow (DRAFT – For Discussion Purposes Only)

[Process flow diagram, summarized by stage:]

Test Planning: The Test Lead writes the Test Approach, and the EUS Test Lead writes the EUS
Test Approach.

Test Preparation: The test team writes the Test Scenarios, Conditions & Scripts, and the EUS
Test Team prepares for Integration & User Acceptance Testing. The Integration & User
Acceptance Testing schedule is coordinated across the services (Cloud, Storage, IIA, Portal,
Network, Box).

Functional, Load & Regression Testing: The test team performs Functional, Load & Regression
Testing and logs test incidents in the Program Test Incident Tracker. The build team works on
fixes as test incidents are logged; failed tests result in test incident assignment, and each fix is
sent for retest until the exit criteria are met.

Integration Testing: The EUS Test Team performs Integration Testing and logs test incidents in
the Program Test Incident Tracker. The build team works on fixes as test incidents are logged,
and each fix is sent for retest until the exit criteria are met.

User Acceptance Testing: User Acceptance Testing is performed, with test incidents logged in
the Program Test Incident Tracker, fixes sent for retest until the exit criteria are met, and EUS
informed of the results.

Program Office: The PMO mobilizes test reporting and the Test Incident Tracker, monitors
status and assists with reporting, and manages the Test Incident Tracker.

Once the exit criteria are met, execution proceeds to Deployment.