MDC
Testing Strategy
ERP Program
SAP
Version: 1
Date: April 2019
Contents

1. Purpose & Scope
   1.1 Purpose
   1.2 Scope
2. Overview
   2.1 Release Overview
   2.2 Testing Stages
   2.3 Testing Environments to be used per Release
   2.4 Testing Roles and Responsibilities
   2.5 Security
3. Test Phase Details
   3.1 Unit Testing
       3.1.1 Technical Unit Testing
       3.1.2 Functional Unit Testing
       3.1.3 Security Unit Testing
       3.1.4 Source/Reference Documentation
       3.1.5 Security Roles for Testing
       3.1.6 Deliverables & Tools
       3.1.7 Data Preparation
       3.1.8 Entrance Criteria
   3.2 Integration Testing
       3.2.1 Source/Reference Documentation
       3.2.2 Deliverables & Tools
       3.2.3 Data Preparation
       3.2.4 Entrance Criteria
   3.3 User Acceptance Testing
       3.3.1 Source/Reference Documentation
       3.3.2 Entrance Criteria
   3.4 Parallel Payroll Testing
       3.4.1 Source/Reference Documentation
       3.4.2 Deliverables & Tools
       3.4.3 Entrance Criteria
   3.5 Performance Testing
   3.6 Month-End Close Testing
       3.6.1 Source/Reference Documentation
       3.6.2 Deliverables & Tools
4. Test Governance
   4.1 Test Planning & Management
   4.2 Scope Control
   4.3 Defect Tracking
   4.4 Issue Remediation
   4.5 Monitoring
   4.6 Stage Gate
   4.7 Assumptions
1. Purpose & Scope

1.1 Purpose
The testing strategy defines the approach, tools and methods used to verify the bt Project configuration, RICEFW objects and business processes. It defines how the verification will be performed, and in which project phases and test cycles it will be executed. Test case steps and execution will be documented throughout the various testing stages to provide a library of reusable test cases for future regression testing activities.

A high-level overview of key testing types/activities over the ERP program phases is broken down into two release rollouts: July-17 Release 1A and Jan-18 Release 2.
This document describes the various aspects of testing to support the bt Program initiative. The core objective is to ensure thorough validation of the primary SAP and relevant third-party applications that represent the future-state design for MDC. This includes the preparation, execution, validation, error resolution, management and sign-off of application testing across the various stages of the program lifecycle.
Attached is a Gantt chart of the major milestones and timing of the testing phases.
1.2 Scope
The following items are in scope:
a. All SAP applications and modules, third-party applications and cloud software in scope for bt, including:
   i. CRB
   ii. EAM
   iii. HCM
   iv. FI
b. The primary user interfaces that will be used for the testing, including:
   i. SAP GUI 720
   ii. BI launch pad (browser-based), Business Objects
   iii. SAP Fiori LaunchPad
   iv. Third-party application user interfaces
c. All configuration and RICEF development objects and security roles that were produced by the bt project team
2. Overview

2.1 Release Overview
Provided below is a key to the releases for the WEC ERP program, with the corresponding go-live dates and major application focus:

- 0.S: MedGate release to business users
- 1.A: Go-live July 2017 for HR Talent (SuccessFactors (SF) Employee Central, SF Learning Management Systems, and Kenexa)
- 1.S: MedGate release to remaining business users
- 1.B: Go-live July 2017 for Finance budget processes (UI Plan)
- 1.N: Go-live August 2017 for HR Time and Labor, and Payroll
- 1.Q: Go-live October 2017 for HR Compensation and Open Enrollment
- 2.0: Go-live January 2018 for ECC on HANA
2.2 Testing Stages
The test strategy can, in general, be characterized as a building-block approach to testing. Each level of testing increases in scope and control until, ultimately, system performance can be simulated on a production architecture using production-level transaction volumes and concurrent user activity.

Each of the integration testing cycles will be preceded by a 'mock cutover' cycle in the same environment in which integration testing will be performed. Dress rehearsal (mock go-live simulation) testing will also be carried out to simulate the go-lives for each of the two major releases.

The following table provides an overview of the test stages in this multi-phased approach to system testing:
Stage: Data Readiness for Testing
Purpose: Clean data created for unit testing, validated by the business.
Executed by: Functional and Technical SMEs
Timing: February – May
Prerequisites: Subset of clean, accurate data that has been validated by the business; unit testing will have a minimum of 10-25% clean data, and integration testing will have 50-70% of the full data expected.

Stage: Technical Unit Test
Purpose: Technical object testing will test all RICEF interfaces, conversions and enhancements at an individual level to ensure that all design requirements are met and no technical flaws exist.
Executed by: Functional and Technical SMEs
Timing: February – May
Prerequisites: 100% of test cases and conditions defined; interfaces, conversions and enhancement objects built.

Stage: Functional Unit Test
Purpose: To validate the individual configuration of the SAP and third-party application-enabled business processes, to ensure that individual transactions can be completed as designed and configured.
Executed by: Functional SMEs
Timing: February – May
Prerequisites: Completion of individual technical unit tests on development objects (RICEFW) and configuration, and data available for testing.

Stage: Reporting
Purpose: BW reports and data marts have been validated and are available for the business to query, verify and sign off on the infrastructure that was scoped and built.
Executed by: Functional SMEs
Timing: February – May
Prerequisites: Design of the data mart.

Stage: Security Unit Test
Purpose: In order for unit testing to be considered complete and before integration testing can begin, a round of security-specific unit testing must be completed to test security roles. Positive testing will take place.
Executed by: Functional SMEs and/or offshore
Timing: March
Prerequisites: Completion of development for each individual security role against each T-Code.

Stage: Integration Test (IT)
Purpose: To ensure end-to-end integration, data integrity and alignment of all SAP and third-party application components to support the various end-to-end business processes. WEC user roles (positive and negative) will be exercised as part of security testing in the IT cycles.
Executed by: Initial round of IT executed by Functional SMEs (Core Team); additional rounds to include business users
Timing: There will be five integration testing cycles performed, two for the Jul-17 release and three for the Jan-18 release.
Prerequisites: 100% of final configuration complete and documented; successful unit testing, including defect logging; 100% of RICEFW coded and unit tested; 100% of test scenarios defined and documented; 100% of test scripts for IT1 assigned to resources and loaded into SolMan; 75% of single and composite security roles built.

Stage: User Acceptance Testing
Purpose: User acceptance testing is executed by the designated end users with the assistance and support of the project team. Any functionality or process that impacts an end user will be tested to ensure that the system is compliant with end-user requirements.
Executed by: Power Users + business representatives
Timing: Will take place concurrently with IT2 for the July 2017 release and with IT5 for the January 2018 release.
Prerequisites: Development of all RICEFW objects, configuration and security role testing has been completed.

Stage: Parallel Payroll Testing
Purpose: To ensure that the payroll engine calculation between the old and the new systems is the same, and that the known differences (due to moving from a manual to an automated process) are fully reconcilable.
Executed by: Power Users + business representatives (for the specific deployment)
Timing: Throughout the testing process; detailed timeline in Section 2.3.
Prerequisites: Full time entry and related configuration relevant to the pay period being tested is completed.

Stage: Month-end close
Purpose: Simulate month-end close, consolidation work in BPC, and test Runbook functionality.
Executed by: Power Users + business representatives
Timing: Projected to take place in the month of October; see the timeline in Section 2.3 for more detail.
Prerequisites: BPC and Runbook build complete, WEC financial data in the ECC QA environment, and completion of a full cycle of integration testing.

Stage: Performance Test
Purpose: Validate/tune system performance by executing expected peak volumes of data and transactions in the Production environment.
Executed by: Technical team with support from the Project Core team
Timing: In parallel to the last integration testing cycle of each release.
Prerequisites: Successful completion of two rounds of integration testing.
2.3 Testing Environments to be used per Release

The table below provides the System IDs (SIDs) and clients for the major SAP applications.

| Application | Sandbox | Development | QA  | Production | Production client |
| ECC (H)     | X10     | D10         | Q10 | P10        | 200 & 210         |
| ECC (O)     | -       | -           | PAY | IMD        | -                 |
| BW          | X30     | D30         | Q30 | P30        | 330               |
| PO          | X50     | D50         | Q50 | P50        | N/A               |
| NWG         | X15     | D15         | Q15 | P15        | 200               |
The timeline below lists the planned test phases, releases, dates and target systems for the July 2017 releases.

| TEST PHASE | RELEASE | START DATE | END DATE | DURATION | SYSTEMS |
| Test Planning | 1.A; 1.B; 1.N; 1.Q | 1/9/17 | 3/3/17 | 8 weeks | N/A |
| Unit Testing | 1.A; 1.B; 1.N; 1.Q | 2/1/17 | 3/31/17 | 4 weeks | SF QA1, Kenexa Test, ECC PAY010, PS QA, PO Q50, AD?, NWG Q15:200, HCI; UIPlan QA, BW Q30, ECC Q10:220 |
| Mock load 1 (confirm duration and # for all mock loads) | 1.A | 3/27/17 | 3/31/17 | 1 week | SF QA1, Kenexa Test, ECC PAY010, PS QA, PO Q50, AD?, NWG Q15:200, HCI |
| Mock load 1 | 1.B | 3/27/17 | 3/31/17 | 1 week | UIPlan QA, BW Q30:330, ECC Q10:220, PO Q50 |
| Mock load 1 | 1.N | TBD | TBD | | TBD (need scope) |
| IT1 | 1.A | 4/3/17 | 4/21/17 | 3 weeks | SF QA2, ECC Q10:210, PO Q50, AD?, NWG Q15:200 |
| IT1 | 1.B | 4/3/17 | 4/21/17 | 3 weeks | TBD |
| Mock load 1 | 1.Q | 4/10/17 | 4/14/17 | 2 weeks | SF QA2, ECC Q10:210, PO Q50, AD?, NWG Q15:200, HCI |
| IT1 | 1.N | TBD | TBD | | TBD |
| IT1 | 1.Q | 4/17/17 | 5/26/17 | 6 weeks | SF QA2, ECC PAY010, PS QA, PO Q50, AD?, NWG Q15:200, HCI |
| Parallel pay 1 | 2.0 | 5/1/17 | 5/26/17 | | UIPlan QA, BW Q30:330, ECC Q10:220, PO Q50 |
| Mock load 2 | 1.A | 4/24/17 | 4/28/17 | 1 week | |
| Mock load 2 | 1.B | 4/24/17 | 4/28/17 | 1 week | |
| Mock load 2 | 1.N | TBD | TBD | | |
| IT2 | 1.A | 5/1/17 | 5/19/17 | 3 weeks | |
| IT2 | 1.B | 5/1/17 | 5/19/17 | 3 weeks | |
| IT2 | 1.N | TBD | TBD | | TBD |
| UAT | 1.N | TBD | TBD | | TBD |
| Performance Test 1 | 1.A | 5/22/17 | 5/26/17 | 1 week | N/A |
| UAT | 1.A | 6/5/17 | 6/16/17 | 2 weeks | SF QA2, ECC PAY010, PS QA, PO Q50, AD?, NWG Q15:200, HCI |
| UAT | 1.B | 6/5/17 | 6/16/17 | 3 weeks | UIPlan QA, BW Q30:330, ECC Q10:220, PO Q50 |
| End User Training 1 | 1.A; 1.B | 6/5/17 | 6/30/17 | 4 weeks | UIPlan QA, BW Q30:330, ECC Q10:220 + D10:350, PO Q50 |
| Dress Rehearsal (QA -> Prod) | 1.B | 6/19/17 | 6/23/17 | 1 week | TBD |
| Parallel pay 2 | 2.0 | 6/5/17 | 6/23/17 | 5 weeks | SF QA2, ECC Q10:220, NWG Q15:200 |
| Mock load 2 | 1.Q | 6/19/17 | 6/23/17 | 2 weeks | SF QA2, ECC Q10:220, NWG Q15:200 |
| IT2 | 1.Q | 6/26/17 | 8/4/17 | 6 weeks | |
| Cutover | 1.A | 7/3/17 | 7/16/17 | 2 weeks | UI Plan Prod, BW P30:330, ECC P10:210, SF P, Kenexa P, PO P50, NWG P15:200, PS P, Leg ECC P, HCI |
| Cutover | 1.B | 7/3/17 | 7/16/17 | 2 weeks | UI Plan Prod, BW P30:330, NWG P15:200 |
| Go-Live | 1.A; 1.B | 7/17 | | | UI Plan Prod, BW P30:330, ECC P10:210, SF P, Kenexa P, PO P50, NWG P15:200, PS P, Leg ECC P, HCI |
| UAT Jul | 1.Q | 8/1/17 | 9/14/17 | 7 weeks | SF QA2, ECC Q10:220, NWG Q15:200 |
| Open Enroll 2018 | 1.Q | 10/16/17 | 11/13/17 | 5 weeks | SF P, ECC P10:200, NWG P15:200, HCI, PO P50 |
Release 2.0 (Jan 2018):

| TEST PHASE | RELEASE | START DATE | END DATE | DURATION | SYSTEMS |
| Test Planning | 2.0 | 1/16/17 | 3/3/17 | 7 weeks | |
| Unit Testing | 2.0 | 2/13/17 | 3/24/17 | 6 weeks | ECC D10:230, BW D30, SF D, Kenexa T, Concur T, WF D, PO D50, NWG D15:200 |
| Mock load 3 | 2.0 | 3/27/17 | 4/7/17 | 2 weeks | ECC Q10:220, SF QA2, WF QA, Concur, Ariba test, Taulia QA, PO Q50, NWG Q15:200 |
| IT3 | 2.0 | 4/10/17 | 6/2/17 | 8 weeks | ECC Q10:220, SF QA2, WF QA, Concur, Ariba test, Taulia QA, PO Q50, NWG Q15:200 |
| Mock load 4 | 2.0 | 5/29/17 | 6/2/17 | 1 week | ECC Q10:220, SF QA2, WF QA, Concur, Ariba test, Taulia QA, PO Q50, NWG Q15:200, BW Q30:330, BPC Q |
| IT4 | 2.0 | 6/5/17 | 8/4/17 | 9 weeks | ECC Q10:220, SF QA2, WF QA, Concur, Ariba test, Taulia QA, PO Q50, NWG Q15:200, BW Q30:330, BPC Q |
| UAT-Jan 1 | 2.0 | 7/17/17 | 8/4/17 | 3 weeks | ECC Q10:220, SF QA2, WF QA, Concur, Ariba test, Taulia QA, PO Q50, NWG Q15:200, BW Q30:330, BPC Q |
| Budget comp 2018 | 2.0 | 7/17/17 | 9/15/17 | 9 weeks | UI Plan Prod, BW P30:330, ECC P10:210, SF P, Kenexa P, PO P50, NWG P15:200, PS P, Leg ECC P, HCI |
| Mock load 5 | 2.0 | 7/31/17 | 8/4/17 | 1 week | ECC Q10:220, SF QA2, WF QA, Concur, Ariba test, Taulia QA, PO Q50, NWG Q15:200, BW Q30:330, BPC Q |
| Parallel pay 3 | 2.0 | 7/31/17 | 8/18/17 | 3 weeks | |
| Performance Test 2 | 2.0 | 8/1/17 | 9/2/17 | | DR environments TBD |
| IT5 | 2.0 | 8/7/17 | 9/29/17 | 8 weeks | ECC Q10:220, SF QA2, WF QA, Concur, Ariba test, Taulia QA, PO Q50, NWG Q15:200, BW Q30:330, BPC Q |
| Parallel pay 4 | 2.0 | 9/4/17 | 9/29/17 | 4 weeks | ECC Q10:220, SF QA2, WF QA, Concur, Ariba test, Taulia QA, PO Q50, NWG Q15:200, BW Q30:330, BPC Q |
| Dress Rehearsal 1 | 2.0 | 9/5/17 | 9/30/17 | 4 weeks | ECC Q10:220, SF QA2, WF QA, Concur, Ariba test, Taulia QA, PO Q50, NWG Q15:200, BW Q30:330, BPC Q |
| UAT-Jan 2 | 2.0 | 9/11/17 | 9/29/17 | 3 weeks | |
| Month-end close | 2.0 | 10/2/17 | 11/3/17 | 5 weeks | |
| Budget entry 2018 | 2.0 | 10/15/17 | 12/20/17 | 10 weeks | UI Plan Prod, BW P30:330, ECC P10:210, SF P, Kenexa P, PO P50, NWG P15:200, PS P, Leg ECC P, HCI |
| Dual Entry Parallel Pay Pilot | 2.0 | 11/5/17 | 11/18/17 | 2 weeks | |
| Dual Entry Parallel Pay Round 1 | 2.0 | 11/19/17 | 12/2/17 | | |
| Dual Entry Parallel Pay Round 2 | 2.0 | 12/3/17 | 12/16/17 | 2 weeks | |
| End User Training 2 | 2.0 | 11/7/17 | 12/16/17 | | |
| Dress Rehearsal | 2.0 | 12/5/17 | 12/30/17 | 2 weeks | |
| Go-Live | 2.0 | 1/8/18 | | | |
2 weeks
2.4 Testing Roles and Responsibilities
The following table provides an overview of the activities across the different test stages throughout the project lifecycle and the corresponding responsible party.

| Activity | Unit test | Integration test | UAT | Performance test |
| Test Case Preparation | Functional SMEs | Functional SMEs | Functional SMEs | Technical Team |
| Test Case Execution | Functional SMEs | Functional SMEs and select Power Users | Power Users and select site representatives | Technical Team |
| Test Case Issue Resolution | Process, Tech, Data or Reporting team, depending on the issue (all test stages) | | | |
| Test Monitoring and Coordination | Test Lead | Test Lead | Test Lead | Tech Lead |
| Sign-off | WEC Leadership | Business Process Sponsors | Business Process Sponsors | |
2.5 Security
Any security testing which has been outsourced to contractors (PwC, RCC, etc.) will need to go through an approval process.

Unit testing:
- D10, client 230: security requires broad access to provide to the teams for functional unit testing
- D10, client 240: an isolated client is required for HR data, to protect sensitive information (open item)
- During unit testing, employees will be using the IDs that they currently use to log in to Impact
- Unit testing for security will be in March, and will cover positive tests only

Integration testing:
Typically, broad access is given at the beginning of each cycle of integration testing for a release; however, WEC will use specific roles assigned to users for the first round of IT for the 1.A and 2.0 releases (IT1 and IT3, respectively).

Security testing will be performed as part of integration testing, with both positive and negative testing performed against role-specific security profiles. Negative unit testing will be performed as part of integration testing (IT1 for release 1.B, IT2 for release 1.A, and IT4 & IT5 for release 2.0).
- Positive testing will be performed to ensure that transactions associated with a general role can be accessed by the user at the initial screen level.
- Negative testing will ensure that profiles do not have access to fields that should be hidden, and that data values must fit standard format/boundary requirements.
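A minimal sketch of what these positive and negative role checks look like when automated (the role names, transaction codes and access model below are hypothetical illustrations; actual checks would run against the SAP authorization objects and profiles themselves):

```python
# Illustrative sketch only: a simplified role-to-transaction access model.
# Role and t-code names are hypothetical, not actual WEC security roles.

ROLE_TCODES = {
    "HR_TIME_ENTRY": {"CAT2", "PA20"},   # transactions granted to the role
    "FI_DISPLAY":    {"FB03", "FS10N"},
}

def can_execute(role: str, tcode: str) -> bool:
    """Return True if the role grants access to the transaction code."""
    return tcode in ROLE_TCODES.get(role, set())

# Positive test: a transaction associated with the role can be accessed.
assert can_execute("HR_TIME_ENTRY", "CAT2")

# Negative test: the role must NOT reach transactions outside its profile.
assert not can_execute("HR_TIME_ENTRY", "FB03")
```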
TEST PHASE
Unit Testing
1. A
All User
Access
IT1
IT2
1.B
Release 1
1. N
Release 2
1. Q
All User Access*
2.0
2.P
Role Base
All User Access
Access**
Role Base Access
All User
Access
Role Base Access
Role Base Access
Role Base
Access
IT3
All User Access
IT4
IT5
Parallel Pay
UAT
Role Base Access
Role Base Access
* All User Access: Testers will be given a standard team authorization
** Role Based Access: Production equivalent roles will be used for users for each business scenario step
Note:
1. BW security should have the same control points as ECC (from a view standpoint on restrictions).
2. All releases will test single sign-on (SSO) and should have both positive and negative tests to ensure a user can do what's assigned and cannot execute tasks that are not assigned to their role(s).
3. Test Phase details
3.1 Unit testing
The objective of Unit Testing is to ensure that all unit-level components of each larger business process are correctly configured and adequately debugged. The term is used to reflect that development programs and configuration objects need to be tested as individual components (units) and in combination, to show that integration points match up and all the components work together (for example, valid WBS numbers are interfaced to Workforce Software in near real time for use in time charging). Functional testing involves testing business processes within SAP and third-party applications using standard, configured transactions.
Unit testing is the foundation for the larger, more complex Integration Testing. The general objective of Unit Testing is to uncover design or technical flaws at a unit level. The specific objective is to demonstrate that each transaction or development object can be created, configured, and function as expected within the individual program itself. Each RICEF development object is required to have a completed Unit Test set. The application developers and Functional Specification Owners are responsible for ensuring that development work is properly debugged and tested at the unit level.
Unit Testing documentation and execution is the responsibility of the Tower Core Teams and the Application
Developers.
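A unit-level check of the kind described above, for example validating a WBS number before it is interfaced to the time-entry system, can be sketched as a small automated test (the format rule below is a hypothetical placeholder, not the actual MDC WBS format):

```python
import re

def is_valid_wbs(wbs: str) -> bool:
    """Validate a WBS number before it is interfaced to the time-entry
    system. The pattern is an illustrative placeholder, not the actual
    MDC WBS numbering format."""
    return re.fullmatch(r"[A-Z]-\d{4}-\d{2}", wbs) is not None

# Unit-level positive case: a well-formed WBS number is accepted.
assert is_valid_wbs("C-1234-01")

# Unit-level negative cases: malformed numbers are rejected before they
# ever reach the downstream time-charging interface.
assert not is_valid_wbs("1234")        # missing segments
assert not is_valid_wbs("C-12-0001")   # wrong segment widths
```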
3.1.1 Technical Unit Testing
Technical object testing will test all RICEF interfaces, conversions and enhancements at an individual level to
ensure that all design requirements are met and no technical flaws exist. In addition to testing RICEF objects, the
Tech team will be responsible for any break-fix required as a result of functional and/or technical unit testing.
3.1.2 Functional Unit Testing
Functional unit testing will test the SAP and third party application-enabled business processes to ensure that
standard transactions can be completed as designed and configured. A functional test team will be responsible for
testing and documenting both successes and failures to either pass testing toll-gates or for escalation.
3.1.3 Security Unit Testing
In order for unit testing to be considered complete and before integration testing can begin, a round of security-specific unit testing must be completed to test security roles. A combination of functional and offshore resources
will test the roles but the security team will be needed to build the roles based on design sign-off. Both positive and
negative testing will take place.
- 1,200 t-codes, with 200 roles, expecting 10 minutes per t-code (breakdown to come from Val)
- Negative tests will not be conducted until integration testing
- Single sign-on (SSO)
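As a quick sizing check, the figures above imply roughly 200 person-hours of security unit testing; the sketch below just makes the arithmetic explicit (the 10-minutes-per-t-code rate is the planning assumption quoted in the bullet, not a measured value):

```python
# Rough effort sizing for security unit testing, using the planning
# figures quoted above (1,200 t-codes at an assumed ~10 minutes each).
T_CODES = 1_200
MINUTES_PER_TCODE = 10

total_minutes = T_CODES * MINUTES_PER_TCODE   # 12,000 minutes
total_hours = total_minutes / 60              # 200 person-hours
tester_days = total_hours / 8                 # 25 eight-hour tester-days

print(total_hours, tester_days)  # 200.0 25.0
```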
3.1.4 Source/Reference documentation
- Configuration Document (SolMan)
- Functional Specifications (SolMan)
- Technical Specifications (SolMan)
- Requirements Traceability Matrix (SolMan)
- Unit test case repository and testing tool (SolMan)
3.1.5 Security roles for testing
- D10, client 230: security requires broad access to provide to the teams for functional unit testing
- D10, client 240: an isolated client is required for HR data, to protect sensitive information (open item)
- During unit testing, employees will be using the IDs that they currently use to log in to Impact
3.1.6 Deliverables & tools
- Completed unit test cases with results
- Unit test documentation, loaded into SolMan
3.1.7 Data preparation
- Test data in the Development environment will be prepared manually by the corresponding tower teams, or created using load programs for master and transactional data
3.1.8 Entrance Criteria
- Configuration, development, or security role creation of the object is complete
3.2 Integration testing
The objective of this testing phase is to verify that the delivered system including all custom configuration,
enhancements, data conversions, interfaces, reports, forms, workflow, user procedures, compliance, and security
execute correctly after being merged together in a single stage test environment.
Unlike unit testing, the focus of Integration Testing will be broader, on the combination or chaining of multiple units into full business processes. For example, a unit test will challenge a RICEF object or a single step of a Process Scenario involving multiple Business Process Procedures (for instance, whether time can be entered for PM orders by the Oak Creek Plant operators), while an integration test will verify end to end that those hours are successfully transferred to the PM order and costed by the SAP FICO module for labor distribution purposes.
Integration testing will focus on ensuring that all SAP systems, along with integrated third-party packages and interfaces, meet the business requirements of the end-to-end business processes. As there are two rollouts (July 2017 and January 2018), there will be five rounds of integration testing, the timing of which is explained in further detail in Section 2.3. IT1 and IT2 will be focused on the July-17 1.A and 1.B releases, and IT3, IT4 and IT5 will be for the January 2018 2.0 release.
As part of IT2 and IT5, user acceptance testing (UAT) will be included as well. See Section 3.3 for more information on UAT.
WEC business users will thoroughly test the high-risk areas of the solution. Examples of these tests may include
process, interfaces and security testing. Areas of low risk are likely to have been exercised by a sub-set of tests in
the Integration Testing Phase and can be lightly tested in the User Acceptance Testing.
Prior to and during Integration Testing, the Tower Teams will be evaluating the completeness, accuracy, and
validity of compliance sensitive test strings, test data, and test results, as well as the soundness and reliability of
the testing process itself.
Interfaces and any enabling technologies will be executed as part of the Integration testing. The high and critical
Interfaces, Enhancements and all conversions will be part of the first integration testing cycle and executed by a
combination of PwC and WEC resources. In the second and third integration testing cycles all RICEFW will be
tested.
Error testing will be performed as part of Integration Testing. Testing will be conducted to verify that specific fields,
identified as critical per the FSD and/or BPDs, have values within a range of acceptable values, and will correctly
flag for error data outside the acceptable values. All high-risk areas will need to be "negative" tested. The Test Coordinators, Business Leads or Tower Leads may also identify additional areas for negative testing at their discretion.
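The kind of field-range check that this error testing exercises can be sketched as follows (the field, limits and flag names are hypothetical examples, not values taken from the FSD/BPD documents):

```python
# Illustrative boundary check for a critical field, of the kind negative
# testing exercises. The field name and limits are hypothetical examples.

def validate_hours_charged(hours: float) -> list[str]:
    """Return a list of error flags for an out-of-range time entry."""
    errors = []
    if hours < 0:
        errors.append("NEGATIVE_HOURS")
    if hours > 24:
        errors.append("EXCEEDS_DAILY_MAX")
    return errors

# In-range value passes without flags...
assert validate_hours_charged(8.0) == []
# ...while out-of-range values are correctly flagged as errors.
assert validate_hours_charged(-1.0) == ["NEGATIVE_HOURS"]
assert validate_hours_charged(30.0) == ["EXCEEDS_DAILY_MAX"]
```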
Integration testing will have multiple test cycles, which will be identified in the detailed test plan. It will be an end-to-end business process test, including data conversion, inbound and outbound interfaces, transaction processing, batch processing, and reporting. The goal in this phase is to emulate the production environment as closely as
possible. During this phase of testing, it is expected that configuration gaps will be identified. Program logic may
need to be modified, batch jobs may need to be tuned, and authorization and profile changes will need to be
updated. All the incidents that arise as a result of exercising the system and the resultant enhancements and
special fixes will be documented.
Resolutions will be closely monitored by the Functional SMEs to ensure that all Issues have been properly
resolved. As determined by the Test Lead and Functional SMEs, an appropriate selection of unit test strings that
are affected by a modification will be re-tested after the correction is transported back into the stage environment.
Based upon the test results, a go/no-go decision will be made whether to proceed to UAT or to carry out additional
Integration testing.
3.2.1 Source/Reference documentation
1. Configuration Document (SolMan)
2. Functional Specification (SolMan)
3. Technical Specification (SolMan)
4. Requirements Traceability Matrix (SolMan)
5. Integration case repository and testing tool (HP QC)
3.2.2 Deliverables & tools
1. Completed integration test cases with results
2. HP QC (for test documentation)
3. Integration testing sign-off
3.2.3 Data preparation
1. Test data in the QA environment will be loaded by the tech team based on the requirements defined by the tower teams. Required data may also be prepared and loaded manually by the process teams.
3.2.4 Entrance Criteria
- Complete Build & Unit Test of Configuration
  - 100% of final configuration complete and documented
  - Successful unit test completed, including pass/fail and logging of associated defects
- Build & Unit Test of RICEFW
  - 80% of critical, high and medium Interfaces, Conversions and Enhancements coded and unit tested
  - 50% of critical, high and medium Forms, Workflow and UI coded and unit tested
- Complete Test Preparation
  - 100% of test scenarios and conditions defined (defined scope of testing)
  - 100% of test scripts for IT1 completed, and 75% for SIT2 and SIT3
  - 100% of test scripts for IT1 assigned to resources and loaded in the testing tool
  - At least 90% of cycle/string tests passed
- Other Build Activities
  - 75% of single and composite security roles built
  - Critical and high-priority data cleansing complete, and 70% of all data cleansing and collection complete
  - Plan developed for manually executed data migration objects

3.3 User Acceptance Testing
User acceptance testing will align with IT2 and IT5 for the 1.A and 1.B releases in July 2017 and for the 2.0 release in January 2018, respectively. See the timeline in Section 2.3 for more information on dates.
The objective of UAT is three-fold:
1. To provide selected end users with the opportunity to manually navigate through the system to get a "look and
feel" validation and simulate a "day in the life" of the future state design.
2. To exercise sufficient portions of the system to demonstrate that the delivered system matches all designated
design criteria (i.e. is validated) and that there are no gaps in the business process from the perspective of the
business users.
3. To gain feedback from a select group of business users identified as champions of the project for the rest of the
business, and to obtain sign-off from the business.
3.3.1 Source/Reference documentation
1. Configuration Document (SolMan)
2. User Acceptance test case repository and testing tool (HP QC)
3.3.2 Entrance Criteria
1. Stable solution having completed a full cycle of integration testing
2. Users have been trained and are available for testing
3.4 Parallel Payroll Testing
The objective of parallel payroll testing is to validate that the full-cycle payroll run, 1) based on new time entry
information fed to legacy ECC payroll for Release 1.N and 2) run on the SAP HANA payroll engine, is consistent
with the payroll run in the legacy environments, and that any differences are known and accounted for from a
reconciliation perspective. This involves selecting entire payroll groups (for example, the 420 union or WEC
management), executing payroll end to end for a given pay period, and verifying every single payroll posting and
wage type determination at the lowest level.
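The reconciliation step described above can be sketched as a simple comparison of wage type results keyed by employee. This is an illustrative sketch only: the record layout, the wage type codes and the zero tolerance below are assumptions, not the program's actual payroll extract format.

```python
# Hypothetical parallel-run reconciliation: compare wage type amounts per
# employee between a legacy payroll run and the new HANA run.
from decimal import Decimal

def reconcile(legacy, new, tolerance=Decimal("0.00")):
    """Return (employee, wage_type, legacy_amt, new_amt) tuples that differ.

    `legacy` and `new` map (employee_id, wage_type) -> Decimal amount.
    """
    diffs = []
    for key in sorted(set(legacy) | set(new)):
        old_amt = legacy.get(key, Decimal("0"))
        new_amt = new.get(key, Decimal("0"))
        if abs(old_amt - new_amt) > tolerance:
            diffs.append((key[0], key[1], old_amt, new_amt))
    return diffs

# Made-up sample data: wage type /401 differs for employee E001.
legacy_run = {("E001", "/101"): Decimal("2500.00"), ("E001", "/401"): Decimal("310.50")}
new_run = {("E001", "/101"): Decimal("2500.00"), ("E001", "/401"): Decimal("312.00")}
print(reconcile(legacy_run, new_run))
```

Comparing at the (employee, wage type) level mirrors the "lowest level" verification the section calls for: a zero tolerance flags every difference so that each one must be explicitly explained or accepted.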
3.4.1 Source/Reference documentation
1. Payroll run results in the legacy environment to compare the payroll results against.
2. Configuration Document (SolMan).
3. Functional & Technical Specification (SolMan).
4. Requirements Traceability Matrix (SolMan).
3.4.2 Deliverables & tools
1. Completed parallel payroll test cases with results.
2. Completed reconciliation test results for comparing the legacy run against the new rules.
3. HP QC (for test documentation).
3.4.3 Entrance Criteria
1. Completion of all payroll
2. Reports completed on gross pay and time to enable comparison against results from SAP HANA
3. Payroll enhancements and configurations completed
3.5 Performance testing
The objectives of Performance testing include:
• Testing the system to ensure that the applications and databases will support the expected transaction
volumes and batch processes.
• Testing the critical interfaces between SAP On-Prem, WorkForce, PowerPlan, Ariba, Taulia, Concur, Runbook,
WEC and TEG WMIS applications using the ‘worst case’ average daily and monthly inbound/outbound
volume of data.
• Testing the WAN and network hardware to validate that the total expected activity and traffic, both SAP and
non-SAP, will be adequately supported.
• Stressing the system from end to end and measuring and analyzing the response times of key business
processes.
• Targeting areas of the program that pose performance bottlenecks.
• Demonstrating that the delivered system meets specified performance expectations.
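The response-time measurement described above can be illustrated with a minimal timing harness that runs a transaction repeatedly and reports average, 95th-percentile and maximum times. This is a generic sketch, not the program's actual performance-testing tooling, and the sample "transaction" is a stand-in.

```python
# Illustrative timing harness: measure repeated executions of a transaction
# and summarize response times (average, 95th percentile, maximum).
import statistics
import time

def measure(transaction, iterations=100):
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        transaction()
        samples.append(time.perf_counter() - start)
    samples.sort()
    # Nearest-rank style 95th percentile over the sorted samples.
    p95 = samples[int(0.95 * (len(samples) - 1))]
    return {"avg": statistics.mean(samples), "p95": p95, "max": samples[-1]}

# Stand-in transaction: a short sleep simulating a request/response round trip.
stats = measure(lambda: time.sleep(0.001), iterations=20)
print(f"avg={stats['avg']:.4f}s p95={stats['p95']:.4f}s max={stats['max']:.4f}s")
```

Reporting a high percentile alongside the average matters for the bottleneck hunting described above: averages hide the occasional slow transaction that users actually notice.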
Performance testing will target areas of ERP and relevant third-party application systems with the following
characteristics:
• Areas with extensive customization
• Areas where a high volume of transactions is expected
• Areas vital to the business, from a performance perspective
• Custom development or user exits
• Enhancements and BI reporting
• Areas that experienced high levels of Issues in previous testing phases
3.6 Month-end close testing
For Release 2.0, the production environment for the January release will be stood up and running by the
beginning of November 2017. The objective of month-end close testing is to use the “clean conversion data” from
legacy systems to run:
a) end to end month-end close simulation;
b) another set of parallel payroll runs; and
c) final simulation including A/P invoice management process, PP&E and tax integration in full ‘production
simulation’ mode as part of final cutover activity.
During this testing, select business users would represent the set of testers who would be validating the critical
processes prior to final cutover and go-live.
3.6.1 Source/Reference documentation
1. Legacy environment payroll run results to compare against the ECC HANA payroll results
2. Month-end close calendar and Autosys jobs
3. Configuration Document (SolMan).
4. Functional Specifications (SolMan).
5. Technical Specification (SolMan).
6. Requirements Traceability Matrix (SolMan).
7. HP QC (for test documentation).
3.6.2 Deliverables & tools
1. Completed final simulation test cases with results.
2. Completed reconciliation test results: for payroll, comparing the legacy run against the new rules; for month-end
close, the ability to run the jobs without errors or performance issues.
4 Test Governance
4.1 Test Planning & Management
Unit test planning will be managed through a SharePoint database and conducted manually using HP QC-compatible
Excel templates through Solution Manager. Integration test planning and execution will be conducted
using HP QC software. The unit test cases will be traced to the business processes, requirements and RICEFW
objects. This will feed the Requirements Traceability Matrix that will be housed in Solution Manager. The WEC and
PWC Test Leads will be responsible for the overall management of these test cycles.
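The traceability idea above can be sketched as a mapping from test cases to the requirements they cover, from which untested requirements fall out directly. The identifiers below are hypothetical; the real matrix lives in Solution Manager.

```python
# Minimal traceability sketch: each unit test case lists the requirements it
# covers, and the matrix exposes any requirement with no covering test.
requirements = {"REQ-001", "REQ-002", "REQ-003"}
test_cases = {
    "UT-PAY-01": {"REQ-001"},
    "UT-FIN-02": {"REQ-001", "REQ-003"},
}

# Union of everything any test case covers.
covered = set().union(*test_cases.values())
untested = requirements - covered
print("Untested requirements:", sorted(untested))  # → Untested requirements: ['REQ-002']
```

The same set arithmetic works in reverse (which tests cover a given requirement), which is the query a Test Lead needs when a requirement changes and the affected test strings must be re-run.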
4.2 Scope Control
A technical team member tracks all RICEFW requests and circulates them to the PMO for approval through its standard
scope control governance process. Any changes to test scripts will be handled through the testing team’s standard
governance process, with controls in place to ensure that a test script is not changed without prior documentation.
4.3 Defect Tracking
There will be two defect tracking processes: one for unit testing, and another for integration testing and the
subsequent cycles of testing.
Unit Testing
Once users begin the testing process, they will encounter instances where the test scripts do not produce their
intended result. In these cases, testers will log defects via a SharePoint Defect Log so that they can be monitored
and tracked.
Integration, Payroll, Month-end, User Acceptance Testing
Functional and technical users will log defects in HP QC once the system is stood up in March.
4.4 Issue remediation
During the execution of testing stages (Unit Test, Data Conversions, Integration/Regression, Stress/Volume and
User Acceptance Test), it is expected that problems will be identified. These problems can include:
• Issues or defects with system functionality (ex. configuration)
• Issues or defects with RICEFW functionality (ex. interface functionality)
• Issues or defects with technical integration (ex. RICEFW programs, batch or web service programs not
correctly integrated with the QA environment)
• Issues or defects with security user profiles or access
• Issues or defects with converted master data, transactional data or any data-related issue that will require
data creation, cleanup or corrections
• Issues or defects with documentation (ex. test sets with incorrect or missing steps, etc.)
Issues discovered during testing may require adjustments to configuration, data, or the test scenarios. Issues that
are revealed during testing will be recorded in the SharePoint Defect Log or HP QC with appropriate due
dates and team assignments. Issues will be administered and tracked to resolution by the Defect Manager. All
break/fixes must first be tested in both the Development and QA environments before being transported to the PRD
environment. Here is a high-level process flow of how the issues/defects will be handled:
The different severity levels are:
• Critical – Showstopper, no workaround
• High – Cumbersome or very difficult workaround
• Medium – Workaround exists but needs to be fixed
• Low – Nice to have
The different status levels of a defect are:
• Open – Defect is initially identified and assigned
• In Progress – Defect is currently being worked on by a developer
• Fixed – Defect has been fixed by a developer
• Testing – Tester is re-testing the defect
• Rework – Defect needs additional fix(es) after testing
• Closed – Defect is fixed and passed testing
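The status lifecycle above can be modeled as a simple state machine. The allowed transitions below are inferred from the status descriptions, not taken from the actual HP QC workflow configuration, so treat them as an assumption.

```python
# Sketch of the defect lifecycle as a state machine. Transitions are inferred
# from the status definitions: Testing can either close the defect or send it
# back to Rework, and Rework returns it to a developer (In Progress).
TRANSITIONS = {
    "Open": {"In Progress"},
    "In Progress": {"Fixed"},
    "Fixed": {"Testing"},
    "Testing": {"Closed", "Rework"},
    "Rework": {"In Progress"},
    "Closed": set(),  # terminal state
}

def advance(status, new_status):
    if new_status not in TRANSITIONS[status]:
        raise ValueError(f"Illegal transition: {status} -> {new_status}")
    return new_status

# A defect that fails re-testing once before finally closing.
s = "Open"
for step in ["In Progress", "Fixed", "Testing", "Rework",
             "In Progress", "Fixed", "Testing", "Closed"]:
    s = advance(s, step)
print(s)  # → Closed
```

Encoding the workflow this way makes illegal jumps (e.g. Open straight to Closed) fail loudly, which is exactly what a defect-tracking tool enforces.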
The exit criteria for moving into the next cycle are:
• Critical and High severity defects must be resolved to exit the test cycle.
• Medium severity defects need to be evaluated to determine the appropriate action.
• Low severity defects do not need to be resolved to exit the test cycle.
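The exit criteria above reduce to a simple check on open-defect counts by severity: Critical and High block the exit, Medium are flagged for evaluation, Low are ignored. The counts in the example are made up for illustration.

```python
# Illustrative exit-criteria check based on the severity rules above.
def can_exit_cycle(open_defects):
    """open_defects maps severity -> count of unresolved defects.

    Returns (exit_allowed, medium_count_needing_review).
    """
    blocking = open_defects.get("Critical", 0) + open_defects.get("High", 0)
    needs_review = open_defects.get("Medium", 0)
    return blocking == 0, needs_review

# Example: no blockers, three Medium defects to evaluate, Low ignored.
ok, to_review = can_exit_cycle({"Critical": 0, "High": 0, "Medium": 3, "Low": 7})
print(ok, to_review)  # → True 3
```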
4.5 Monitoring
Unit Testing statistics (tests planned/executed/failed, top issues) are reviewed on a weekly basis as part of the
regular team lead meetings, using the build plan scorecards to track progress throughout the whole
project lifecycle. Issues are recorded in HP QC as defects. Testing will be continuously monitored by the test leads
and team coordinators to ensure planned activities are on track and testers know what to do each week.
Integration Testing, Parallel payroll and Performance Testing monitoring will be executed on a daily basis with a
dedicated core team to ensure rapid resolution and escalation.
4.6 Stage gate
Integration Testing will not commence until unit testing of integration-relevant components and security
testing are successfully completed. The Solution Architect and Test Lead will define the detailed criteria for
identifying the integration-relevant components as part of the Integration Test Planning activities defined in the
project workplan.
Successful completion of UAT, Parallel Payroll and Performance Testing are prerequisites for Final Go-Live
preparation. We cannot proceed with Final Preparation activities (Trial Cutover, Cutover/Migration, etc.) until
Final Parallel go-live testing is successfully complete.
4.7 Assumptions
1. New test domains will be used for integration testing because of the new Active Directory and new
Intranet projects
2. Training rooms must be able to remote-access the new domains
3. Payroll will be run in legacy ECC as a part of cutover
4. Month-end close will use the same environments as Release 2.0 integration testing
5. Dual-entry parallel pay testing will use the same environments as Release 2.0 integration testing
6. Planning will be done ahead of time to identify user IDs to include in the first round of the Active
Directory mass migration so their new domains can be used to complete UAT for Releases 1.A
and 1.B, IT2 for Release 1.Q, IT2 for Release 1.N and IT4 for Release 2.0