Case Study: End-to-End Testing of a Complex ERP Footprint for a Logistics Service Provider in the US
By Sathish Rajamani and Jayanta Kumar Rudra
8th Dec 2011
©2011, Cognizant
Agenda
1. Client Overview & Voice of Customer
2. Key Challenges & Mitigation
3. Cognizant Solution & Key Metrics
4. Benefits & Value Adds
Client Overview

• Leading logistics service provider in the US, expanding operations across Canada, Mexico, and China
• A premium service provider catering to more than two-thirds of the FORTUNE 500® companies
• Operates 14,000 tractors, 15,500 drivers, and 48,000 trailers, with 21,500 associates in 28 countries
• The client had embarked on a major IT transformation project, moving from legacy systems to ERP
• They wanted to improve a QA process that was lacking in the following areas:
  – No dedicated QA setup
  – Insufficient performance benchmarks & SLA metrics
  – Testing in silos by business users
  – No automation of existing regression test cases
End-to-End Multi-Platform Testing Path

[Diagram: order flow across platforms, integrated over SOA (BPEL/ESB):
– Vendor-managed/legacy applications: M5, CRA, Qualcomm, Maptuit, TMG
– Transportation Management (OTM): creating customer orders (EDI, Portal/WebCenter, Email); releasing orders to OTM (order status "Released"); pricing decision; shipment assignment; assigning driver/SP to shipments; drop & swap; completion of special services; order execution
– CRM (Siebel): demand generation, order management, claims, billing, network management, rate server, pricing, geography; quote, order, and financial validations; agreement-to-quote order; pricing decision support; order updates flowing from OTM to Siebel
– Back-office support: HR, OLM, OIC, GL, AR, AP, RAIR, Revenue & Network Management]
Our Solution through MTC

[Diagram: end-to-end test solution delivered through a best-in-class QA framework:
– Program Management Office: program manager, test manager; delivery management & reporting; scheduling & release management; benchmark & metrics publication; process standardization & compliance
– Test Management Office: functional test leads, automation leads, SMEs across LOBs
– Core team: functional test analysts, automation test engineers, nonfunctional test engineers; Flex team: metric champions, SOA test specialists
– Delivery Office: quality engineering, knowledge management, enterprise tool alliances, solution accelerators, test co-ordination with other vendors, infrastructure & environment, tool development & innovation, framework services, consulting on metrics management
– Service towers (functional): smoke testing, functional testing, regression testing, interface testing, EDI testing, data validation testing, test data management, UAT support
– Service towers (nonfunctional): performance testing, security testing, automation testing, simulation with mobile testing, disaster recovery testing
– Service catalogue: Siebel CRM, Analytics, Oracle Fusion; Oracle EBS Finance (AP, AR, GL, HRMS, OLM); Oracle Transportation Management; legacy & vendor-managed applications
– Infrastructure Management Team supporting the Test Center]
Key Challenges & Mitigation

Challenges
• Application complexity
• End-to-end testing
• Non-availability of skilled resources
• Intermediate version upgrades
• System instability
• Performance issues
• Lack of a well-defined QA strategy and build plan

Mitigation
• Integrated test approach
• Cognizant BA–PA–QA model
• Delivery management with a strong TMO
• Core–Flex team composition
• Domain & product knowledge
• Working closely with the Dev and Production Support teams
• Leveraging Cognizant's partnership with Oracle
• Daily patch testing and comprehensive regression testing
• Cognizant's BIC (Best-In-Class) metrics
• QA strategy driven by an efficient PMO
• Involvement of the Test Consulting Group & DAG (Delivery Assurance Group)
• Managed test service
Achievements and Key Metrics
MTC Achievements

Pre-TCE State (till Nov'09)
• No dedicated QA setup
• Performance issues due to siloed testing by business users

TCE Setup & Steady State (Dec'09)
• TCE initiation
• POC for automation & business test cases
• System test strategy and testing for R7 for Siebel & EBS
• TCE strength: 8 (onsite 6 / offshore 2)

TCE in Managed Service Model (Jun'10)
• System testing: four cycles of R7 system testing
• Business testing: created test scenarios, steps & clicks; introduced Test Manager as a test repository tool and migrated all Excel-based test cases to it
• Automation testing: automated data creation from System Cycle 3 onward
• Performance testing: set up the performance testing process and established planning and estimation models and performance execution standards for the client

Matured QA Organization (Jan'11 – Jun'11), awaiting the new release
• Supported business test execution; created an Innovation Council
• Regression testing for Release 7; fix pack & sustainment release for 7.1
• Adopted a test-matrix approach for FIT, saving significant test preparation time
• Smoke testing for the UAT environment, EDI testing, and accessorial testing added to scope
• 200 performance issues identified

Delivering Enhanced Values (Jan'12)
• System testing: extensive test case repository (SOARS) built across modules; peak team size: 70
• Business testing: prepared test data sheets; kicked off AIT business test execution; test scheduling and management support; end-to-end multi-platform testing; better onsite–offshore ratio (25:75)
• Automation testing: automated the end-to-end flow from order creation through execution, providing 60–70% automation coverage of business test scenarios; automated data conversion & transition
• Performance testing: around 1,000 performance tests conducted in the last 12 months

Tools
• Client-supplied: SOAP UI, OpenScript, Silk Performer, Clarity, Test Manager
• In-house accelerators: SOARS, TCP, CRAFT, ADPART, C2, ROI, Script Accelerator, Test Scenario Accelerator, SiFA, Script Sanitizer
Performance & Automation Testing Highlights

Performance Testing
• Component testing
• Web service testing (through SOAP UI)
• Load testing
• Interface testing
• Stress testing
• Endurance testing
• Bandwidth testing
• Network device testing
• Program performance testing
• Executed 1x, 2x, and 3x load testing

Accomplishments
• Better capacity planning and informed decisions on hardware & software requirements
• Significant cost reduction due to early-life performance testing
• Zero post-production defects

Automation Testing
• Generated 1,650 Siebel orders using an automated script in the system test environment
• Automated 250 business components and 400 scenarios
• Automation of the initial prototype E2E scenario complete
• Built a release-based automation framework for testing
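The 1x, 2x, and 3x testing above means running the system at one, two, and three times the expected baseline load and comparing throughput. A minimal sketch of that stepping pattern is below; it is an illustrative assumption, not the engagement's actual Silk Performer setup, and `process_order` is a hypothetical stand-in for a real transaction such as creating a Siebel order:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def process_order(order_id):
    """Hypothetical stand-in for a real transaction under test."""
    time.sleep(0.001)  # simulated service time
    return order_id

def run_load_step(multiplier, baseline_users=4, requests_per_user=5):
    """Run one load step (1x, 2x, or 3x the baseline concurrent users)
    and return observed throughput in requests per second."""
    users = baseline_users * multiplier
    total = users * requests_per_user
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=users) as pool:
        results = list(pool.map(process_order, range(total)))
    elapsed = time.perf_counter() - start
    return len(results) / elapsed

# Step the load at 1x, 2x, and 3x and report throughput at each step
for m in (1, 2, 3):
    print(f"{m}x load: {run_load_step(m):.0f} req/s")
```

A real harness would hit the system under test over the network and also record response-time percentiles per step; the shape of the test (fixed baseline, stepped multiplier) is the point here.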
[Diagram: QA engagement evolution from a resource-based model (Jun 2009) through requirement-based (Dec 2009) and service-based (Jun 2010) to an integrated delivery model (Jun 2011), moving from testers aligned per PM/Dev team and LOB to dedicated testing teams for Channels, Products, Support, and NFT/Automation under a strategy, operations, and assurance umbrella, spanning LOBs such as Brokerage, Financial Services, Logistics, Truckload, and Intermodal. Metrics tracked across the journey include test effectiveness, estimation accuracy, schedule accuracy, productivity improvement, review efficiency, and test design/execution gains; reported figures include offshore leverage improving from 18:82 to 22:78 to 35:65, test effectiveness up to 95% (>95% for some LOBs), schedule accuracy of 95% to >95%, and productivity improvement of up to 15%.]
Metrics Dashboard

Key Metrics
• Test coverage: 100%
• Automation coverage: 76%
• Performance testing (all critical business transactions): 100%
• Schedule slippage: 0%
• Effort slippage: 0%
• Avg. productivity in test execution: 9%

Test Metrics Definition by Test Stage
• Planning: application health check; benchmark setting
• Design: test design coverage; test design productivity; % rework effort; review efficiency; % of defective test cases
• Execution: test execution coverage; defect density; test execution rate; defect leakage; defect trends
• Closure: project health check; Application Stability Index (quantitative Go/No-Go decision); defect removal efficiency; % effort variation; % schedule variation; % duration variation; load factor; productivity trends; goals vs. actuals

Metrics report – quick glance
Benefits & Value Adds

Quarter-to-Quarter Optimization

Improvement Opportunities
• Increase: customer experience, operational efficiency, productivity gains, IT–business alignment, right-first-time delivery
• Decrease: application maintenance cost, effort, cost of quality, time to market, redundancies

Optimization Levers
• Process automation
• Solution accelerators/tools
• Knowledge repository
• Non-linear estimation model
• Onsite–offshore delivery model

Productivity Gain by Quarter
• 1st Qtr: 5%
• 2nd Qtr: 7%
• 3rd Qtr: 9%
• 4th Qtr onwards: 12%

Best Practices
• Led the Oracle ATS tool rollout
• Created several tools and accelerators, including the Spider utility and macros
• Defined the system test strategy for R7
• Instituted a matrix-based FIT test design approach
• Built dashboards for test execution tracking

Innovation
[Chart: effort savings of the Spider Utility and the Test Scenario Accelerator as a percentage compared to manual testing]
Did you know?

Thank You