Performance Testing for TM & D – An Overview
Agenda
• Introduction & Performance Engineering Coverage on TM&D
• Consultation & Performance Engineering Phases
• Sample Model for S&D Reporting
• Data Collection and Analysis
• Post-Test Identification
• Deliverables
Objective
To gain a high-level understanding of Performance Engineering engagements for TM & D
Practical conversations on performance
• "I think we need Performance Testing, but what is it exactly?"
• "I think around 300 users will use the system, and they will do all kinds of activity, so can we determine performance?"
• "Houston, this application is slow … do we need a high-end server?"
• "What will you do with production data in performance?"
• "The post-test charts look nice; let me know whether the performance is good or bad."
• "Does that mean we're done? Can we release?"

And in a recent conversation, just two days back, a client called and asked:
• "I am not winning the online bidding; can I send 20-30 bidding requests at a time and increase my bidding chances?"
Introduction
• Performance Testing has three dimensions, keeping resources constant (see the sketch after this list):
  – Number of Users
  – Amount of Data
  – Amount of Activity
• Any performance-related issue can be mapped using these dimensions.
• Performance depends on the following:
  – Resources used by the application: computing, network, and storage resources
  – Bottlenecks and wait times
  – The number of operations involved in performing the computations and the time taken to complete these operations
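To make the three dimensions concrete, here is a minimal Python sketch; the class name, field names, and numbers are illustrative assumptions, not values from this deck. The point is that varying one dimension at a time helps localise a slowdown:

```python
from dataclasses import dataclass

# Illustrative only: names and numbers are assumptions, not from the deck.
@dataclass
class LoadProfile:
    """The three dimensions of a performance test, resources held constant."""
    concurrent_users: int      # Number of Users
    dataset_rows: int          # Amount of Data (e.g. rows seeded in the DB)
    tx_per_user_per_hour: int  # Amount of Activity

    def total_tx_per_hour(self) -> int:
        return self.concurrent_users * self.tx_per_user_per_hour

# Vary one dimension at a time to map an issue to a dimension.
baseline = LoadProfile(concurrent_users=50, dataset_rows=100_000, tx_per_user_per_hour=12)
more_users = LoadProfile(concurrent_users=300, dataset_rows=100_000, tx_per_user_per_hour=12)
print(baseline.total_tx_per_hour(), more_users.total_tx_per_hour())  # 600 3600
```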
Performance Coverage on TM & D Systems
The TM&D landscape spans four modules: the Outlet Data Management System (ODMS); S&D Strategy, Planning and Evaluation; TM&D Business Performance and Process Assessment; and Key Accounts/HoReCa.

High-level scope of Performance Testing:
• Testing integrated portal access
• Customer data query and update
• Report generation (weekly/monthly) for S&D, ODMS (outlet-specific reports), Key Accounts, and Performance and Process
• Offline data access and update
• Report publishing in the central system

The performance team will identify the other crucial transactions during the system study phase.
Our High-Level Process Consultation
Major Steps in Performance Testing
1. Confirm performance requirements: select the factors/metrics and test types (load, performance, and stress testing; reliability testing; security testing; usability testing; compatibility testing).
2. Develop the performance strategy/test plan.
3. Develop scripts and test data.
4. Set up a production mirror image.
5. Test design.
6. Test execution.
7. Issue management: log issues/concerns in the issue management system, run product maturity analysis and continuous analysis, and feed results back to the development team.
8. Post-test analysis and report development.
Tool: Performance Automation
Performance automation solves the resource limitations of manual testing. A controller drives Vuser host machines that generate load against the TM&D system under test (portal server/s and database server), and an analysis component turns the results into graphs and reports:

• Replaces testers with virtual users (Vusers)
• Runs many Vusers on a few machines
• The controller manages the virtual users
• Analyzes results with graph and report tools
• Repeats tests with scripted actions

The test answers four questions:
• Speed: does the application respond quickly enough for the intended users?
• Scalability: will the application handle the expected user load and beyond?
• Stability: is the application stable under expected and unexpected user loads?
• Confidence: are you sure that users will have a positive experience on go-live day?
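As a rough illustration of the controller/Vuser idea (not LoadRunner itself), the following stdlib-only Python sketch spawns a handful of virtual users as threads against a placeholder portal URL; PORTAL_URL, VUSERS, and the error handling are all assumptions for illustration:

```python
import threading
import time
import urllib.request

PORTAL_URL = "http://portal.example.com/login"  # placeholder, not a real endpoint
VUSERS = 10
results = []
lock = threading.Lock()

def vuser(user_id: int) -> None:
    """One virtual user: issue a request and record status plus elapsed time."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(PORTAL_URL, timeout=30) as resp:
            status = resp.status
    except Exception:
        status = -1  # record the failure instead of crashing the run
    elapsed = time.perf_counter() - start
    with lock:
        results.append((user_id, status, elapsed))

# The "controller": start many Vusers from one machine, then wait for them.
threads = [threading.Thread(target=vuser, args=(i,)) for i in range(VUSERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(f"{len(results)} virtual users finished")
```

Threads suit this sketch because the work is I/O-bound; a real tool scales the same idea across several load-generator hosts.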
Sample Transaction
Ability for the AE to generate the forecast for the next quarter on all of the plan components.

[Chart: "No. of Users in System" plotted against "Time in Biz Hour". User load ramps slowly from 1 to a peak of about 50 users over the business day, holds at peak, then ramps down to 0 by hour 10.]

Three load phases, each collecting client-side and server-monitor mean data:
• Slow ramp-up
• Peak-hour ramp-up
• Ramp-down
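A small Python sketch of the three-phase schedule above; the hour/Vuser pairs are read approximately off the sample chart and are not prescriptive:

```python
# Approximate values from the sample chart; adjust per engagement.
def ramp_schedule():
    """Yield (biz_hour, target_vusers): slow ramp, peak hold, ramp down."""
    slow_ramp = [(0, 1), (1, 5), (2, 10), (3, 20), (4, 35)]
    peak      = [(5, 50), (6, 50)]
    ramp_down = [(7, 40), (8, 30), (9, 10), (10, 0)]
    yield from slow_ramp + peak + ramp_down

for hour, vusers in ramp_schedule():
    # Client-side and server-monitor mean data is collected in every phase.
    print(f"hour {hour:2d}: {vusers} vusers")
```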
User Model
For each transaction, a user model is created based on:
• Authentication and authorization of users
• Feature usage criteria / transaction type
• Total number of users in the system
• Users' random activities on a transaction
• Transactions per unit time
• Transaction mix (sketched below)
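For illustration, a minimal user-model sketch; the transaction names and mix weights are hypothetical, not the actual TM&D mix:

```python
import random

# Hypothetical mix: each weight is that transaction type's share of traffic.
TRANSACTION_MIX = {
    "generate_forecast": 0.10,
    "customer_query":    0.40,
    "customer_update":   0.20,
    "report_generation": 0.30,
}

def next_transaction() -> str:
    """Pick a user's next action per the mix (the 'random activity' element)."""
    names = list(TRANSACTION_MIX)
    weights = list(TRANSACTION_MIX.values())
    return random.choices(names, weights=weights, k=1)[0]

# e.g. one user's hour at an assumed 12 transactions/hour
session = [next_transaction() for _ in range(12)]
print(session)
```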
Online Monitoring
Some common data for online monitoring:
• Errors and exceptions
• Response-time graph
• Throughput graph
• Transactions passed/failed
• Page/component download-time graph
• Server logs (checked continuously)
• Slow SQLs (captured as they appear)
• Server resources
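As an example of reducing the monitored samples above to headline numbers, here is a small Python sketch; the response times, duration, and error count are made-up illustrative data:

```python
import statistics

# Illustrative data only: per-transaction response times in seconds.
response_times = [0.8, 1.1, 0.9, 4.2, 1.0, 1.3, 0.7, 2.9, 1.2, 1.0]
test_duration_s = 60.0
errors = 1  # transactions failed

throughput = len(response_times) / test_duration_s    # transactions/sec
p90 = statistics.quantiles(response_times, n=10)[-1]  # 90th percentile
print(f"throughput={throughput:.2f} tx/s, "
      f"mean={statistics.mean(response_times):.2f}s, "
      f"p90={p90:.2f}s, failed={errors}/{len(response_times) + errors}")
```

Percentiles matter here because a healthy mean can hide the slow outliers (like the 4.2 s sample) that users actually feel.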
Performance Measurement
• Workload data:
  – Data characteristics
  – Execution characteristics
• Software resources:
  – Path characteristics
  – SQL queries
  – Software resource usage
  – File I/O
  – Processing overhead
  – Messages
  – Logging to files or databases
  – Calls to middleware functions
  – Calls to software in a different process, thread, or processor
  – Application cache and buffers
  – OS handles such as threads, memory, and sockets
  – Network I/O and the type of network resource access
  – Number of connections to the databases, etc.
• Computer system usage:
  – CPU usage
  – Memory usage
  – Scenario response time
  – Scenario throughput
  – Key system resource usage
  – Resource utilization
  – Server throughput
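One possible way to sample the system-usage measurements above during a run, sketched with the third-party psutil library (an assumption; a real engagement would typically rely on the load tool's own monitoring agents, e.g. LoadRunner monitors):

```python
import psutil  # third-party: pip install psutil

# Interval and metric selection are assumptions for illustration.
def sample_resources(duration_s: int = 30, interval_s: int = 5):
    """Collect CPU, memory, and network counters at a fixed interval."""
    samples = []
    for _ in range(duration_s // interval_s):
        samples.append({
            "cpu_pct": psutil.cpu_percent(interval=interval_s),  # blocks for the interval
            "mem_pct": psutil.virtual_memory().percent,
            "net_bytes_sent": psutil.net_io_counters().bytes_sent,
        })
    return samples

for s in sample_resources():
    print(s)
```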
Assignment Deliverables
Pre-Test
• Performance Strategy for TM&D
• Performance Modeling for major sections of the applications
• Resource Engagement outline and Effort Estimation
• Delivery Process Methodology
• Performance Entry Criteria

On-Test
• Detailed Test Plan
• Test Scripts/Templates and Test Design
• Test Configuration Map (Data/Setup)
• Test Tool and monitoring agent setup, plus PoC Report (LoadRunner)
• Performance Exit Criteria
• Test Results and Review Log
• Post-Test Data
• Risk/Issue Log

Post-Test
• System/Sub-system Performance Analysis Report
• Bottleneck Identification
Performance findings and Related Tuning
• Code optimization
• Caching strategy
• Load balancing
• Distributed computing logic
• SQL query profiling
• Use of database indexing
• Removal of normalization (denormalization)
• Resource configuration (RAM, network)
• Client software versions (OS, browser)
• Identification of performance patterns
• …
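As one concrete instance of a caching strategy from the list above, a stdlib Python sketch; fetch_outlet_report is a hypothetical helper standing in for an expensive ODMS query:

```python
from functools import lru_cache

# Cache a hot, read-mostly lookup so repeated report requests skip the database.
# fetch_outlet_report is hypothetical; in TM&D it might wrap an ODMS query.
@lru_cache(maxsize=1024)
def fetch_outlet_report(outlet_id: int) -> dict:
    # ... the expensive SQL query would run here on a cache miss ...
    return {"outlet_id": outlet_id, "sales": 0}

fetch_outlet_report(42)   # miss: hits the database
fetch_outlet_report(42)   # hit: served from the in-process cache
print(fetch_outlet_report.cache_info())
```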
• Most often, performance is thought of as something related to "tuning the code"; this misconception is perhaps the single biggest reason why performance failures occur.
• 80% of performance issues belong to architecture and business definition; it is impossible to get more than a 10% benefit by tuning code alone.
Your questions are welcome.
Thank you!
Bangalore Team