IBM(R) Rational Team Concert(TM) System Performance Testing (CRM30)
Anu Ramamoorthy
Staff Software Engineer, IBM Rational Software
ranuradh@us.ibm.com
© 2009 IBM Corporation
IBM Rational Software Conference 2009
Session Speaker
 Anu Ramamoorthy, Staff Software Engineer
RTC System Verification Test Performance
Leader
 Anu Ramamoorthy is a Staff Software Engineer
working on the Rational System and Integration
Test Team. She has worked on a number of
test automation and performance projects for
IBM Rational Software including ClearCase
and ClearCase Remote Client (CCRC). She
has been a part of IBM Rational for the past 5
years. Currently, she is the leader for the RTC
SVT Performance testing efforts.
Disclaimer
 Each IBM Rational Team Concert installation and configuration is unique.
The performance data reported in this document are specific to the
product software, test configuration, workload and operating environment
that were used. The reported data should not be relied upon, as
performance data obtained in other environments or under different
conditions may vary significantly.
 Due to the sensitive nature of the system testing, which is often based on
specific information from customers, the results of our testing will not be
revealed except through official sources like developerWorks and release
readiness reports.
Session Overview
Today, the need for globally managed operations and lower IT infrastructure
costs is becoming a reality for many companies. Companies must be able to
adapt their environment to changing demands without having to sacrifice good
performance.
In this session, we will discuss our objectives, use of customer-based workloads,
and results from RTC 2.0 SVT performance testing. This includes:
 Our observations around the scalability of RTC 2.0 in a typical customer deployment.
 Our observations of performance impacts in a high availability network configuration.
 Our findings on the impact to RTC server performance with large numbers of contributor
licenses.
 Performance data resulting from network conditions provided by customers at VoiCE
2008.
Due to the forward-looking nature of this presentation, slides will not be
made available prior to the session.
Please check back here after the conference for updated materials.
Agenda
 Background
 Test Strategy
Establish Baseline for Workload
Develop Automation
Measure Server Load
Collect Metrics
Lab Infrastructure
Repository Details
 Test Results
Scalability Testing Results
High Availability Testing Results
Large Contributors Testing Results
WAN Testing Results
Background
 RTC 2.0 Product Goal
 Deliver global enterprise readiness which
includes enhanced scalability and high availability
amongst several other features.
 Performance Team Goal
 Help validate scalability of the RTC Server and
provide input into customer collaterals.
 System Verification Team (SVT)
 Rational Performance Engineering Team (RPE)
Test Strategy: Establish Baseline
 Obtain realistic baseline for server workload
 Jazz.net used by agile development team familiar with Team Concert features
 Server reports web service counters of requests from all users
 User actions on client translate to one or more service calls to server
 Counters record total calls, response times, bytes sent and received per service call
 Establish simulation target total calls by scaling up baseline service calls (a minimal sketch of this step follows below)
 Focus closely on the most frequent and time-consuming services
[Figure: RTC Web Service Counters]
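To make the scaling step concrete, here is a minimal, hypothetical sketch (not the team's actual tooling): it takes per-service baseline call counts observed for a known user population and scales them to a target user count, producing per-service call-rate targets for the simulation. The service names, call counts, and user figures are invented for illustration.

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Illustrative only: scale baseline per-hour service call counts up to a simulation target. */
public class BaselineScaling {
    public static void main(String[] args) {
        // Hypothetical baseline: calls per hour observed for a known user population.
        Map<String, Long> baselineCallsPerHour = new LinkedHashMap<>();
        baselineCallsPerHour.put("IScmService.createBaseline", 120L);
        baselineCallsPerHour.put("IWorkItemRestService.getEditorPresentation", 4_500L);

        double baselineUsers = 200;   // users behind the baseline measurement (assumed)
        double targetUsers   = 700;   // users the simulation should represent (assumed)
        double scale = targetUsers / baselineUsers;

        System.out.printf("Scaling factor: %.2f%n", scale);
        for (Map.Entry<String, Long> e : baselineCallsPerHour.entrySet()) {
            long target = Math.round(e.getValue() * scale);
            System.out.printf("%-50s baseline=%6d/h  target=%6d/h%n",
                    e.getKey(), e.getValue(), target);
        }
    }
}
```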
Test Strategy: Develop Automation
 Rational Performance Tester (RPT) v8.0
 Build test harness to simulate user transactions
RPT record-and-playback
 Develop Team Concert test harness covering key user transactions
HTTP record-and-playback for Work Items, Reports, and Iteration Plans
SCM operations using the IFileRestClient API
Builds using an Ant script to load workspaces and publish status/results
Feed queries using API calls
 Generate load across multiple client machines
Multiple clients at high speed to simulate many users
[Diagram: transaction types and the interfaces used to drive them. Work Items, Reports, Iteration Plans, and Feeds over HTTP; Source Control via the IFileRestClient API; Builds via Ant]
Test Strategy: Measure Server Load
 Measure and calibrate the load the test
harness generated.
 Compute simulated users by comparing to baseline rates (a minimal sketch of this arithmetic follows the chart below)
 Ensure response times reasonable
and consistent
 Compare average response times
to Jazz.net baseline
 Track trends in response times for
consistent performance
 Record counters hourly during the test run
[Chart: Web Service Count Trend Analysis Sample. Average response time in seconds across eight hourly samples for scm.common.IScmService.createBaseline() and workitem.common.internal.rest.IWorkItemRestService.getEditorPresentation()]
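A minimal sketch of the calibration arithmetic described above, using invented numbers: the simulated user count is estimated by dividing the measured call rate by the baseline per-user rate, and a response-time series is flagged if it drifts upward across the hourly samples. The thresholds and rates are assumptions, not measured data.

```java
/** Illustrative only: calibrate harness load against a per-user baseline. */
public class LoadCalibration {
    /** Estimated simulated users = measured calls/hour divided by baseline calls/hour per user. */
    static double simulatedUsers(double measuredCallsPerHour, double baselineCallsPerHourPerUser) {
        return measuredCallsPerHour / baselineCallsPerHourPerUser;
    }

    /** Very simple trend check: true if the last sample is noticeably higher than the first. */
    static boolean trendingUp(double[] hourlyAvgResponseSeconds, double tolerance) {
        int n = hourlyAvgResponseSeconds.length;
        return n >= 2 && hourlyAvgResponseSeconds[n - 1] > hourlyAvgResponseSeconds[0] * (1 + tolerance);
    }

    public static void main(String[] args) {
        // Hypothetical numbers, not data from the test runs.
        System.out.printf("Estimated simulated users: %.0f%n", simulatedUsers(42_000, 60));
        double[] samples = {0.20, 0.22, 0.21, 0.23, 0.22, 0.24, 0.23, 0.25};
        System.out.println("Response time trending up? " + trendingUp(samples, 0.10));
    }
}
```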
Test Strategy: Collect Metrics
 Client metrics
Transaction counts from the low-level framework
RPT Average Response times
 Traffic metrics
 Web Service Counters
 Average response time
 Bytes sent / bytes received
 Resource Metrics
% CPU Utilization
 Memory Used
 Disk I/O
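Purely as an illustration of collecting the resource metrics listed above (this is not the harness's actual collector, which relied on standard OS tooling), the sketch below samples whole-system CPU and physical memory from within a JVM using the com.sun.management extension of OperatingSystemMXBean. The sampling interval and sample count are arbitrary.

```java
import com.sun.management.OperatingSystemMXBean;
import java.lang.management.ManagementFactory;

/** Illustrative resource sampler: prints CPU and memory usage of the local host at a fixed interval. */
public class ResourceSampler {
    public static void main(String[] args) throws InterruptedException {
        // The cast to the com.sun.management interface works on HotSpot/OpenJDK JVMs.
        OperatingSystemMXBean os =
                (OperatingSystemMXBean) ManagementFactory.getOperatingSystemMXBean();
        long intervalMs = 60_000;           // one sample per minute here (hourly in the real runs)
        for (int i = 0; i < 5; i++) {       // a handful of samples for the example
            double cpu = os.getSystemCpuLoad() * 100.0;   // whole-system CPU %; may be negative on the first call
            long usedMb = (os.getTotalPhysicalMemorySize()
                         - os.getFreePhysicalMemorySize()) / (1024 * 1024);
            System.out.printf("sample %d: cpu=%.1f%%  memUsed=%d MB%n", i, cpu, usedMb);
            Thread.sleep(intervalMs);
        }
    }
}
```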
Test Strategy: Lab Infrastructure
Test Strategy: Repository Details
 Realistic repository based on a Jazz.net snapshot (a real database)
 The repository has over 2 years of history
 Repository details
Total size on disk: 22 GB
Total size on disk, uncompressed: 56 GB
Total size available to the DB: 60 GB

Component Name | # of Items | Size on Disk (MB)
SCM            |    887,566 |             2,717
Build          |    245,550 |             6,180
Filesystem     |    193,436 |            26,387
Work Item      |     62,812 |             3,917
Web Transaction Workload
Component | Test Name | Description | % Composition
Iteration Plan | Query Current Iteration Plans | Queries all current iteration plans for the Rational Team Concert project (total count = 55). | 3
Workitem | Create and Save Query | Opens, saves, and runs a query for work items submitted by members of the test team area in the SVT Test Project. | 10
Workitem | Create and Save Workitems with Attachment | Opens and saves a work item with a 125 KB attachment in the SVT Test Project. | 15
Workitem | Create and Save Workitems without Attachment | Opens and saves a work item with comments. | 25
Workitem | Query Retrospectives | Runs a predefined query for Retrospective work items in the Rational Team Concert project (total count = 85). | 38
Workitem | Find Workitem via Search | Searches for a previously created work item in the SVT Test Project. | 7
Reports | Query Workitem Closed by Iteration | Runs a report for all work items closed per iteration. | 2
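To illustrate how a composition like the table above can drive a load generator, here is a small, generic sketch (not Rational Performance Tester itself) that picks the next web transaction at random in proportion to the listed percentages. The transaction names come from the table; everything else is invented.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Random;

/** Illustrative only: choose the next transaction according to a percentage mix. */
public class TransactionMix {
    private final Map<String, Integer> weights = new LinkedHashMap<>();
    private final Random random = new Random();
    private int total;

    void add(String name, int percent) { weights.put(name, percent); total += percent; }

    String next() {
        int pick = random.nextInt(total);       // uniform draw over the summed weights
        for (Map.Entry<String, Integer> e : weights.entrySet()) {
            pick -= e.getValue();
            if (pick < 0) return e.getKey();
        }
        throw new IllegalStateException("unreachable");
    }

    public static void main(String[] args) {
        TransactionMix mix = new TransactionMix();
        mix.add("Query Current Iteration Plans", 3);
        mix.add("Create and Save Query", 10);
        mix.add("Create and Save Workitems with Attachment", 15);
        mix.add("Create and Save Workitems without Attachment", 25);
        mix.add("Query Retrospectives", 38);
        mix.add("Find Workitem via Search", 7);
        mix.add("Query Workitem Closed by Iteration", 2);
        for (int i = 0; i < 10; i++) System.out.println(mix.next());
    }
}
```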
SCM Transaction Workload
Transaction | Description | % Composition
Accept | Accept change sets | 20%
Deliver | Deliver change sets | 7%
Checkin | Check in up to 5 files (2 KB - 5 KB) | 39%
Login/Logout | Log in to and log out of the repository | <1%
Suspend/Discard | Perform the suspend and discard RTC operations | 2%
RefreshPendingChanges | Refresh pending changes | 8%
CreateWorkspaceFromStreamandLoad | Create a workspace from an existing stream (5,000 files, 50 MB) and load it | <1%
CompareStreamWorkspace | Compare the workspace against the current state of the stream, and the last baseline in the workspace against the stream | 4%
CloneWorkspace | Create a clone of the existing workspace | <1%
CloseChangesets | Close existing change sets | 3%
Baseline | Create a baseline | 3%
Unload | Unload the workspace | <1%
History | Perform a history operation | 4%
CompareWorkspaceBaseline | Compare the workspace against the last baseline in the workspace | 2%
Load | Load the workspace | <1%
Scalability Observations in RTC 2.0
 Small Scale Enterprise Testing Results
100 – 700 users
 Large Scale Enterprise Testing Results
700 – 2000+ users
Small Scale Enterprise Testing: Topology
 Application Server
 Websphere Application Server 6.1.0.23; IBM xSeries 336; Intel Xeon 5160, HyperThreaded EM64T, 2 Processor, 3.6GHz; 15,000 RPM SCSI Drives; 4GB RAM
 DB Server
 DB2 9.1 FP4; IBM xSeries 3550; Intel Xeon 5160, Dual Core, 2 Processor, 3.0 GHz; 10,000 RPM SAS Drives; 4GB RAM
 Both servers running Windows 2003
 Authentication: LDAP (Microsoft Active Directory)
[Diagram: Two-Server Configuration. Users connect to an Application Server Instance hosting the RTC Application, which uses a Database Server Instance hosting the Repository Database]
Small Scale Enterprise Testing: Results
 Average Response Time for Web Service Counters:
 .338 seconds
 RTC Server Resource Utilization:
 Average CPU: 52%
 Memory used: 1.2 GB
[Charts: RTC Server % CPU Utilization and memory used (MB) over the test run, time in HH:MM from 0:00 to about 2:52]
*Pre Release Software Data
Large Scale Enterprise Testing: Topology
IBM System x3650 M2; Dual CPU, Intel Xeon 5500, 2.4 GHz or higher, 64-bit
Memory - 18GB or higher
Operating System – Red Hat Release 5.3
Web server - Tomcat 5.5
Database - DB2 9.5 FP 4
[Diagram: Single-Server Configuration. Users connect to a single server hosting both the Application Server Instance (RTC Application) and the Database Server Instance (Repository Database)]
Large Scale Enterprise Testing: Results
 Average Response Time for Web Service Counters:
 .199 seconds
 RTC Server Resource Utilizations :
 Average CPU: 47%
 Memory used: 18 GB
[Chart: System Summary. CPU% (usr% + sys%) over the test run, from about 15:44 to 00:08]
*Pre Release Software Data
Key Scalability Results
Single-Tier Small Enterprise Configuration (100-700 users)
Machine Type: IBM System x3650 M2, single CPU, Intel Xeon 5500, 2.4 GHz or higher, 64-bit
Memory: 12 GB
OS: Windows 2003 / RHEL 5.3
Application Server: Tomcat 5.5 or WAS 6.1.0.23 or higher
Database Server: Oracle 10GR2, DB2 9.1, DB2 9.5 FP4, SQL Server 2005 and 2008

Dual-Tier Small Enterprise Configuration (100-700 users)
Machine Type: IBM System x3550, dual CPU, Intel Xeon 5160, 2.4 GHz or higher, 64-bit
Memory: 4 GB / 8 GB
OS: Windows 2003 / RHEL 5.3
Application Server: Tomcat 5.5 or WAS 6.1.0.23 or higher
Database Server: Oracle 10GR2, DB2 9.1, DB2 9.5 FP4, SQL Server 2005 and 2008

Single-Tier Large Enterprise Configuration (700-2,000 users)
Machine Type: IBM System x3650 M2, dual CPU, Intel Xeon 5500, 2.4 GHz or higher, 64-bit
Memory: 18 GB
OS: Windows 2003 / RHEL 5.3
Application Server: Tomcat 5.5 or WAS 6.1.0.23 or higher
Database Server: Oracle 10GR2, DB2 9.1, DB2 9.5 FP4, SQL Server 2005 and 2008

Dual-Tier Large Enterprise Configuration (700-2,000+ users)
Machine Type: 2 x IBM System x3650 M2, dual CPU, Intel Xeon 5500, 2.4 GHz or higher, 64-bit
Memory: 12 GB
OS: Windows 2003 / RHEL 5.3
Application Server: Tomcat 5.5 or WAS 6.1.0.23 or higher
Database Server: Oracle 10GR2, DB2 9.1, DB2 9.5 FP4, SQL Server 2005 and 2008
Basic High Availability Testing
 Idle Standby is a failover strategy for basic
high availability. The backup server only
becomes active when the primary server
fails.
 The scenario consists of:
2 WebSphere Application Servers (Version 6.1.0.19)
1 DB2 server (DB2 9.1 FP4) populated with a complex 60 GB repository
1 IBM HTTP Server web server (Version 6.1)
*Async Tasks disabled
 Verify that the switchover is seamless and that the addition of the web server causes minimal performance impact, with 50 users at 1 transaction every 2 minutes.
[Diagram: users connect over HTTPS to the IBM HTTP Server, which routes HTTP requests (via plugin-cfg) to the Application Server Instances hosting the RTC Application on Primary Server A and Backup Server B; both use a Database Server Instance hosting the Repository Database]
Basic High Availability Topology
[Diagram: Idle Standby-Server Configuration with the Primary Server enabled. Users connect over HTTPS to the IBM HTTP Server, which routes HTTP traffic (via plugin-cfg) to the Application Server Instance (RTC Application) on Primary Server A; Backup Server B is idle; the Database Server Instance hosts the Repository Database]
[Diagram: Idle Standby-Server Configuration with the Backup Server enabled. After failover, the IBM HTTP Server routes HTTP traffic to the Application Server Instance (RTC Application) on Backup Server B instead of Primary Server A; the same Database Server Instance and Repository Database are used]
Basic High Availability Performance: Results
 Performance impact:
Most Web transactions
showed better
performance with the
web server.
SCM transactions were
mostly similar with or
without the web server.
[Chart: HA Configuration Web Server Impact. SCM client and web page response times in seconds, compared with and without the web server]
*Pre Release Software Data
Large numbers of Contributor Users Testing: Results
 Simulated 1,000 users at 5 pages per hour on Intel Xeon 5160 (medium-scale hardware)
 Average Response Time for Web Service Counters: .227 seconds
 RTC Server Resource Utilization:
Average CPU: 36%
Average memory used: 998 MB
[Chart: RTC Server % CPU Utilization (% processor time) over the test run, time in HH:MM from 0:00 to about 3:21]
*Pre Release Software Data
WAN Testing Performance: Results
 Response time comparison against a 250-user loaded server
LAN: < 5 ms latency
Slow (high-latency) WAN: 150 ms latency
 A simple file copy of a 323 KB file using Robocopy took 14 seconds over the WAN.
[Chart: LAN - WAN Comparison (SCM). Response times in seconds for SCM transactions over LAN versus WAN]
*Pre Release Software Data
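As a back-of-the-envelope illustration of why a 150 ms link slows chatty operations so much, the sketch below models elapsed time as round trips times latency plus payload size over bandwidth. The round-trip count and bandwidth are assumptions for the example, not measurements from this testing; only the 323 KB file size and the two latency figures come from the slide above.

```java
/** Illustrative only: rough transfer-time model = roundTrips * latency + bytes / bandwidth. */
public class WanEstimate {
    static double seconds(int roundTrips, double latencySeconds, long bytes, double bytesPerSecond) {
        return roundTrips * latencySeconds + bytes / bytesPerSecond;
    }

    public static void main(String[] args) {
        long fileBytes = 323 * 1024;        // the 323 KB file from the Robocopy comparison
        double bandwidth = 1_500_000;       // assumed ~1.5 MB/s effective throughput
        int roundTrips = 40;                // assumed application-level round trips; real protocols vary widely
        System.out.printf("LAN  (~5 ms latency):  %.2f s%n", seconds(roundTrips, 0.005, fileBytes, bandwidth));
        System.out.printf("WAN  (150 ms latency): %.2f s%n", seconds(roundTrips, 0.150, fileBytes, bandwidth));
    }
}
```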
Conclusion
 Indicators of a highly loaded server (a simple automated check is sketched below)
> 95% CPU over extended periods of time.
Timeout errors in the Application Server logs.
Average web service response times trending upwards over a period of time.
 Key Results
A configuration like our Large Enterprise Configuration can support 2,000 users.
A configuration like our Small Enterprise Configuration can support 700 users.
Contributors don’t add significant load to the server.
WAN performance is good for most transactions.
Idle Standby solution with additional Web server adds only minimal overhead.
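The indicators above lend themselves to a simple automated check. The sketch below is illustrative only; apart from the >95% CPU figure, the thresholds, sample counts, and growth factor are invented.

```java
/** Illustrative check for the "highly loaded server" indicators listed above. */
public class LoadIndicators {
    /** True if CPU stays above the threshold for at least minConsecutive samples in a row. */
    static boolean sustainedHighCpu(double[] cpuSamples, double threshold, int minConsecutive) {
        int run = 0;
        for (double cpu : cpuSamples) {
            run = cpu > threshold ? run + 1 : 0;
            if (run >= minConsecutive) return true;
        }
        return false;
    }

    /** True if average response times grow noticeably from the first to the last sample. */
    static boolean responseTimesTrendingUp(double[] avgResponseSeconds) {
        int n = avgResponseSeconds.length;
        return n >= 2 && avgResponseSeconds[n - 1] > avgResponseSeconds[0] * 1.25; // 25% growth (assumed)
    }

    public static void main(String[] args) {
        double[] cpu = {60, 97, 98, 96, 99, 97};    // hypothetical hourly CPU samples (%)
        double[] rt  = {0.20, 0.24, 0.27, 0.31};    // hypothetical hourly average response times (s)
        boolean loaded = sustainedHighCpu(cpu, 95, 3) || responseTimesTrendingUp(rt);
        System.out.println("Server highly loaded? " + loaded);
    }
}
```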
© Copyright IBM Corporation 2009. All rights reserved. The information contained in these materials is provided for informational purposes only, and is provided AS IS without warranty of any kind,
express or implied. IBM shall not be responsible for any damages arising out of the use of, or otherwise related to, these materials. Nothing contained in these materials is intended to, nor shall have
the effect of, creating any warranties or representations from IBM or its suppliers or licensors, or altering the terms and conditions of the applicable license agreement governing the use of IBM
software. References in these materials to IBM products, programs, or services do not imply that they will be available in all countries in which IBM operates. Product release dates and/or capabilities
referenced in these materials may change at any time at IBM’s sole discretion based on market opportunities or other factors, and are not intended to be a commitment to future product or feature
availability in any way. IBM, the IBM logo, Rational, the Rational logo, Telelogic, the Telelogic logo, and other IBM products and services are trademarks of the International Business Machines
Corporation, in the United States, other countries or both. Other company, product, or service names may be trademarks or service marks of others.