Community Performance Testing

Sakai Community Performance Testing
Working Proof of Concept
Sakai Performance Working Group
Linda Place, Chris Kretler, Alan Berg
Universities of Michigan & Amsterdam
1 July 2008
Performance Testing Overview
• Part of the QA process
– May include stress, capacity, scalability, and reliability testing as well as performance (response time, throughput, bottleneck identification)
• Black-box testing
– Running the system with projected use-case scenarios for acceptable throughput and response times
• How will users experience the application?
Performance Testing Overview
• White-box testing
– Pushing the system to identify application, database, operating-system, or network problems
• Tune the environment to identify and address specific problems
• Tests watched by developers, DBAs, system and network administrators, and performance engineers
• Resource-intensive nature of the process
– Infrastructure, personnel, tools, time
Community Performance Testing
• The Approach
• Common Environment/Data Creation
• Load Test Tool
• Proof of Concept
• The Practice
Organization vs. Community Testing
• Organization approach
– Advantages
• Focus on exact infrastructure used in production
• Define use cases according to real use in production
• Test only the tools specific to the organization
– Limitations
• Very resource intensive (expensive)
• Hard to maintain
• Hard to be agile without careful advance planning
Organization vs. Community Testing
• Community approach
– Advantages
• Pool of test scripts available for immediate use
• May not need a full testing infrastructure to be confident in production
• Increased confidence when adding new tools
• Total cost of testing is shared
– Limitations
• Must clearly communicate results to the community
• Collaboration seems harder than just doing it yourself
WG Project Objectives
• Create a QA environment that enables shared performance testing by the community
– Plan available on Confluence:
http://confluence.sakaiproject.org/confluence/display/PERF/Home
• Have a working proof of concept by the Paris Sakai Conference
Infrastructure Issues
• What do we learn from testing done on infrastructure that doesn’t match our own?
– Advantages
• A software/application focus forces code improvements
• More efficient code is more likely to scale to meet size and performance needs
– Disadvantages
• Hardware and network bottlenecks are NOT found
• Capacity limits of the environment are NOT discovered
Community Performance Testing
• The Approach
• Common Environment/Data Creation
• Load Test Tool
• Proof of Concept
• The Practice
Provisioning
• Alan's provisioning tool (see the sketch below)
– Perl scripts use web services to populate the DB with courses
– Configuration via property files
– Adds tools, instructors, and students to courses
– Creates a file structure and uploads resources to sites
• Other efforts are underway
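To illustrate the pattern (the production tool itself is Perl), a minimal Python sketch of provisioning against Sakai's SOAP web services follows. The SakaiLogin/SakaiScript endpoint paths and the method signatures are assumptions that vary by Sakai version, and zeep is just one convenient SOAP client:

    # Sketch of provisioning via Sakai's SOAP web services.
    # Endpoint paths and method signatures are assumptions; check your release.
    from zeep import Client

    BASE = "http://qa-server:8080/sakai-axis"  # hypothetical test server

    # Log in once and reuse the session id for subsequent calls.
    login = Client(BASE + "/SakaiLogin.jws?wsdl")
    session = login.service.login("admin", "admin")

    script = Client(BASE + "/SakaiScript.jws?wsdl").service

    # Create a course site and enroll a student (arguments illustrative).
    script.addNewSite(session, "SMPL101-W08", "Sample Course 101",
                      "description", "short description", "", "",
                      False, "", True, False, "", "course")
    script.addNewUser(session, "student01", "Stu", "Dent",
                      "student01@example.edu", "registered", "password")
    script.addMemberToSiteWithRole(session, "SMPL101-W08",
                                   "student01", "Student")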
Provisioning, Benefits
• Easy, fast creation of a test environment
• Common data environment
– Common environment by sharing properties files (sample below)
– Common data files for tests
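A shared properties file is what keeps runs reproducible across sites. The keys below are hypothetical, shown only to illustrate the idea:

    # Hypothetical keys; the real names ship with the provisioning scripts.
    server.url=http://qa-server:8080
    admin.user=admin
    admin.password=admin
    courses.count=2000
    students.count=20000
    resources.dir=/data/test-resources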
Provisioning, Some Modifications
ORIGINAL                      MODIFIED
1. provision.pl               1. provision.pl
2. make_media_list.pl         2. make_media_list.pl
3. make_local_resources.pl    3. create_templates.pl
                              4. load_resources.pl
Data Environment Design
• Based on the UM Winter 2008 semester
– Number of courses (2,000) and students (20,000)
– Tool mix (+ projection)
• Courses broken into 5x3 categories (see the sketch below):
– 5 categories of roster size
– 3 categories of tool distribution
• Site name based upon category
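The 5x3 design lends itself to generating site names mechanically. In the Python sketch below, the category labels and the even per-cell split are assumptions for illustration; only the 2,000-course total and the 5x3 grid come from the design:

    # Sketch: derive site names from the 5x3 category grid.
    ROSTER_SIZES = ["XS", "S", "M", "L", "XL"]  # 5 roster-size categories (labels assumed)
    TOOL_MIXES = ["LOW", "MED", "HIGH"]         # 3 tool-mix categories (labels assumed)
    TOTAL_COURSES = 2000

    # 2000 / 15 leaves a remainder; a real run would distribute the extras.
    per_cell = TOTAL_COURSES // (len(ROSTER_SIZES) * len(TOOL_MIXES))

    def site_names():
        for size in ROSTER_SIZES:
            for mix in TOOL_MIXES:
                for i in range(per_cell):
                    yield "%s-%s-%04d" % (size, mix, i)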
Community Performance Testing
• The Approach
• Common Environment/Data Creation
• Load Test Tool
• Proof of Concept
• The Practice
Evaluation & Tool Selection
• Goal
– Enable a near-enterprise level of tool quality using open-source tools
– Hewlett-Packard LoadRunner used for comparison
Evaluated Test Software
• Grinder
• WebLoad Open Source
• JMeter
• TestMaker
• CLIF
• OpenSTA
• Web Application Load Simulator
• Hammerhead
• SlamD (Sun)
• Funkload
• Selenium
• Pylot
• httperf
• Seagull
• Deluge
Grinder: Plus
• Easy record mechanism
– Can record HTTPS on multiple platforms
• Like the scripting language (Jython)
– Java extensible
– Allows for reusable libraries
– Flexible reporting and data handling
• Run-time UI displays multiple test scripts
– We use 22 for our standard scenario!
Grinder: Plus
• Distributed load generation
– Multi-platform
• Active user/development community
– Open source
– Separate project for developing reports: “Grinder Analyzer”
• Jython Eclipse plug-in (Grinderstone)
Grinder: Minus
• Default recorded script is complex
– Verbose results data
– A Perl script cleans up the default recorded script
• (J)Python is not ubiquitous
– To-do list shows support for Java scripts
• Reporting project doesn't consolidate data easily
Community Performance Testing
• The Approach
• Common Environment/Data Creation
• Load Test Tool
• Proof of Concept
• The Practice
Proof of Concept
• Read-only scripts used
– Downloading a file
– Viewing a grade
– “Idle” user
• Mix/frequency of users corresponds to a peak usage period (see the sketch below)
– Events table used to determine usage
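In Grinder terms, the user mix can be expressed as weighted scenario selection inside the TestRunner. The sketch below assumes a Grinder 3 Jython script; the URLs and the 50/30/20 weights are placeholders, not the actual frequencies derived from the events table:

    # Jython sketch for The Grinder; weights and URLs are illustrative only.
    import random
    from net.grinder.script import Test
    from net.grinder.script.Grinder import grinder
    from net.grinder.plugin.http import HTTPRequest

    download = Test(1, "Download file").wrap(HTTPRequest())
    grades = Test(2, "View grade").wrap(HTTPRequest())

    BASE = "http://qa-server:8080"  # hypothetical test server

    class TestRunner:
        def __call__(self):
            r = random.random()
            if r < 0.5:        # 50%: download a file
                download.GET(BASE + "/access/content/group/SMPL101/notes.pdf")
            elif r < 0.8:      # 30%: view a grade
                grades.GET(BASE + "/portal/tool/gradebook")
            else:              # 20%: "idle" user, just think time
                grinder.sleep(30000)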
Event Graphs (screenshot)
LoadRunner Script (screenshot)
Grinder Script (screenshot)
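The slide showed a screenshot of a recorded Grinder script. For readers without the slides, a minimal hand-written Grinder 3 Jython script has this shape (the URL is illustrative):

    # Minimal Grinder 3 script: one instrumented HTTP GET per run.
    from net.grinder.script import Test
    from net.grinder.script.Grinder import grinder
    from net.grinder.plugin.http import HTTPRequest

    test1 = Test(1, "Request portal")
    request1 = test1.wrap(HTTPRequest())  # instrument the request object

    class TestRunner:
        def __call__(self):
            result = request1.GET("http://qa-server:8080/portal")
            grinder.logger.output("Status: %d" % result.statusCode)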
LoadRunner Results (screenshot)
Grinder Results (screenshot)
Database Results (screenshot)
Grinder: Some Ideas
• Consolidated & expanded reporting
• Business-process testing
– Makes test-script development more flexible
– Potentially expands the pool of testers
Community Performance Testing
• The Approach
• Common Environment/Data Creation
• Load Test Tool
• Proof of Concept
• The Practice
Establishing Baseline(s)
• Identify “core” scripts for a Sakai baseline
• Identify related script clusters to baseline tool clusters
• Maximum time threshold for page loads (see the sketch below)
• Minimum user load for tools and kernel
• Database activity reports
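One way to enforce a maximum page-load threshold is to inspect per-test statistics inside the script. The sketch below uses The Grinder's statistics API; the 2000 ms limit is a placeholder, not an agreed baseline:

    # Sketch: mark a test run as failed if the page load exceeds a threshold.
    from net.grinder.script import Test
    from net.grinder.script.Grinder import grinder
    from net.grinder.plugin.http import HTTPRequest

    page = Test(1, "Portal page").wrap(HTTPRequest())
    THRESHOLD_MS = 2000  # placeholder threshold, not an agreed baseline

    class TestRunner:
        def __call__(self):
            grinder.statistics.delayReports = 1  # hold the report until we decide
            page.GET("http://qa-server:8080/portal")
            if grinder.statistics.forLastTest.time > THRESHOLD_MS:
                grinder.statistics.forLastTest.success = 0  # count as failure
            # the delayed report is flushed automatically before the next test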
Establishing Baseline(s)
• Identify “core” scripts for a Sakai baseline
– All tools included in a release?
– A more limited subset?
• Identify related script clusters to baseline tool clusters
– Tools with dependencies?
– Tools with similar actions?
– Usage scenarios associated with specific campuses?
Community Sharing
• Scripts
– Set up performance test branch in Sakai svn?
– Associate test scripts with code in svn?
– Expand Confluence Performance WG site?
– Establish communication mechanisms during QA release cycles?
Community Sharing
• Scripts
– Sakai performance test branch in svn?
– Packaged along with tools in svn?
• Results
– Expand Confluence site?
– Contribute to QA “score card”
• Documentation
– On Confluence and/or in svn?
Contributing to Sakai QA
• Recommend performance “standards” to be part of the formal QA release process?
– Score cards
• Organizations volunteer to test specific tools based on usage needs
– Use shared scripts and contribute scripts/results back to the community
• Combined test results yield greater confidence in Sakai code
Moving from Concept to Community
• Performance Working Group BOF
– Wednesday, 11-12, BOF Room 1
• Check the BOF schedule for the exact location
• Looking to identify contributors
• Goal: establish a new working plan for WG contribution to the next conference
Michigan’s Contribution
• Leadership & initial design
• All test scripts used at Michigan
• All results from Michigan tests
• All documentation needed to conduct testing as currently implemented at Michigan
• All relevant materials for future tests
• Kernel testing for Next Generation Sakai