Validata Performance Tester
Deliver flexible, scalable performance testing across platforms

Agenda
1 Business Challenges
2 Validata Advanced Testing Suite
3 Validata Testing Approach
4 Validata Testing Methodology
5 Validata Performance Tester Overview
6 Benefits
7 Validata Performance Tester Case Study

Business Challenges
Testing the performance of web-based applications can easily miss the mark. It is easy to design unrealistic scenarios and to collect and measure irrelevant performance data. And even if you manage to design sound scenarios and collect the right data, it is easy to use the wrong statistical methods to summarize and present the results.
Traditional performance testing approaches involve performance testing teams very late in the implementation lifecycle. Applications are tested and tuned only in the latest stages of the project, and because the environment changes constantly, business needs are not successfully met. Deep, flexible and efficient testing coverage cannot be achieved with traditional testing tools.

Validata Advanced Testing Suite (ATS)
Validata Advanced Testing Suite (ATS) provides a full end-to-end automated testing capability that adapts easily to changes in the application under test, ensuring higher quality with reduced cost and effort. Validata ATS is a fully integrated testing and business process management solution, and the first model-driven test automation tool for Functional, Technical and Continuous Regression Testing.
Validata focuses on the analytics (the context and the content), providing root cause analysis that links requirements and testing. Reporting is available on demand from the Executive Dashboard Module.
• 30% increase in productivity
• 38% increase in assets re-use
• 20% increase in project success
• 80% reduction in current time to test
• 80% reduction in current spend on testing
• 50% faster time to market

Validata ATS Benefits
Efficient Testing
• Reduced testing time - less time to develop, a shortened application life cycle and faster time to market
• Reduced QA cost - the upfront cost of automated testing is easily recovered over the lifetime of the product; automated tests run many times faster, at much lower cost and with fewer errors
Improved Process
• Consistent test procedures - ensure process repeatability and resource independence, and eliminate manual errors
• Replicated testing - replicating tests across different platforms is easier with automation
• Results reporting - automated testing produces convenient reporting and analysis with standardized measures, allowing more accurate interpretation
Effective Testing
• Greater coverage - the productivity gains delivered by automated testing enable more, and more complete, testing; greater coverage reduces the risk of malfunctioning or non-compliant software
• Improved testing productivity - test suites can be run earlier and more often
Better Use of Resources
• Using testing resources effectively - testing is a repetitive activity; automation lets machines complete the tedious, repetitive work while team members are freed for other tasks and can focus on quality

Validata Testing Methodology
Testing Techniques:
• Model Driven
• Data Driven
• Keyword Driven
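As a generic illustration of the data-driven technique listed above (a sketch only, not how Validata ATS itself is implemented), the Python snippet below feeds rows of test data from a CSV file into a single parameterized test. The transfer() stub, the file name and the column names are hypothetical.

```python
# Generic illustration of data-driven testing: one parameterized test, many data rows.
# The transfer() stub, file name and CSV columns are hypothetical, not part of Validata ATS.
import csv

def transfer(debit: str, credit: str, amount: float) -> str:
    """Stand-in for the application call under test (e.g. a banking transaction)."""
    if amount <= 0:
        raise ValueError("amount must be positive")
    return "OK"

def run_data_driven(path: str) -> None:
    passed = failed = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):            # each row is one test case
            try:
                outcome = transfer(row["debit"], row["credit"], float(row["amount"]))
            except Exception:
                outcome = "ERROR"                # negative cases expect an error
            if outcome == row["expected"]:
                passed += 1
            else:
                failed += 1
    print(f"passed={passed} failed={failed}")

if __name__ == "__main__":
    run_data_driven("transfer_cases.csv")        # hypothetical test data file
```

Keyword-driven testing extends the same idea by also reading the action to perform (the keyword) from the data, so new scenarios can be added without writing code.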
Product Overview
Validata Performance Tester fulfills the needs of organizations for performance testing of web-based applications. It is fully integrated with Validata ATS, incorporates SWIFT, ARC IB and other Internet Banking applications, and is designed to deliver a faster, more cost-effective and specialized approach to testing the reliability and scalability of critical IT systems. The T24 adapters and the pre-built test scenario library accelerate the performance testing element of a project by 75%.

Test Types
Functional Testing:
• User Acceptance Testing (UAT)
• System Integration Testing (SIT)
• Interfaces Testing
• Message Testing (MSG)
Technical Testing:
• Unit Testing
• Performance Testing
• Continuous Regression Testing
• Parallel Testing

Objectives of Performance Testing:
• Ensure that the system provides adequate response times (verify performance requirements)
• Determine the maximum number of concurrent users (current system capacity)
• Meet end-user expectations
• Determine the optimal hardware and application configuration
• Identify performance bottlenecks
• Verify the scalability of the system
• Assess the impact of any hardware or software changes, new features or functions on site performance
Using the unique Validata ATS test engine adapter, Validata Performance Tester has the ability to perform parallel testing on multiple environments.

Types of Performance Testing
Load Testing
Simulates the expected number of users, with average user interaction times, over a short period of time and under the load conditions that will occur in a live production environment.
Focuses on:
• The number of users accessing the server
• The combination of business transactions that are executed
• The impact on the different environment components
Stress Testing
Simulates worst-case scenarios for a short period of time.
Focuses on:
• Locating the point at which server performance breaks down
• Steadily increasing the number of simulated users until a breaking point is reached
• Identifying performance issues that might not otherwise be seen
• Verifying that the web site/application will perform as expected under peak conditions
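Validata Performance Tester drives this kind of ramp through its own test engine and pre-built adapters; the snippet below is only a tool-neutral sketch of the stress-testing idea described above. It steadily increases the number of simulated users against a hypothetical HTTP endpoint and reports throughput, average response time and error count at each step. The URL, user steps and step duration are illustrative assumptions, not product defaults.

```python
# Tool-neutral sketch of a stress test: steadily increase the number of simulated
# users until response times or errors indicate a breaking point. The endpoint,
# user steps and durations below are illustrative assumptions.
import statistics
import threading
import time
import urllib.request

URL = "http://test-server.example/login"    # hypothetical application URL
STEP_DURATION = 60                           # seconds to hold each user level
USER_STEPS = [50, 100, 250, 500]             # simulated-user ramp

def simulated_user(stop: threading.Event, times: list, errors: list) -> None:
    """Issue requests in a loop, recording each response time or error."""
    while not stop.is_set():
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(URL, timeout=10) as resp:
                resp.read()
            times.append(time.perf_counter() - start)
        except Exception as exc:
            errors.append(type(exc).__name__)

for users in USER_STEPS:
    times, errors = [], []
    stop = threading.Event()
    threads = [threading.Thread(target=simulated_user, args=(stop, times, errors))
               for _ in range(users)]
    for t in threads:
        t.start()
    time.sleep(STEP_DURATION)                # hold the load at this level
    stop.set()
    for t in threads:
        t.join()
    throughput = len(times) / STEP_DURATION
    avg = statistics.mean(times) if times else float("nan")
    print(f"{users} users: {throughput:.1f} tx/sec, "
          f"avg response {avg:.3f}s, errors {len(errors)}")
```

A breaking point shows up when throughput stops growing between steps while response times and the error count climb.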
Scope of Performance Testing
Online Testing:
• Socket-based transactions (interfaces): ATM, POS & Mobile
• Browser-based transactions (HTTP): module executions and mixed executions
• IB application transactions (HTTP): module executions and mixed executions
• Noise transaction generation (T24 Browser)
Batch / Offline Testing:
• COB testing (daily, monthly, quarterly)
• Batch file transactions
• Report generation and interest accruals
• Account statements

Validata Performance Tester Methodology
Requirements Analysis
• Identify the required stakeholders, business analysts and infrastructure managers
• Organize and gather the business requirements
• Convert the business requirements into performance requirements and metrics
• Run workshops for knowledge transfer
Test Planning
• Collect the business-critical transactions
• Determine the required test volumes
• Prepare entry and exit criteria
• Prepare the testing schedules and estimations
• Check infrastructure availability
Test Design
• Identify the pre-test and post-test procedures
• Determine the test customization requirements and prerequisites
• Isolate the monitoring requirements and the metrics to be collated
• Create and review the test cases
• Create and review the workload scenarios
Test Execution
• Execute smoke testing
• Set up the required environment monitors
• Execute the test and collate the results
• Share the test results with the project team
• Schedule the next execution cycle after the issues have been resolved
Reporting
• Correlate test results from different test cycles
• Prepare the test summary document
• Present the test summary to the stakeholders
• Sign off

Validata Performance Tester Features
• Creation of cycles per transaction or group of transactions
• Mix of critical transactions for performance testing
• Pre-built performance test cases
• Execution of cycles with Validata's pre-built T24 adapters
• Cloning of cycles for multiple executions

Validata Performance Tester Metrics
Transaction-based metrics (metric captured - comments):
• Throughput (per sec) - transactions per second
• Response times / elapsed times - time taken to process a transaction
• Types of errors - totals for each type of error for a particular test
• Count of errors - total errors for a particular test
• Transaction count - total transactions processed for the time period
Server-based metrics (metric captured - comments):
• CPU usage - User%, System%, Idle%, Wait%, Logical CPU Used%
• Memory usage - used (GB) and memory available
• Disk I/O - disk read KB/sec, disk write KB/sec, IO/sec
• Network activity - MB/sec, packets/sec, size of packets, bandwidth used

Validata Performance Tester Reporting
Example report charts: Transactions per Second & Requests per Second, Hard Disk Usage, Errors Count, Available Memory.

Validata Performance Tester Case Study
With a network of over 40 branches and many Internet Banking customers, Mauritius Commercial Bank (MCB) required a robust environment to continue providing its customers with the level of service enjoyed prior to the implementation of Temenos T24. The bank therefore identified the need for a performance testing tool to help validate the configuration of the environment.
Challenges
• Aggressive project plan
• Performance testing of a constantly changing environment
• Need for performance testing in which the test cases could easily be updated to reflect the new environment
Solution Outline
• Developed a performance testing strategy and plan covering all aspects of environment testing
• Designated Validata resources and collaborated with those of the bank and Temenos to manage the process, development and execution of the tests
• Deployed the Validata Performance Tester solution to produce all management reporting for project progress and defect management
• Delivered the full solution on a fixed-fee basis
Benefits Realized
• Easily manage all performance testing processes from start to finish
• Identify and overcome design issues and performance bottlenecks
• Effectively concentrate the bank's resources by exploiting the product and resources provided by Validata
• Efficiently manage monthly testing costs
• Enjoy a cost-efficient solution that incorporated both resource and product

Validata Performance Tester Case Study: Distributions and Scalability
• Scalability from 50 to 2,500 virtual users
• 1-hour continuous executions
• Executions per module
• Mixed transaction executions

The Critical Differences
• Achieve full test coverage
• Decrease the total time of performance testing by up to 60-70%
• Maximum reusability of test assets with minimum effort to maintain them
• Scriptless creation of scenarios, achieving 100% automation
• Truly de-skilled, reducing the turnaround time by 50%
• On the cloud: remote access and multiple-site support
• Less time to prepare, faster time to market by 50%

Do You Have Any Questions?
We would be happy to help.