Case Study

Qualitest Performs Load Testing for a Leading e-Learning Provider

How do you confirm that your large-scale e-learning system can handle peak loads? Qualitest helps you pinpoint system weaknesses as you scale up.

Client Overview

The client is a leading provider of innovative solutions in the e-learning industry, delivering content, digital information services and course-driven training programs to instructors, students and other academic bodies — offerings that have transformed the learning experience.

Business Needs and Objectives

One of the organization's biggest challenges was keeping its mission-critical application at peak performance and scalability. Without a solid, well-framed methodology for finding or predicting system behavior and performance under realistic stress, the system was suffering. The client also faced problems such as:

  • Unavailability of bulk test data: the application database generally did not contain enough login IDs to run performance tests with multiple users
  • Complex, hard-to-analyze results: poorly constructed, confusing charts and tables often led to wrong decisions, damaging the client’s and the product’s reputation
  • Load generation tools did not interact with the client-side portion of the application, and conditional navigation was not handled properly
  • The database size of the application under test was not kept constant across test cycles, making results difficult to compare
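The bulk-test-data gap is often closed by generating synthetic credentials up front. As a minimal sketch (the class and ID format here are hypothetical, not the client's actual tooling), a helper that emits predictable login IDs for a load tool's data source might look like:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: generates synthetic login IDs that a load tool
// (e.g. JMeter's CSV Data Set Config) can feed to virtual users.
public class TestDataGenerator {
    public static List<String> generateLoginIds(String prefix, int count) {
        List<String> ids = new ArrayList<>();
        for (int i = 1; i <= count; i++) {
            // Zero-pad so IDs sort predictably: loaduser00001, loaduser00002, ...
            ids.add(String.format("%s%05d", prefix, i));
        }
        return ids;
    }

    public static void main(String[] args) {
        List<String> ids = generateLoginIds("loaduser", 3);
        System.out.println(String.join("\n", ids));
    }
}
```

Writing the generated IDs to a CSV file lets each virtual user pick up a unique account, so tests with thousands of concurrent logins no longer depend on whatever accounts happen to exist in the database.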

The Qualitest Solution

Performance testing is the process of ensuring that a product delivers superior performance and scalability. At Qualitest, our team of performance engineering professionals understands the core elements needed to assure that a system's performance meets or exceeds its goals and requirements, now and in the future. Our comprehensive consulting service aligns with the client's software development life cycle: we identify key business processes and requirements, understand the client's business and infrastructure, and help configure a performance testing environment that accurately simulates production. We follow up with timely and efficient testing, monitoring, in-depth analysis and tuning recommendations. Key functions performed by the Qualitest engineers for this client included:

  • Developed an in-depth understanding of the client’s application and system architecture, so that testing captured only the processes with the greatest potential impact
  • Scoped out areas of performance risk, performance goals and requirements to achieve proper focus and avoid expensive over- or under-testing, allowing potential performance bottlenecks to be identified early in the software development life cycle
  • Created scripts and scenarios to produce consistent, measurable and repeatable load tests, designed to mirror the client’s live production environment, user loads, business patterns and throughput, including, where applicable, projections for future growth; made performance tuning recommendations designed to improve response times and avoid resource bottlenecks
  • Performed trend analysis to identify scaling issues that could impact users down the road; documented the testing process and archived test scripts, data, results and environment configurations for future testing and baselining
  • Simulated standalone functional components by breaking scripting into an independent set of actions for each component in the application
  • Designed performance unit tests after thorough analysis of the application under development
  • Used JMeter as the scripting tool, with stand-alone Java code also written for modules to be tested
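As a hedged illustration of the stand-alone Java approach mentioned above (the harness and names below are hypothetical sketches, not the client's actual code), a module-level load driver can be as simple as a thread pool that times each call:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch: drives one module-level operation from several
// concurrent threads and records per-call latencies in milliseconds.
public class ModuleLoadHarness {
    public static List<Long> run(Runnable task, int threads, int callsPerThread)
            throws InterruptedException {
        List<Long> latencies = Collections.synchronizedList(new ArrayList<>());
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int t = 0; t < threads; t++) {
            pool.submit(() -> {
                for (int c = 0; c < callsPerThread; c++) {
                    long start = System.nanoTime();
                    task.run();                          // the module call under test
                    latencies.add((System.nanoTime() - start) / 1_000_000);
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        return latencies;
    }

    public static void main(String[] args) throws InterruptedException {
        // Dummy workload standing in for a real module call.
        List<Long> results = run(() -> Math.sqrt(12345.0), 4, 10);
        System.out.println("calls completed: " + results.size());
    }
}
```

Keeping each module behind its own small harness like this mirrors the "independent set of actions per component" scripting style: a component can be load-tested in isolation before the full JMeter scenarios exercise the assembled application.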

Key Benefits

Performance testing was conducted on Sakai 2.5.x (a revision from the branch between 2.5.0 and 2.5.1) to determine Sakai's baseline performance. Testing was done with a basic set of tools configured in each worksite. The system was loaded with over 90,000 users and a series of announcements, resources and gradebook entries. Concurrent-user testing began with a small number of users and gradually increased to support more and more users. This process also helped debug the test environment itself: fixing configuration errors and fine-tuning the number of database connections, Apache HTTP workers and AJP processors.
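Tuning the Apache HTTP workers mentioned above is typically done in httpd.conf. A minimal sketch, assuming the worker MPM of the Apache 2.2 era (the values are illustrative only, not the settings used in this engagement):

```apache
# Illustrative worker-MPM settings; real values were iterated per test cycle.
<IfModule mpm_worker_module>
    StartServers          4
    MinSpareThreads      25
    MaxSpareThreads      75
    ThreadsPerChild      25
    MaxClients          400
</IfModule>
```

The corresponding knobs on the application side would be the AJP connector's thread pool (e.g. `maxThreads` in Tomcat's server.xml) and the database connection pool size, which need to be sized together so no tier starves the one behind it.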

  • Successfully resolved numerous performance glitches in the code, working in close association with the development teams
  • Optimally sized various hardware server components, including the implementation of cache servers in the deployment model
  • Ran load tests through the various stages of the agile development cycle, with performance numbers reported for every deployment; with continuous improvement and milestones met, the spring and summer releases launched on time without any glitches

Special attention to database transaction rates and Java memory management was required to deliver the response times the client needed. Significant rework of the application, the database schema and the production environments yielded substantial performance improvements, savings in production costs and increased end-user satisfaction.
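Java memory-management work of this kind rests on measuring heap usage while the system is under load. A minimal sketch (the class name is hypothetical) using the standard `Runtime` API:

```java
// Hypothetical sketch: samples current JVM heap usage, the kind of
// measurement that underpins heap-sizing and GC-tuning decisions.
public class HeapSampler {
    public static long usedHeapMb() {
        Runtime rt = Runtime.getRuntime();
        // Used heap = total currently allocated minus free within that allocation.
        return (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
    }

    public static void main(String[] args) {
        System.out.println("used heap: " + usedHeapMb() + " MB");
    }
}
```

Sampling this periodically during a load run (alongside GC logs) shows whether response-time spikes line up with garbage collection pauses, which is usually the first question when tuning Java memory settings.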