Client Overview

The client is a leading provider of innovative solutions in the e-learning industry. They have transformed the learning experience by delivering digital information services, course-driven training programs and content for instructors, students and other academic bodies. This transformation has supported and advanced the knowledge, teaching, learning, research and service that are fundamental to higher education.

Business Needs and Objectives

The challenge with this large-scale online e-learning system was keeping its mission-critical application at peak performance and scalability. Application capacity was also an unknown risk, because there was no solid, well-framed methodology for predicting system behavior and performance under real-world stress.

Setting up a valid test environment and test suite posed multiple challenges, including:

  • Bulk test data – The test data required to simulate the vast number of login IDs involved a tremendous amount of planning, structure and data setup. QualiTest devised a solution to create over 100,000 student accounts, 2,000 instructors and over 2,000 classes, with each class containing multiple sessions, multiple types of content and multiple homework assignments (a simplified generator sketch follows this list).
  • Using the right tool – The choice of performance testing tool is critical when testing complex applications. Many open-source tools can submit the same requests as the more expensive commercial tools, but cannot handle the dynamic nature of testing an online class with multiple simultaneous protocols, or the real-time interactions between two different types of users on the system. QualiTest used one tool to generate the static background load on the system and another to handle the complex scripts. This approach saved the client money, as they did not need to purchase additional load-generation licenses to handle all of the traffic.
  • Asking the right questions and extracting the right results from a performance test – Many clients focus on ensuring that 90% of their pages load within some arbitrary amount of time. This approach makes sense for testing an eCommerce site, but it is not always the most comprehensive way to test more complex applications. A performance tester who focuses only on page response time may miss other issues in the application.
  • Evaluation of comparable results – If the database size of the application under test is not constant across test cycles, the results cannot easily be compared; nor is it reasonable to compare the results of a class of 300 students to those of a class of 2,000 students. Because the number of courses in the database was expected to grow rapidly, QualiTest and the client worked together to understand the rate of growth and the archiving process.
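
The bullet on bulk test data describes the scale of the seed data rather than how it was generated. As an illustration only, the following stand-alone Java sketch produces seed files of a comparable shape; the file names, CSV layout and credential values are assumptions made for the example, not the client's actual data or tooling. Files like these could then be fed to a load tool, for instance through JMeter's CSV Data Set Config element.

    import java.io.IOException;
    import java.io.PrintWriter;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.function.IntFunction;

    // Illustrative bulk seed-data generator (not the client's actual tooling).
    public class SeedDataGenerator {

        public static void main(String[] args) throws IOException {
            // Over 100,000 students and 2,000 instructors, matching the scale described above.
            writeCsv(Path.of("students.csv"), "login_id,password", 100_000, i -> "student" + i + ",Passw0rd!");
            writeCsv(Path.of("instructors.csv"), "login_id,password", 2_000, i -> "instructor" + i + ",Passw0rd!");

            // Over 2,000 classes, each with an instructor, several sessions and several assignments.
            try (PrintWriter out = new PrintWriter(Files.newBufferedWriter(Path.of("classes.csv")))) {
                out.println("class_id,instructor,sessions,assignments");
                for (int c = 1; c <= 2_000; c++) {
                    out.printf("class%d,instructor%d,%d,%d%n", c, 1 + (c % 2_000), 3 + (c % 3), 4 + (c % 4));
                }
            }
        }

        private static void writeCsv(Path file, String header, int rows, IntFunction<String> rowFor)
                throws IOException {
            try (PrintWriter out = new PrintWriter(Files.newBufferedWriter(file))) {
                out.println(header);
                for (int i = 1; i <= rows; i++) {
                    out.println(rowFor.apply(i));
                }
            }
        }
    }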

The QualiTest Solution

The client was most concerned about scalability, performance and functionality under high loads, as well as assurance of complete data privacy.

QualiTest’s team of performance engineering professionals analyzed the system’s core elements to ensure that its performance would meet or exceed the client’s goals and requirements, both immediately and in the future. Our comprehensive consulting service aligned with the client’s software development life cycle to identify key business processes and requirements, understand the client’s business and infrastructure, and assist in configuring a performance testing environment that accurately simulates the production environment. QualiTest supervised the process with timely and efficient testing, monitoring, in-depth analysis and tuning recommendations.

In this engagement, some of the key functions performed by the QualiTest engineers included:

  • Identification and development of an in-depth understanding of the client’s application and system architecture, so that only the processes with the greatest potential impact were captured.
  • Identification of areas of performance risk, performance goals and requirements, to achieve proper focus and avoid expensive over- or under-testing. This allows potential performance bottlenecks to be identified early in the software development life cycle.
  • Creation of scripts and scenarios to produce consistent, measurable and repeatable load tests. These tests and scenarios are designed to mirror the client’s live production environment, user loads, business patterns and throughput, including projections for future growth when applicable (a simplified scenario sketch follows this list). The QualiTest solution also includes performance tuning recommendations designed to improve response times and avoid resource bottlenecks.
  • Analysis of trends to identify scaling issues that may impact users down the road. This includes the documentation of our testing process and archived test scripts, data, results and environment configurations.
  • Simulation of standalone functionality components, achieved by splitting the scripting for each component of the application into an independent set of actions.
  • Design of Performance Unit Tests after thorough analysis of the application under development.
  • Use of JMeter, stand-alone Java code and SilkPerformer for scripting and load generation.
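
The tool list above names JMeter, stand-alone Java code and SilkPerformer, but the client’s actual scripts are not shown in this case study. The sketch below is therefore only a minimal stand-alone Java illustration of one scripted scenario – a student logs in, opens a class page and submits a homework assignment – with the host name, URL paths and form parameters assumed purely for the example.

    import java.net.CookieManager;
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    // Illustrative scripted scenario: one virtual student logs in, opens a class and submits homework.
    public class StudentScenario {

        private static final String BASE = "https://elearning.example.com";   // placeholder host

        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newBuilder()
                    .cookieHandler(new CookieManager())        // keeps the session cookie between steps
                    .build();

            long start = System.nanoTime();
            post(client, "/login", "user=student1&password=Passw0rd!");       // placeholder credentials
            get(client, "/classes/class1");                                   // open the class page
            post(client, "/classes/class1/assignments/1", "answer=42");       // submit a homework answer
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;

            System.out.println("Scenario completed in " + elapsedMs + " ms");
        }

        private static void get(HttpClient client, String path) throws Exception {
            HttpRequest req = HttpRequest.newBuilder(URI.create(BASE + path)).GET().build();
            check(client.send(req, HttpResponse.BodyHandlers.ofString()));
        }

        private static void post(HttpClient client, String path, String form) throws Exception {
            HttpRequest req = HttpRequest.newBuilder(URI.create(BASE + path))
                    .header("Content-Type", "application/x-www-form-urlencoded")
                    .POST(HttpRequest.BodyPublishers.ofString(form))
                    .build();
            check(client.send(req, HttpResponse.BodyHandlers.ofString()));
        }

        private static void check(HttpResponse<String> resp) {
            if (resp.statusCode() >= 400) {
                throw new IllegalStateException("Step failed: HTTP " + resp.statusCode());
            }
        }
    }

In a real test, many such scenarios would run in parallel while a separate tool supplied the static background load, as described in the Business Needs section above.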

Key Benefits

Performance testing was conducted on the base version of the development platform to establish baseline performance. The system was loaded with over 100,000 users and a series of resources, classes, homework assignments and gradebook entries. Concurrent user testing began with a small number of users and was gradually increased in steps (see the sketch below). This process also helped to debug the test environment, fix configuration errors and fine-tune the number of database connections, Apache HTTP workers and AJP processors.
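
The ramp-up itself is described above only in outline. As a rough sketch under assumed numbers, the driver below adds virtual users in stages and holds each stage long enough for the team to observe and tune database connections, Apache HTTP workers and AJP processors; the stage sizes, hold time and the runScenario placeholder are illustrative, not the client’s actual test plan.

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicBoolean;

    // Illustrative stepped ramp-up driver: start small and add virtual users in stages.
    public class RampUpDriver {

        private static final AtomicBoolean running = new AtomicBoolean(true);

        public static void main(String[] args) throws InterruptedException {
            int[] stages = {50, 200, 500, 1_000};          // concurrent virtual users per stage (assumed)
            ExecutorService pool = Executors.newCachedThreadPool();
            int active = 0;

            for (int target : stages) {
                while (active < target) {                  // add users until the stage target is reached
                    pool.submit(() -> {
                        while (running.get()) {
                            runScenario();
                        }
                    });
                    active++;
                }
                System.out.println("Holding at " + active + " concurrent users");
                TimeUnit.MINUTES.sleep(10);                // hold each stage while servers are observed and tuned
            }

            running.set(false);
            pool.shutdown();
        }

        // Placeholder: in a real test each virtual user would drive a scripted journey
        // such as the StudentScenario sketch shown earlier.
        private static void runScenario() {
            try {
                TimeUnit.SECONDS.sleep(1);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }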

Some benefits that resulted from these tests included:

  • Successful resolution of numerous performance defects in the code, achieved by working in close association with the development teams.
  • Optimal configuration of the various hardware server components, including the implementation of cache servers in the deployment model.
  • Load tests were run through the various stages of the agile development cycle: against each separate function as it became available and against each class type as it was developed, with performance numbers reported for every deployment. By continuously improving and meeting milestones, the summer and spring releases were launched on time without any glitches.

Database transaction rates and Java memory management received particular scrutiny in order to deliver the required response times. Significant rework of the application, database schema and production environments resulted in substantial performance improvements, savings in production costs and increased end-user satisfaction.
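
The case study does not say how Java memory management was examined. One common, low-effort approach, sketched below only as an assumption about how it might be done, is to sample heap usage through the standard java.lang.management API while the load tests run; in practice the same MemoryMXBean is usually read remotely over JMX from the application server, and the ten-second sampling interval here is an arbitrary choice.

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryMXBean;
    import java.lang.management.MemoryUsage;
    import java.util.concurrent.TimeUnit;

    // Illustrative heap monitor: samples JVM heap usage while a load test runs.
    public class HeapMonitor {

        public static void main(String[] args) throws InterruptedException {
            MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
            while (true) {
                MemoryUsage heap = memory.getHeapMemoryUsage();
                System.out.printf("heap used=%d MB committed=%d MB max=%d MB%n",
                        heap.getUsed() >> 20, heap.getCommitted() >> 20, heap.getMax() >> 20);
                TimeUnit.SECONDS.sleep(10);   // assumed sampling interval
            }
        }
    }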