Qualitest Group Automates Testing for a Financial Company
Qualitest integrates automated testing into the DevOps flow, enabling rapid test development and on-demand testing.
The client specializes in active global and international equity strategies, employing sophisticated analytical modeling for active stock selection. Their proprietary database exceeds 40,000 securities across 100+ markets worldwide. Drawing on these extensive research capabilities, the client develops customized investment management solutions for its customers.
Business Needs and Objectives
The software systems were undergoing major changes. The regression suite was very large due to the complexity of the algorithms used in the business scenarios. Under pressure to deliver on time while also supporting the business, the team frequently could not complete the full regression, which resulted in defects reaching their production systems.
The client moved from an agile development SDLC to an agile/DevOps SDLC. The mandate was to provide a fully automated regression suite that would run as part of the nightly CI build, and to run a full regression suite within one day on the “QA” build. Doing this required a complete change in how QA and Development interact: putting design agreements in place, moving from a manual configure-and-deploy process to automation, building a new automation framework, and delivering automation in-sprint so tests could be kicked off by the CI pipeline.
The Qualitest Solution
The solution Qualitest architected for the client has multiple parts:
- Creating the continuous integration pipeline
- Code and script development
- The Qualitest KDT Automation Framework
- In-sprint delivery
- Dev/QA agreements
- Team training
After engaging with the customer to understand their challenges and where improvements could be made, Qualitest architected a continuous integration pipeline. Jenkins orchestrates the entire flow: it manages the critical version number applied to all artifacts and passes variables from one step to the next. The Qualitest QT Dashboard tracks every phase of the process, providing the necessary metrics to the management team.
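As a rough illustration of the version hand-off, a build step invoked by Jenkins might stamp the pipeline-wide version onto every artifact before passing control downstream. The sketch below is hypothetical, not the client's actual script; the `BUILD_VERSION` variable name and artifact names are assumptions:

```python
def stamp_artifacts(artifacts, version):
    """Apply the pipeline-wide version number to every artifact name."""
    return [f"{name}-{version}" for name in artifacts]

def run_stage(env):
    # Jenkins typically exposes pipeline variables to build steps as
    # environment variables; BUILD_VERSION is an illustrative name.
    version = env.get("BUILD_VERSION", "0.0.0")
    artifacts = ["app-server", "test-suite"]
    return stamp_artifacts(artifacts, version)

if __name__ == "__main__":
    print(run_stage({"BUILD_VERSION": "1.4.2"}))
```

Because every stage reads the same variable, the server build and the test automation package always carry matching version numbers.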
Code and Script Development
Development and QA work out of Microsoft Visual Studio Team Foundation Server (TFS), where user stories and tasks are tracked. TFS also acts as the code repository for all development code as well as the QA automation framework and the automation code itself. All QA activities have tasks associated with them so progress can be tracked. The test suites are built using Microsoft Test Manager (MTM), which allows execution progress to be tracked.
At build time, Jenkins uses TFS to create the build package for the servers. Jenkins also builds via Ant and assembles the necessary test automation, preparing it for download to the execution servers.
AWS CloudFormation orchestrates and provisions all the resources necessary to build both the environment and the test execution servers. Puppet is used to deploy the code onto the environments.
Prior to any tests being run, ServerSpec verifies the servers, ensuring that all of them are configured correctly and in the expected state before test automation starts. The automation tests utilize Qualitest’s KDT Automation Framework.
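ServerSpec itself is a Ruby tool; as a minimal Python analog of the kind of pre-flight check it performs, a verifier can diff a server's expected configuration against its reported state. The spec keys and service names below are illustrative:

```python
def verify_server(expected, actual):
    """Compare a server's expected configuration against its reported
    state; return a list of human-readable failures (empty = ready)."""
    failures = []
    for key, want in expected.items():
        got = actual.get(key)
        if got != want:
            failures.append(f"{key}: expected {want!r}, got {got!r}")
    return failures

# Illustrative spec: the services and ports a test server must expose.
EXPECTED = {"service:nginx": "running", "port:8080": "listening"}

healthy = {"service:nginx": "running", "port:8080": "listening"}
broken  = {"service:nginx": "stopped", "port:8080": "listening"}
```

Only when every server returns an empty failure list does the pipeline proceed to launch the automation.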
Qualitest KDT Automation Framework
A test automation framework is an integrated system that sets the automation rules for a specific product. This system integrates the function libraries, test data sources, object details and various reusable modules. These components act as small building blocks which can be assembled to represent a business process. The framework provides the basis of test automation and simplifies the automation effort.
The framework (a hybrid driven by keywords and data) provides a complete end-to-end test automation solution for QA. It is based on a three-tier architecture: the data layer, the KDT engine, and the common utilities.
The KDT Engine is responsible for initiating the test and for interacting between the test automation tool and the application. The KDT engine contains a KDT Driver which mainly executes the keywords and generates the result of the keyword and test case. It also has many utility scripts like database handling, file system handling, etc.
The keyword library contains all the application-specific operations or instructions, grouped together to form keywords. A test case consists of a set of keywords selected according to the functionality under test.
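A keyword-driven engine of this shape can be sketched in a few lines of Python. The registry, keyword names, and result format below are illustrative assumptions, not the actual Qualitest framework:

```python
# Keyword library: application-specific operations registered by name.
KEYWORDS = {}

def keyword(name):
    """Decorator that adds a function to the keyword library."""
    def register(fn):
        KEYWORDS[name] = fn
        return fn
    return register

@keyword("open_page")
def open_page(url):
    return f"opened {url}"

@keyword("enter_text")
def enter_text(field, value):
    return f"{field}={value}"

def run_test_case(steps):
    """KDT driver: execute a test case given as (keyword, args) rows,
    as they would be read from a datasheet, and record each result."""
    results = []
    for name, args in steps:
        try:
            KEYWORDS[name](*args)
            results.append((name, "PASS"))
        except Exception as exc:
            results.append((name, f"FAIL: {exc!r}"))
    return results
```

Because the driver only dispatches by name, new application behavior is supported by adding keywords to the library; existing test cases need no code changes.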
Test execution starts with a Java file-selector pop-up, a script that invokes the core KDT (Keyword-Driven Test) engine. The engine initiates test script execution and simultaneously captures the logs. When the test script module encounters a low-level command for a specific component, it determines the component type.
Keywords are the core of the KDT framework. Each keyword represents a single logical step in the testing process, and test scenarios are composed of multiple keywords executed sequentially. Keywords should therefore be neither too simple nor too complex. If keywords are too simple, tests become tedious and repetitive to write (e.g., “click on this field”, “enter this text”, “click on the next field”, “enter more text”, “click this button”). Conversely, if keywords are too complicated, they are not easily reusable, and a new keyword would be needed for each situation under test.
Thus, keywords should strike a balance between generality and specificity. Good example keywords are: log in to an application; fill in a section of a form with interrelated information (e.g. Username, Password etc.); Clear/Cancel/Save a form.
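The “log in to an application” example above can be pictured as a composite keyword built from primitive UI actions. The primitives and element names here are stand-ins for whatever the automation tool provides:

```python
def click(element):
    """Primitive UI action: too fine-grained to be its own keyword."""
    return f"clicked {element}"

def type_text(field, text):
    """Primitive UI action: types text into a field."""
    return f"typed into {field}"

def log_in(username, password):
    """A well-sized keyword: one logical business step composed of
    several primitive actions, reusable across many test cases."""
    return [
        click("login-link"),
        type_text("username", username),
        type_text("password", password),
        click("submit"),
    ]
```

The test author writes one `log_in` row in the datasheet instead of four click-and-type rows, which keeps tests readable without sacrificing reuse.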
After each test case completes, a report is generated detailing the test’s execution flow and showing which keywords executed successfully and which failed (including a screenshot). The report is stored in the reports folder, and a link to it is stored in the test case file.
Delivering Automation In-Sprint
For the QA team to be able to deliver automation in-sprint, the following agreements were made with Development and the project team:
- User stories are made available and clearly laid out before the sprint open/close meeting. This helps the test team prepare good questions for the meeting and plan accordingly.
- Mock-ups/HTML of the UI design are made available by the product design team to the dev and test teams at the same time, so a tester can design the automation test case in parallel (pseudocode or the layout of the keywords).
- When the dev team completes the UI integration with the back end, the DOM elements are made available to the tester to complete the test case. The tester should work closely with the dev team so that, as soon as the XPath or identifier of a page object is available, the QA team can make the changes immediately.
- Page object identifiers/XPaths must be implemented so that all (or most) page objects are uniquely identifiable. This avoids unnecessary rework on the automation when a new feature is added in subsequent builds/sprints.
- Appropriate automation tests are identified and plugged into the continuous integration pipeline. Configuring the test suites to run after each deployment to the QA/Dev environments, on multiple browsers and on different cloud machines, saves a lot of time and helps establish the stability of the build at an early stage; using the cloud, multiple automation instances can be brought up and down whenever required.
Train the Team
In order to keep the domain knowledge intact, we decided to use the Qualitest KDT Framework. This allows the core automation team to develop the keywords while testers without automation experience create the datasheets that drive the tests. Training was completed with the manual testers to ensure they knew how to properly populate the spreadsheets, make the tests part of the CI pipeline, and debug the tests when errors were reported.
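The datasheets a manual tester authors in a spreadsheet can be exported and parsed into the driver's input rows. The column names and keyword values below are assumed for illustration:

```python
import csv
import io

# A datasheet as a manual tester might author it, exported to CSV:
# one keyword per row, arguments in the later columns.
SHEET = """keyword,arg1,arg2
open_page,https://example.com,
enter_text,username,alice
"""

def load_steps(text):
    """Parse datasheet rows into (keyword, args) pairs for the driver,
    dropping empty argument cells."""
    rows = csv.DictReader(io.StringIO(text))
    steps = []
    for row in rows:
        args = [v for k, v in row.items() if k != "keyword" and v]
        steps.append((row["keyword"], args))
    return steps
```

Since the spreadsheet carries only keyword names and data, a tester with no automation experience can add or modify test cases without touching framework code.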
By utilizing the Qualitest solution, we were able to migrate our client’s 86 applications into DevOps, enable manual testers to leverage automation, create and run a full regression within every sprint, and reduce the number of defects detected in production to near zero. By using the agreements outlined above and the QT KDT Framework, the QA team stayed in sync, delivering automation tests alongside code within the sprints. After the first quarter’s 423 releases to production, the result was not a single rollback and only 3 hotfixes. Medium- and high-severity production defects were eliminated, low-severity defects were dramatically reduced, and the technical debt in the backlog fell to a manageable level. By shifting testing left and performing it early, Development was able to fix bugs quickly, making on-time delivery a reality for the organization.