Qualitest’s Time Zone API Testing Keeps Everything on Schedule
When a compliance company needed to confirm that time zone tracking would audit correctly, we found ourselves in the zone, with our tests running like clockwork, as we dove deep into localization testing.
The client is a corporate compliance and risk management consulting company. They support banking and tax audit efforts for companies worldwide that use workflow management web forms, whose timestamps must be properly tracked for record-keeping and auditing purposes. Synchronization between users requires signoff dates to match: the workflow aspect of the software would break if users could not sign off forms with the proper date/time information, so keeping accurate timestamp records is crucial.
The client provides their consultants with pre-packaged and custom-tailored software. The development team built its own GRC (Governance, Risk, Compliance) solution. The software is mostly web-based, though Excel add-ins are also developed. The web-based architecture follows a three-tiered .NET pattern with SQL on the back end. The out-of-the-box components function as workflow management software with reporting capabilities, dashboards, bulk editing of records, and customization of web forms.
Business Needs and Objectives:
Prior to Qualitest’s engagement, many issues were being reported in production. Testing was being performed by financial and corporate compliance consultants with little software testing knowledge, which also took time away from their core business interactions with clients (big banks, media corporations, and other large institutions). The development team (five developers) loved programming but had little interest in testing.
Much of the testing required proper usage of multiple time zone APIs for logged events, records, batch record editing, and workflow signoffs, requiring coverage of all global time zones. This kind of localization testing must consider different times of the year, leap years, holidays, and daylight saving time (whose rules may vary between countries even within the same time zone). Workflow management web forms were used worldwide and required proper timestamp tracking for record-keeping and auditing purposes.
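The core invariant behind this kind of testing is that two users signing off at the same instant must produce matching audit timestamps regardless of their local time zones, even around daylight saving transitions. The client’s stack was .NET; the following is an illustrative sketch of the idea in Python using the standard `zoneinfo` module, not the client’s actual test code:

```python
# Illustrative sketch only: the signoff-timestamp invariant, shown with
# Python's standard-library zoneinfo (IANA tz database, Python 3.9+).
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Two users sign off at the same instant from different locations.
instant = datetime(2024, 3, 31, 0, 30, tzinfo=timezone.utc)
london = instant.astimezone(ZoneInfo("Europe/London"))
sydney = instant.astimezone(ZoneInfo("Australia/Sydney"))

# Their local wall-clock displays differ...
print(london.isoformat())  # 2024-03-31T00:30:00+00:00
print(sydney.isoformat())  # 2024-03-31T11:30:00+11:00

# ...but mapped back to UTC the signoff instants must match exactly,
# which is the invariant the audit trail has to preserve.
assert london.astimezone(timezone.utc) == sydney.astimezone(timezone.utc)

# DST complicates matters: half an hour later, London's clocks jump
# from 01:00 to 02:00 local, so 01:30 local never occurs that day, and
# naive date/time arithmetic around the gap can silently go wrong.
```

The example also hints at why every zone needed coverage: the DST gap above exists only in some zones, on dates that differ by country and hemisphere.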
The Qualitest Solution
We were largely on our own to self-manage and determine what was needed. The developers and the compliance consultants, while very good at what they do, lacked interest and background in testing. Much of the work involved the design, writing, and execution of functional tests, including use of a bulk Excel editing tool. First, however, we had to learn how things worked (documenting the system as we went, since little documentation existed) and formalize a standardized test suite based on IEEE test design templates, so that we were not reinventing the wheel every time. Once we greatly expanded the scope of coverage, more bugs started pouring in; after these were fixed, the bug count fell back down to a more manageable level.
Due to the minimal test documentation at project launch, face-to-face requirements gathering was necessary. Our method of system breakdown, which our lead tester had learned directly from our CEO Yaron Kottler, provided the firm foundation that had been lacking, and the client loved it. System-wide test coverage was documented and shared with the development teams. The coverage was effective enough that developers used it to coordinate their own development tasks as well, for two releases.
Testing processes improved incrementally over time, with improvements proposed and implemented as we went. One of the largest contributions was the increased test coverage, which gave greater visibility into the client’s software and uncovered more defects. Coverage continuously grew as confidence in the testing methods increased. The test approach was well documented, and issues found during testing were properly submitted. Qualitest was introduced to more and more features to test over time, even the most difficult ones, such as the GRC solution and an in-house “custom fields” language, tested with its ANTLR grammar as a reference. Testing started earlier and earlier in the development cycle; eventually, tests were being written before any development was available for testing.
Let’s look at some of the test variables used for time zone testing:
- All locations (there were usually multiple locations per time zone; an early test plan shows 114 locations, which were each assigned a test user)
- All web forms (they often contained multiple timestamps)
- Mobile devices (different versions of iPads and iPhones)
- Browsers (multiple versions of Internet Explorer, Safari, Chrome and Firefox)
Key Results:
- Achieved end-to-end test coverage on every supported browser and mobile device
- Right-shoring helped maintain the client’s fast-paced software delivery schedule
- Produced reusable, documented test coverage of every major, minor, and custom feature
- Suggested tracking tool improvements: minor ones were implemented immediately, with major improvements planned for the long term
- Testing responsibilities transitioned over time from non-testing personnel to Qualitest as the sole testing entity; both the testing workload and coverage have grown continuously since project inception
- Standardized complete test documentation sets and improved them over time; the documentation requirements were agreed upon and tailored to the client’s needs
- Wrote tests for batch tools to quickly increase the scenario count; batch tests were documented for reuse and regression testing, and test scripts were developed and reused across test cycles
- Began testing at the same time as development, and often beforehand, identifying necessary changes and issues earlier in the delivery cycle and avoiding project delays
- Improved the test harness; designed, implemented, and reused test setup configurations and input data
- Vastly improved coverage throughout the projects
- Identified several performance bottlenecks
- Discovered issues early, many of them browser-dependent; cross-browser coverage reduced production bugs
“When you design better tests, learn the software better and increase test coverage, you find a lot of issues that you wouldn’t even think are there. And if you don’t test for those, the users will find these types of things. We’re providing value.”
Lewis Sall, Project Lead, Qualitest Group