Client Overview

The Met Office is a Trading Fund of the Ministry of Defence (MOD), monitoring and predicting weather systems across the globe. It is a commercial organisation focused on scientific research and on providing one of the best weather services in the world.

Business Needs and Objectives

The Met Office was exploring the possibility of outsourcing part of its testing, potentially through offshoring. Following discussions with Qualitest, however, it became apparent that offshoring would not be viable: it would mean sending development work off-site, which was not an option.

The Qualitest Solution

As a cost-effective alternative, Qualitest proposed the use of one of our crowdsourcing partners. This provides access to a global network of testing experts, currently over 20,000, through which, within a fixed time and budget, the testing of a development is opened up to targeted specialists. These testers are graded and profiled, so it is possible to choose exactly how and where an application is tested.

The Met Office was interested in the breadth and diversity of the community, as well as the cost-effectiveness offered by the partner through Qualitest. Qualitest currently runs events (www.zappers-community.com) in which teams compete to find bugs within a limited time (usually an hour) using the crowdsourcing platform.

These events are open to all within the testing community, but to help the Met Office experience this in a real environment, Qualitest hosted a closed event for them, pitting the Met Office team against the worldwide community. This event enabled their whole development team, including testing, to evaluate the benefits that crowdsourcing could bring to the entire process. Each team had a mix of analysts, developers and testers so that there would be an opportunity for knowledge sharing amongst team members.

Execution

The event focused on testing the final stages of one of the Met Office’s new web applications. The release was opened to the wider crowdsourcing community as well as the Met Office event attendees, which enabled them to see the speed and quality of bug finding across the globe. The Met Office was impressed when the crowdsourcing community found more bugs than expected in just 2 hours. In addition, many of the bugs found by the crowdsourcing community had not been found by the Met Office team; they found different things, which shows the benefit of a “fresh set of eyes” in testing.

The Met Office described the value of crowdsourcing in terms of the speed of defect detection and the ability to find different defects which may otherwise have gone undetected: 65 defects were detected in 2 hours for the equivalent cost of 3 full-time testers working for 1 day, or 1 full-time tester working for 3 days.

“Because you created a small team of people from different working groups and different teams and put them together in a competitive environment that, as a model, worked well.”
Bob Doubell, Senior Tester, Met Office

Each team was given the Zappers rules and the current bug list so that no duplicates were raised. The event highlighted some interesting and unforeseen insights into the rejection process and into relating the uncovered bugs to the specifications. This proved a key factor, underlining the importance of well-defined testing requirements.

Dave Underwood, Deputy Director of IT Services at the Met Office, commented on the importance of a joined-up approach to testing: “Spreading the word about what testing is about, to nail those requirements in the early stages, and be good at describing them, so that you can test against them.”

The partner’s platform also offers greater scope for building relationships with testers from across the globe.

“One of things that we like about the partner’s environment is that we can build up a knowledge and experience of the people participating, so I am pretty certain that over a short number of repeats of the process we will 1) start to identify people who recognise us as someone that they would like to test for and 2) we will have built up an experience of people who have been good at spotting things that we ourselves had not.”

Dave Underwood, Technology & Information Services Programme Manager, The Met Office