Many hands make software work

There was a time when software testing was a neglected afterthought, a tedious chore for developers once their real work was done.

A number of trends are changing that perception, however. First is the emergence of the Agile framework of software development, which advocates shorter cycles of development and testing, thereby positioning software testers closer to the creative process. Another significant development is the advent of web applications. When software is deployed on the Internet, bugs become apparent very quickly and very publicly.

Automated testing tools have come a long way, meanwhile, but they are far from foolproof. And while organisations may now have a greater appreciation of the value of software testers, testers still represent an extra drain on development budgets. As the example of online retailer I Want One Of Those (IWOOT) demonstrates, a dearth of testing resources can handicap a development operation.

IWOOT, an enthusiastic user of Agile development, had hired freelance testers to work with in-house developers. But because the supply of code that needed testing was not constant, there were times when these freelance testers were doing nothing, and times when they couldn’t test fast enough, slowing time to deployment.

“Software testing was becoming a bottleneck,” recalls Sagar Vadher, IWOOT’s head of IT, “and we were spending more money than we were getting value from.”

To address this concern, the agency that supplied IWOOT’s freelance testers suggested using U-Test, an online marketplace where over 18,000 software testers of varying skills and experience offer their services remotely. The site allows development projects to ‘crowdsource’ testers, quickly recruiting a large number of people on an on-demand basis. IWOOT decided to try out U-Test’s network of testers to analyse some new functionality it had added to its payments processing application.

It set a budgetary limit (testers are paid a certain rate per valid bug found) and uploaded the code to U-Test’s testing environment. The job attracted 45 testers, who in a matter of hours found 34 valid bugs in the software.

“Three or four of the bugs were things that we hadn’t seen as issues before, but once they’d done the testing we realised they were a problem,” recalls Vadher. “That reflects the wealth of experience that is available in ‘the crowd’”.

For Vadher, the economics of ‘crowdsourced’ testing are highly compelling. “If I wanted to hire a full-time tester, it would cost me £100 or £200 a day. And if we had one person doing these tests, it would have taken them three or four days,” he explains. “But [using U-Test] cost us just $560 (£328), and it is on a piece-wise basis; we switched it off after a few hours.”

A time and a place

U-Test CEO Doron Reuveni acknowledges that crowdsourced testing is not appropriate for all projects.

“This is not suitable for financial trading systems or defence systems, for example,” he says. But for web and mobile application testing, he says, the proposition is unbeatable.

Indeed, 75% of the work conducted via the service concerns applications for the iPhone. U-Test’s customers include software vendors, from start-ups to established companies such as QuickBooks developer Intuit, and enterprise development shops. Typically, they use the service in addition to internal testing capabilities.

“Crowdsourcing is another tool in the toolkit for CIOs or software developers,” explains Tony Prosser, director of TCL, the testing agency that introduced IWOOT to U-Test. “It can be used to complement your existing test cycles, it can be used in pre-release deployment, and we even have one telco customer who is interested in using it for post-deployment testing.”

Each tester on U-Test lists the equipment they use. Customers can therefore source a variety of platforms on which to test their code. “When you are doing validation testing [on different platforms], using the crowd is crucial,” argues Reuveni, “because if you have to do it in-house, either it will take too long or you won’t be able to cover what you need to cover.”

Paul Herzlich, software testing analyst at Ovum, sees crowdsourcing services such as U-Test as an appropriate tool for certain circumstances.

“If you are testing software that all kinds of strangers are going to use, then why not use a bunch of strangers to test it,” he says. “But if you are testing software that is going to be used by people who have a certain degree of domain knowledge, you should probably be a little more suspect of it.”

“Also, it depends on what kind of testing you need to do,” he says. “For testing user interfaces, sure – it makes sense.”

The typical enterprise application, however, needs to be tested by people with domain experience. But while the crowdsourcing model might not be able to provide that today, Herzlich sees in it the potential for a system that allows organisations to use their own non-IT employees’ time to test internally developed applications.

“If you took the organisational model of crowdsourcing, but did it in a less synchronous way – if people could pick it up at lunchtime, or when trading is slow – it could be a very useful technique for more formalised testing,” he says.

Pete Swabey
