The market for software development, and with it the need for Quality Assurance (QA) testing, grew almost exponentially during the Internet boom of the 1990s.

Testing companies flourished, new tools were developed, and processes were refined. Demand for skilled QA engineers was high. Things changed when the market declined sharply in the early 2000s. The demand for quality software was still there, for the most part, but budgets shrank.

Companies had to rethink the development process.

Some companies, Microsoft for example, have adopted a sort of ‘hot fix’ model where apparently very little QA testing is done on software before its release.

By releasing a never-ending stream of updates and patches, Microsoft essentially uses its customers as beta testers. Microsoft’s unique place in the market makes this model possible, but it could prove catastrophic for smaller developers.

A study by the National Institute of Standards and Technology shows that fixing a bug after release costs 5 to 32 times more than fixing the same bug during QA. The real value of quality assurance clearly comes during development.

The widespread adoption of Agile development offers a more realistic alternative.

Unit testing, incorporated into each sprint, speeds up the QA process and makes it more adaptive to design changes. This fits perfectly with Agile’s philosophy of producing working software quickly and incrementally. Unit testing at every step also allows for quicker and more comprehensive integration testing at the end of the development cycle. Last year’s release of Xamarin’s Test Cloud is a prime example of the ongoing advances in testing tools and frameworks.
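For readers less familiar with the practice, here is a minimal sketch of what a unit test looks like, written in Python with the standard unittest module. The discount_price function and its expected behavior are hypothetical, purely for illustration; the point is that tests like these run automatically within the sprint, so a regression is caught minutes after it is introduced rather than weeks later.

```python
import unittest


def discount_price(price, percent):
    """Hypothetical production function: apply a percentage discount to a price."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


class DiscountPriceTest(unittest.TestCase):
    """Unit tests exercise one small piece of behavior in isolation."""

    def test_applies_discount(self):
        # A 25% discount on $100.00 should yield $75.00.
        self.assertEqual(discount_price(100.00, 25), 75.00)

    def test_rejects_invalid_percent(self):
        # Discounts over 100% are rejected instead of silently producing bad data.
        with self.assertRaises(ValueError):
            discount_price(100.00, 150)


if __name__ == "__main__":
    unittest.main()
```

Because tests like these are cheap to run, teams typically execute the whole suite on every code change, which is what makes the later integration testing faster and less risky.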

A recent study by Capgemini and Hewlett-Packard projects that QA spending will rise to 28% of IT budgets this year. Taken on its own, that figure seems very high, and it makes quality assurance look like a tempting target for budget cuts.

On the other hand, what are the costs of poor quality assurance practices?

Honestly, it’s hard to quantify. Hard numbers are difficult to find, for several reasons: above all, the data needed to calculate the Cost of Poor Quality (COPQ) are often subjective when dealing with software. For those interested, here is a great paper on calculating COPQ from the International Journal of Soft Computing and Software Engineering.

It is much simpler to answer that question qualitatively, though: customer experience. In an increasingly competitive software market, bad software is a bad idea. In the retail market, a single bad app can ruin your brand. In the corporate world, liability issues arise if software doesn’t meet agreed-upon standards and SLAs.

“Consumer expectations regarding apps are really high, so when people’s experience is not satisfying, they are going to go elsewhere and look for an alternative. It’s therefore very important for app developers and service providers to test and optimize.”

Jonathan Freeman, Professor of Psychology, University of London, and Managing Director of i2 media research at Goldsmiths

Maybe it’s time to rethink your quality assurance practices.

Quality assurance represents a significant portion of a project’s overall budget. The cost of not doing it well can be even higher. Better practices and tools can help you save money while maintaining, or perhaps even improving, quality.

Quality assurance is a necessary step in delivering a great product. At Seamgen, we take pride in our QA process; let our testers find the errors in your product. They won’t judge you, but your customers might. Contact Seamgen today for a consultation.