
Performance testing is not a luxury, it’s a necessity

Recently I chanced upon a post asking whether software testing is a luxury or a necessity. My first thought was that testing should not be a luxury; it's much more expensive to face an issue when it arises in a live system, so if you can afford that, perhaps that is the true "luxury". In any case, testing is now accepted as a fundamental part of the software lifecycle, and Test-Driven Development (TDD) underlines this.

Unfortunately, software testing is too often understood to mean only functional testing, and that is a serious limitation. Only if your software will be used by a very small number of users can you avoid caring about performance; only if it manages completely trivial data can you avoid caring about security. Yet even advanced companies that practise TDD often don't consider performance and penetration tests, or at best run them only at the User Acceptance Test (UAT) stage.

Working for a company that is frequently asked to run performance tests for clients in the few days before a live release, we are often confronted with all the problems the development team has ignored. It's hard to have to say "your system can't go live with the expected workload" when the client's marketing department has already advertised the new release.

Intechnica stresses the performance aspect so much that we now follow a "Performance-Driven Development" approach. Performance requirements are collected together with the functional ones and drive the design in terms of system architecture, software and hardware. Performance tests are then run alongside the unit tests throughout the software lifecycle, following Continuous Integration practice, as in the sketch below.
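To make this concrete, here is a minimal sketch, in Python, of what a performance assertion running alongside ordinary unit tests in CI might look like. The function under test (process_order) and its 50 ms budget are hypothetical placeholders, not code from a real Intechnica project.

```python
# A minimal sketch of a performance assertion that runs with the
# ordinary unit test suite in CI. process_order and the budget below
# are hypothetical stand-ins for a real requirement.
import time
import unittest


def process_order(order_id: int) -> str:
    # Stand-in for the real business logic under test.
    return f"processed-{order_id}"


class OrderPerformanceTest(unittest.TestCase):
    BUDGET_SECONDS = 0.05  # the performance requirement, captured as a test constant
    ITERATIONS = 1000      # average over many calls to reduce timing noise

    def test_process_order_meets_latency_budget(self):
        start = time.perf_counter()
        for i in range(self.ITERATIONS):
            process_order(i)
        elapsed = (time.perf_counter() - start) / self.ITERATIONS
        # Fails the CI build if the average call exceeds its budget,
        # surfacing a regression as early as a functional failure would.
        self.assertLess(elapsed, self.BUDGET_SECONDS,
                        f"average latency {elapsed:.4f}s exceeds budget")


if __name__ == "__main__":
    unittest.main()
```

Expressed this way, a performance requirement is versioned, reviewed and enforced exactly like any functional contract, rather than checked once just before go-live.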

I think such an extreme approach may not suit everybody, but it addresses a real problem. I recently tested the performance of a new web application for a financial institution. We had already tested it three months earlier, but in the meantime the development team had added new features and fixed some issues, and the web interface had changed slightly; as a result, almost all the test scripts had become unusable and had to be rewritten.

This tells me that performance requirements must be considered throughout development. They should be expressed in the unit tests, because that is the only place where defined software contracts are exercised. Performance tests can still be run at the UAT stage, but, just as no serious company would run a UAT without first having functionally tested the code, so you should measure performance during development. An Application Performance Management (APM) tool is also highly advisable, to monitor the application and track down the cause of performance issues quickly, in development as well as in production.
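As a toy illustration of the kind of call-level timing an APM tool applies automatically and at scale, the sketch below wraps a function in a timing decorator and logs each call's duration. All names here (timed, fetch_statement) are hypothetical, and a real APM product does far more than this.

```python
# A toy illustration of call-level timing instrumentation: every
# decorated call is timed and logged, giving visibility into where
# latency accumulates. All names here are hypothetical.
import functools
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("apm-sketch")


def timed(func):
    """Record the wall-clock duration of each call to func."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        finally:
            elapsed_ms = (time.perf_counter() - start) * 1000
            log.info("%s took %.2f ms", func.__qualname__, elapsed_ms)
    return wrapper


@timed
def fetch_statement(account_id: int) -> list:
    # Stand-in for a call whose latency we want to observe.
    time.sleep(0.01)
    return [account_id]


if __name__ == "__main__":
    fetch_statement(42)
```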

Is testing a luxury? I'd prefer the luxury of a good dinner to the "luxury" of wasting money on an application found unfit for launch on go-live day.

This post was contributed by Cristian Vanti, one of our Performance Consultants here at Intechnica.


Webinar: Designing Applications for the Cloud

This webinar, from 6th March 2012, was hosted by Intechnica's Technical Director, Andy Still. Andy talked about the key principles of designing and migrating applications to the cloud. This includes scaling out, taking new and imaginative approaches to data storage, making full use of the wide range of products and services on offer from cloud providers (beyond hosting), and exploring the many flavours of hybrid solution, which mean all types of business can leverage the benefits of the cloud.

Andy has architected and built a number of cloud-based applications, specialising in highly scalable, high-performance, business-critical systems.

If you’re planning or considering moving to the cloud in 2012 then this webinar is essential viewing.

More Intechnica webinars