Is Performance Testing Really Non-Functional Testing?
Author: Navin Shankar Patel, Group Project Manager
The world of testing has evolved over the years and, like most evolutions in the technology world, it has spawned a plethora of methodologies and philosophies. We now have super-specializations in the testing world: UI testing, data services testing, service virtualization, and so on. However, some beliefs remain unchanged, especially the belief that the testing universe is dichotomous, characterized by functional and non-functional testing.
This duality of the testing universe has its roots in the traditional way of application development, in which the software development life cycle treats requirements, design, development and testing as distinct, sequential activities. Requirements are broadly classified as functional or non-functional. Functional requirements cover needs such as screens, workflows and reports, while non-functional requirements cover elements such as performance, security, and so on. In the traditional world, functional testing precedes non-functional testing (NFT), and more often than not non-functional testing is the last activity performed before deployment. The unsaid but commonly acknowledged view is that functional testing carries more weight in this bimodal world of testing. In this view, if the application screen shows the required details, we adjudge the functional verification complete and only then check whether the screen response time is within the desired limits.
In practice, NFT and performance testing are treated synonymously. But is performance testing really non-functional anymore? If 'functional' is defined as something the customer needs, can we say that the customer's performance expectations have less significance?
It is increasingly acknowledged that customer experience is of paramount importance in the digital world; it has almost become mission critical. Gone are the days when it was sufficient to validate the screen elements or the fields of a report in a batch process, with performance testing as an afterthought. With a generation of users indulged by the lightning speed of a Google or an Amazon, and customer experience centered on the speed of application performance, it is passé to treat performance testing as non-functional. It is, in fact, the most desirable customer requirement and the primary functional requirement!
In fact, the traditional definitions and processes of performance testing may not be applicable at all in this changing scenario. How does one define a performance testing process for a Big Data-driven realtime analytics application? Do we test only the so-called functional aspects and leave performance testing to a later stage? When the term 'realtime' itself signifies performance, is it not imperative that we test the performance aspects in parallel with the traditional functional features?
The change in thinking about performance testing is necessitated not only by technological disruption in the form of Big Data and IoT, but also by rapidly changing software delivery processes. With many corporations adopting DevOps and Agile methods, the release cycles of enterprise software are shrinking. In the older world, Performance Engineering (PE) and Performance Testing (PT) occupied the two extremes of a development cycle: PE focused on engineering aspects and pre-emption of issues, while PT focused on post-mortem and validation aspects. With the changed dynamics and shortened release cycles, performance testing is undergoing a 'shift-left', with many PT activities moving much earlier in the life cycle. It is not an exaggeration to claim that PE and PT will eventually converge. Quality Assurance is moving towards Quality Engineering!
In this light, IT organizations may need to fundamentally change the way they approach Performance Testing:
1) Treat non-functional requirements on par with functional requirements
2) Create Early Performance Testing (EPT) strategies for projects with longer gestation cycles
3) Create PT strategies to enable shift-left and integrate elements of PT into Unit Testing and System Integration Testing (SIT)
4) Leverage and integrate tools like Application Performance Monitoring (APM) in the unit testing and SIT phases
5) Redefine performance metrics and performance testing processes for newer areas such as Big Data and IoT
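As an illustration of point 3, a shift-left performance check can be as simple as a unit test that asserts both a functional result and a latency budget, so performance regressions surface in the same test run as functional defects. The sketch below is a minimal, hypothetical example in Python; the function under test, the data size and the 0.5-second budget are illustrative assumptions, not a prescribed implementation:

```python
import time
import unittest

def fetch_report(rows: int) -> list:
    # Hypothetical stand-in for application code, such as a
    # data-access call or report-generation routine.
    return [i * 2 for i in range(rows)]

class ReportPerformanceTest(unittest.TestCase):
    # Illustrative latency budget; in practice this would come from
    # the performance requirements agreed for the feature.
    BUDGET_SECONDS = 0.5

    def test_fetch_report_within_budget(self):
        start = time.perf_counter()
        result = fetch_report(10_000)
        elapsed = time.perf_counter() - start
        # Functional check: the report has the expected content.
        self.assertEqual(len(result), 10_000)
        # Performance check: the call stays within its budget.
        self.assertLess(elapsed, self.BUDGET_SECONDS)
```

Run with a standard test runner (e.g. `python -m unittest`). The point is not the timing mechanism itself but the placement: the performance assertion lives alongside the functional one, in the earliest phase of testing rather than the last.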