Testing Services provides a platform for QA professionals to discuss and gain insights into the business value delivered by testing, the best practices and processes that drive it, and the emergence of new technologies that will shape the future of the profession.


November 10, 2011

What are the challenges in SaaS Testing?

While researching which forms of cloud were seeing the most adoption in the market, I learned that they were primarily of the Software-as-a-Service (SaaS) kind. SaaS is largely being adopted because businesses realize they can get on board quickly with SaaS, with very limited upfront investment.


However, this marked increase in SaaS adoption has only raised subscribing enterprises' expectations of their overall SaaS investments. This naturally leads to the demand to "get it right the first time", which brings the focus onto SaaS testing. SaaS testing comes with its own set of challenges: live upgrade testing, validating data integrity and privacy, validating data migration from existing systems or other SaaS systems, validating integration with enterprise applications, validating interface compatibility, and shorter validation cycle times.


To find out more about the challenges that organizations face in SaaS testing, read my POV, co-authored with Shubha Bellave, at http://www.infosys.com/IT-services/independent-validation-testing-services/white-papers/Documents/saas-testing.pdf and do share your comments and feedback.

November 8, 2011

How is Testing Cloud-Based Applications Different from Testing On-Premise Applications in QA Clouds?

The cloud has been a unique revolutionary force that has driven enterprises, market analysts and end users to go "gung ho" about it, because of the impact it has created as Infrastructure as a Service (IaaS), Software as a Service (SaaS) and Platform as a Service (PaaS). Besides this, the cloud also comes with various deployment models, like private, public, federated and virtual private clouds, making it highly flexible and versatile from the standpoint of meeting business requirements successfully.


I have started to see organizations adopt clouds for their QA needs primarily for two reasons: either they have a stream of cloud-based applications rolling out that they want to validate, or they want to overcome their existing QA infrastructure limitations. Organizations often ask me questions like "Does our testing methodology need to evolve when we test applications on the cloud?" and "How is cloud testing different from traditional testing?"


Well, let's look at answering these commonly posed questions.



Testing On-Premise Applications in QA Cloud 


Organizations turn to cloud-based QA environments to overcome their existing QA infrastructure limitations. In such a scenario, the traditional, on-premise applications are tested by deploying them on QA clouds. The cloud is leveraged in an Infrastructure-as-a-Service (IaaS) model, predominantly availed through a private or public cloud deployment. The testing methodology does not vary in these cases when on-premise applications need to be tested for functional, performance and security requirements. The QA cloud provides a production-like infrastructure and the testing tools needed for capacity planning in an on-premise application's performance testing. When an on-premise application is tested on the cloud, it also needs a performance benchmarking exercise, where the performance of the application in an on-premise production or staging environment is benchmarked against its performance in a pre-production QA cloud.
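As an illustration of that benchmarking step, the sketch below compares the median response time of the same load-test scenario run on-premise and in the QA cloud. This is a minimal sketch of my own, not part of any standard toolkit: the `benchmark_delta` helper, the sample timings and the 10% tolerance are all assumptions chosen for illustration.

```python
from statistics import median

def benchmark_delta(on_prem_ms, cloud_ms, tolerance=0.10):
    # Compare median response times (milliseconds) of the same test
    # scenario on-premise vs in the QA cloud; flag runs where the cloud
    # deviates beyond the allowed tolerance (10% by default).
    base = median(on_prem_ms)
    cloud = median(cloud_ms)
    deviation = (cloud - base) / base
    return {"on_prem_ms": base, "cloud_ms": cloud,
            "deviation": round(deviation, 3),
            "within_tolerance": abs(deviation) <= tolerance}

# Response times sampled from identical load-test runs (illustrative data).
result = benchmark_delta([210, 220, 215, 230, 225], [240, 250, 245, 260, 255])
```

Here the cloud run is about 14% slower than the on-premise baseline, so it would fail a 10% tolerance and prompt a closer look at the virtualized environment before the results are trusted.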


Testing Cloud based Applications in QA Cloud


Organizations use different forms of cloud (IaaS, SaaS, PaaS), in various combinations, to roll out cloud-based applications and gain a competitive edge.  Testing cloud-based applications involves three scenarios:


1)    A portion of the application is migrating into the cloud,

2)    The application has been completely migrated onto the cloud, and

3)    The application is completely built on the cloud itself.


The testing methodology has to evolve in all these scenarios and needs to take into account the virtualized infrastructure, network, application business logic, data and end-user experience.  Testing cloud-based applications requires business workflow testing, validating exception-handling mechanisms, and simulating failure and disaster recovery scenarios. When cloud and enterprise application integration scenarios are included, testing also needs to cover comprehensive integration testing, API testing and billing mechanism testing (in the case of SaaS applications). Scenarios that involve partial or complete migration of an application into the cloud need to be tested for data migration, data security and data privacy. The focus of testing cloud-based applications also needs to include validating the accuracy of cloud attributes like multi-tenancy, elasticity, security, availability, interoperability and metering in a loaded multi-instance environment. Security validation calls for cross-site scripting testing, script injection testing, user access and roles testing, cookie and session isolation testing, and multi-tenancy isolation testing.
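To make the data migration check concrete, here is a small, hypothetical sketch that compares row counts and per-row checksums between a legacy table and its migrated copy. The `verify_migration` helper, the `users` table and the use of in-memory SQLite are my own stand-ins for whatever source and target systems are actually involved.

```python
import hashlib
import sqlite3

def table_checksums(conn, table, key="id"):
    # Build {primary key: checksum of the full row} so source and target
    # can be compared record by record, not just by row count.
    cur = conn.execute(f"SELECT * FROM {table} ORDER BY {key}")
    cols = [d[0] for d in cur.description]
    return {row[cols.index(key)]:
            hashlib.md5("|".join(map(str, row)).encode()).hexdigest()
            for row in cur.fetchall()}

def verify_migration(src, dst, table):
    # Flag rows missing from the target and rows whose contents changed.
    a, b = table_checksums(src, table), table_checksums(dst, table)
    return {"count_match": len(a) == len(b),
            "missing_ids": set(a) - set(b),
            "changed_ids": {k for k in set(a) & set(b) if a[k] != b[k]}}

# Illustrative in-memory stand-ins for the legacy and cloud databases.
src, dst = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for c in (src, dst):
    c.execute("CREATE TABLE users (id INTEGER, name TEXT)")
src.executemany("INSERT INTO users VALUES (?, ?)", [(1, "ann"), (2, "bob")])
dst.executemany("INSERT INTO users VALUES (?, ?)", [(1, "ann"), (2, "b0b")])
report = verify_migration(src, dst, "users")
```

In this example the row counts match, but the checksum comparison still catches the record that was corrupted in transit, which a count-only reconciliation would miss.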


We need to remember that there is no single, ideal approach to cloud testing. This is primarily because, when an organization embarks on cloud testing, various factors like the cloud architecture design and the non-functional and compliance requirements need to be taken into account to ensure successful and complete testing.

November 3, 2011

Collaborative Testing Effort for Improved Quality

Collaboration among the business, development and testing teams can reduce risk during the entire software development and testing lifecycle and considerably improve the overall quality of the end application. As a testing practitioner, I believe that testing teams need to begin collaborating at an earlier stage, as described below, rather than at the conventional point of the test strategy phase:

·         During the requirements analysis phase, the business/product teams need to collaborate with the development teams to validate the requirements.

·         The test data needs to be available earlier, and the testing teams need to collaborate with the business/product teams to validate the test data for accuracy and completeness and to check that it is in sync with the business requirements spelled out.

·         Collaborate with the development team and share the test data which can be used in the unit/integration testing phases.

·         Collaborate again with the business teams to formulate a combined acceptance test strategy which would help reduce time to market.

·         Collaborate with the development team to review the results of unit testing/integration testing and validate them.

·         Collaborate with business/product teams to validate the test results of the combined acceptance testing.
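The test data validation mentioned above can be pictured with a small sketch. This is my own illustration, with invented field names and business rules, not a prescribed implementation: it checks records for completeness (required fields present and non-empty) and accuracy (each field passing its stated business rule).

```python
def validate_test_data(records, required_fields, rules):
    # Completeness: every required field present and non-empty.
    # Accuracy: each populated field passes its business rule.
    issues = []
    for i, rec in enumerate(records):
        for field in required_fields:
            if not rec.get(field):
                issues.append((i, field, "missing or empty"))
        for field, rule in rules.items():
            if rec.get(field) and not rule(rec[field]):
                issues.append((i, field, "violates business rule"))
    return issues

# Hypothetical order records and business rules, for illustration only.
orders = [
    {"order_id": "O-1", "amount": 250.0, "currency": "USD"},
    {"order_id": "O-2", "amount": -10.0, "currency": ""},
]
issues = validate_test_data(
    orders,
    required_fields=["order_id", "amount", "currency"],
    rules={"amount": lambda a: a > 0, "currency": lambda c: len(c) == 3},
)
```

Running a check like this before the test data is handed to the development teams surfaces the gaps (the empty currency, the negative amount) while the business teams are still available to correct them.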

Testing at each lifecycle stage has its own set of challenges and risks. If potential defects are not detected early, they cascade further down the SDLC. However, an experienced and competent test practitioner can identify these defects early, at the stage where they originate, and address them in that same stage. Below are some examples that reinforce this point.


·         Good static testing of the functional, structural, compliance and non-functional aspects of an application during the requirements phase can prevent up to 60% of defects from cascading down to production.

·         Similarly, getting all the required test data (as specified by the business requirements) as early as the end of the requirements analysis phase instills a testing mindset early in the lifecycle, which improves test predictability.

·         Planning ahead for performance, stability and scalability testing during the system design phase can help reduce the cost of potential defects incurred later on. Proactive non-functional testing (as required by the business) also contributes significantly to faster time to market.

·         Test modeling during the test preparation phase helps avoid tight coupling between the system under test and the test environment. This eventually helps in achieving continuous, progressive test automation.

·         Collaboration with the development teams ensures that they have used and benefited from the test data shared by the testing teams. This collaboration also helps the testing teams validate the architecture, design and code through simple, practical in-process validations: static testing of the functional, structural and non-functional aspects of the application.

·         Mechanisms that help predict when to end testing are a key requirement during execution. One such mechanism is a stop-test framework built on an understanding of the application and its defect detection rate.
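A stop-test framework of this kind can be sketched very simply. The heuristic below is my own illustration, with an assumed window and threshold rather than values from any real framework: it recommends stopping once the average number of new defects found per test cycle drops below a threshold, the point at which continued testing yields diminishing returns.

```python
def should_stop_testing(defects_per_cycle, window=3, threshold=1.0):
    # Stop once the average number of new defects found per cycle,
    # over the last `window` cycles, drops below `threshold`.
    if len(defects_per_cycle) < window:
        return False  # not enough history to decide yet
    recent = defects_per_cycle[-window:]
    return sum(recent) / window < threshold

# Defects found in successive test cycles of a hypothetical release.
history = [12, 9, 6, 3, 1, 1, 0]
decision = should_stop_testing(history)
```

With this history the detection rate has flattened out, so the heuristic signals that test execution can wind down; in practice the window and threshold would be calibrated per application.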

All the approaches described above let testers save time and focus more on collecting the right metrics and maintaining dashboards during test execution. They also ensure that testing is not limited to just one phase but is a fine thread that runs across the entire SDLC, improving quality and reducing costs and time to market for all business applications.

The benefits of this collaborative approach are many. I have listed a few benefits based on my collaborative team experiences:

·         De-risks the entire software development process by embedding testing as an inherent part of each stage of the SDLC.

·         Defects are found early in the lifecycle, which reduces the total cost of fixing them at a later stage. The cost ratio between finding a defect at the requirements stage and finding the same defect in production can be as high as 1:1000.

·         Shortens the time to market, since this approach has a built-in self-corrective mechanism at each stage.