Testing Services provides a platform for QA professionals to discuss and gain insights into the business value delivered by testing, the best practices and processes that drive it, and the emergence of new technologies that will shape the future of the profession.

April 14, 2014

Changing perspectives in Testing - Adapting to evolving expectations

In the past decade or so, the basics of testing processes and tools have not really changed. Different tools have been developed to automate various phases of testing, each seeking to realize specific automation concepts. At its core, testing remains an activity directed at seeking and seizing defects and fixing them before the system goes live in production or is launched in the market. So what is new and changing in testing?

Testing professionals simply have to adapt to some changing scenarios while still holding on to time-tested principles and techniques. There are some subtle and some obvious contextual changes that testing teams need to be aware of and adapt to in order to stay relevant and deliver progressive value. Some of these relate to 'mindsets', while others concern the enhancement and evolution of existing techniques:


1.    Collaborative Testing. With the increasing adoption of Agile and progress towards realizing the DevOps vision, the testing team needs to shift its focus from bug detection to early, continuous feedback and contributions to improving quality. This implies testing early and often rather than 'testing after the developers are done'. It also implies a high degree of comfort with skeletal documentation and the ability to extrapolate and visualize requirements.


2.    Continuous Test Automation. Testing early and continuously also necessitates using a wide spectrum of tools (commercial, in-house and open source) and scripting languages. This is what I call Continuous Test Automation throughout the project lifecycle; some also call it extreme automation. It requires developing a programming mindset, combined with the tester's keen eye for finding defects and providing early feedback! I believe all test professionals need to develop these skills, not just the automation engineers.
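As a minimal sketch of what this looks like in practice (the function under test and its rules are illustrative, not from any particular project), a scripted check like the one below can run automatically on every code change, giving developers feedback within minutes rather than at the end of a test phase:

```python
# A minimal automated check suitable for running on every commit in a CI pipeline.
# validate_discount is a hypothetical function under test.

def validate_discount(price, discount_pct):
    """Apply a percentage discount, rejecting out-of-range inputs."""
    if not 0 <= discount_pct <= 100:
        raise ValueError("discount must be between 0 and 100")
    return round(price * (1 - discount_pct / 100), 2)

def test_valid_discount():
    assert validate_discount(200.0, 25) == 150.0

def test_invalid_discount_rejected():
    try:
        validate_discount(100.0, 150)
    except ValueError:
        pass  # expected: out-of-range discount is refused
    else:
        raise AssertionError("expected ValueError for discount > 100")

if __name__ == "__main__":
    test_valid_discount()
    test_invalid_discount_rejected()
    print("all checks passed")
```

The same script serves both purposes discussed above: it codifies the tester's expectation of a defect mode (invalid input) and provides continuous, automated feedback.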


3.    Visual Modeling. With the need for tighter and more frequent collaboration with the program team comes the need to use visual modeling tools. One example is activity models in model-driven testing. I have also seen many teams using mind maps to capture test design. The testing community has experimented much with model-driven testing; while it helps, many teams admit that it is often time and effort intensive. An area that is very little explored is defect prediction and modeling. Recently, an Australia-based banking customer asked us to propose ideas for defect modeling and visual defect heat-mapping techniques. Since testing is expected to 'seek and seize' defects, it is a good idea to focus on modeling defects and failure modes rather than modeling the entire requirements. This area needs further study and experimentation.
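One simple way to begin defect heat mapping (a sketch with made-up data, not the client engagement mentioned above) is to aggregate defect counts per module from the defect tracker and rank the modules; the 'hottest' modules are candidates for focused testing:

```python
from collections import Counter

# Hypothetical defect log: (module, severity) pairs, e.g. exported from a tracker.
defects = [
    ("payments", "high"), ("payments", "high"), ("payments", "low"),
    ("login", "medium"), ("reports", "low"), ("payments", "medium"),
]

def defect_heat_map(defect_log):
    """Count defects per module, hottest first."""
    counts = Counter(module for module, _ in defect_log)
    return counts.most_common()

# Render a crude textual heat map.
for module, count in defect_heat_map(defects):
    print(f"{module:10s} {'#' * count}  ({count})")
```

A real heat map would weight by severity and code churn, but even this crude roll-up directs attention toward failure-prone areas rather than modeling the whole requirement set.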


4.    Mission Risk Mitigation. This is about addressing the question 'what is the risk that this system will fail to achieve the stated IT mission goals?' and reporting those risks based on a sound analysis of metrics. It calls for a thorough understanding of the business goals and of how the current system under test is expected to contribute to them. This is what I call 'shift-up': the ability to appreciate the higher-order business goals, with continuous evaluation of risks supported by business-driven metrics analytics.


5.    Business-driven metrics analysis. This is related to the point above. Metrics need to be collected and reported at multiple levels of the hierarchy. Such reports must be accompanied by insights and recommendations that help management make critical business decisions. An important part of metrics analysis is alerts. An alert is a call for management action: a warning of an impending issue. The testing team often assumes that merely sending status reports is enough for management to take the necessary action. Far from it. Metrics must be analyzed for trends over time and for correlation with other related metrics in order to draw meaningful conclusions, support decisions, make appropriate recommendations and initiate management actions. Such a roll-up of analysis must address the decision-support needs of all levels of the organizational hierarchy. Some examples of alerts:


a.     Threshold alerts - a specific metric is below (or above) a threshold value and needs management attention.

b.    Correlation alerts - a specific metric is not consistent with another and needs further analysis. For example, the defect fix rate is lagging behind the defect find rate over the observation period. Another example: an area of code has a very high degree of churn but a relatively low defect find rate; this can mean hidden, lurking defects that demand focused testing techniques.

c.     Out-of-control alerts - a specific metric is out of control from a statistical perspective and needs attention.
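The first two alert types above can be sketched in a few lines of code (metric names, thresholds and numbers here are illustrative, not recommended values):

```python
def threshold_alert(metric, value, lower=None, upper=None):
    """Threshold alert: flag a metric that falls outside its configured band."""
    if lower is not None and value < lower:
        return f"ALERT [{metric}]: {value} is below threshold {lower}"
    if upper is not None and value > upper:
        return f"ALERT [{metric}]: {value} is above threshold {upper}"
    return None  # metric is within band; no action needed

def fix_lag_alert(found, fixed, max_backlog=10):
    """Correlation alert: defect fix rate lagging the find rate."""
    backlog = found - fixed
    if backlog > max_backlog:
        return f"ALERT: fix rate lags find rate by {backlog} defects"
    return None

# Weekly status snapshot (illustrative numbers).
print(threshold_alert("test pass rate (%)", 78, lower=85))
print(fix_lag_alert(found=120, fixed=95))
```

An out-of-control alert would follow the same pattern, but with the band derived statistically (e.g. three standard deviations around the historical mean) rather than set by hand.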


In conclusion, the fundamentals of testing process and technology have not really changed (and probably never will), but the context in which testers do their jobs is changing fast, and the testing community at all levels has to adapt to these changes to stay visible and relevant to senior management.

January 20, 2014

Social Media, Cloud, Analytics and Mobility (SCAM)

Social Media, Cloud, Analytics and Mobility: these are four common buzzwords today, and they are very much inter-related. Social media allows instantaneous interaction and the sharing of news, photos, videos and more. From a technical perspective, this requires elastic, omnipresent storage capability, which the cloud provides. The moment something is on the cloud, it can be big: big data. Small data can be hosted locally, but if data is big, the cloud is a good medium, and the data can be leveraged for analytics, which facilitates informed decision making. For the end user, all of this should be available at their fingertips; mobility facilitates that.

Continue reading "Social Media, Cloud, Analytics and Mobility (SCAM)" »

November 14, 2013

Are User feedback streams considered during Design ?

On the occasion of World Usability Day today, I wonder how many users are still struggling with interfaces to perform their intended tasks, and whether their voice was heard while those interfaces were being designed, be it a website or an application targeted at laptops, tablets or handheld devices.

The answer would still lean towards 'yes': there are probably still a good number of unhappy users. With so many sites and applications undergoing redesign, change and content updates every day, across geographies and for so many different target users, there is a real chance that the target users in scope were never identified or available to give feedback while the interfaces were being designed.

In a given project, where does the process of user feedback start and where does it end? What is the best design phase in which to hold such feedback sessions with users?

The following are the possible phases in which to hold user feedback sessions as part of an iterative design process:

1)  Wireframe/concept level (paper, or a wireframing tool)

2)  Visual design level (static JPEG images with look and feel and branding)

3)  HTML prototype level

4)  User acceptance test

5)  Live version

Notably, each phase provides a different type of feedback on the design in progress.

For example, the concept/wireframe phase yields user feedback primarily on the basic site structure, navigation elements, information content and menu naming conventions.

The visual design phase provides feedback on look and feel, branding, visual affordance, color and style, in addition to the information architecture.

It is recommended to hold the first such feedback session during the wireframe phase, simply because fundamental design changes driven by user feedback can be made quickly and with the least effort while the design is still taking shape in a wireframing tool or even as a paper prototype.

The longer these sessions are delayed into later stages of design, the more team members (visual designers, HTML developers) have to redo their work, and the more cost it adds to the project budget.

I believe that as the term 'usability' becomes widely used and understood, and as usability practice is evangelized across organizations in different domains, user feedback will become one of the core inputs to design for achieving a better user experience.

October 10, 2013

How different is providing Performance Strategy for BPM applications?

In the domain of financial services and banking, Business Process Management (BPM) systems are often at the heart of enterprise applications, as they provide flexibility, shorter time-to-market and better regulatory compliance with local laws, to name a few benefits. BPM technology is typically used to implement a 'workflow-based application' in which different 'tasks' are performed by different 'user groups' or 'actors' based on the 'state' of the data. The actors can be humans, or business processes that integrate with other systems through an Enterprise Service Bus (ESB) to complete a business process or activity.
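At its simplest, such a workflow is a state machine: each state defines which actor may act on it and which states it may move to. The toy model below (states, actors and transitions are invented for illustration, not taken from any real BPM product) shows the kind of behavior a performance test must exercise, since each transition typically involves queues, databases and ESB calls:

```python
# A toy workflow state machine: each state lists the actor (user group)
# allowed to act on it and the legal next states.
WORKFLOW = {
    "submitted": {"actor": "maker",   "next": ["reviewed"]},
    "reviewed":  {"actor": "checker", "next": ["approved", "rejected"]},
    "approved":  {"actor": "system",  "next": []},
    "rejected":  {"actor": "maker",   "next": ["submitted"]},
}

def advance(state, actor, target):
    """Move a work item to `target` if the actor and transition are valid."""
    node = WORKFLOW[state]
    if actor != node["actor"]:
        raise PermissionError(f"{actor} cannot act on state '{state}'")
    if target not in node["next"]:
        raise ValueError(f"illegal transition {state} -> {target}")
    return target

# A maker submits, a checker approves: a typical happy-path journey.
state = "submitted"
state = advance(state, "maker", "reviewed")
state = advance(state, "checker", "approved")
print(state)  # approved
```

A performance strategy for a BPM application would load-test the realistic mix of such journeys (including rejections and resubmissions), not just individual screens.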


Continue reading "How different is providing Performance Strategy for BPM applications?" »

Preparing ourselves for Application Security Testing


Haven't we all, as functional testers, done 'Authentication' and 'Authorization' testing? Almost every application demands some flavor of these tests. So, are we not already doing 'Application Security Testing'?

Let's explore what extra mile we need to traverse in each phase of the SDLC to say confidently that the applications we are testing are secure.


Continue reading "Preparing ourselves for Application Security Testing " »

October 3, 2013

Crowd Testing

The concept of crowdsourcing is not new. The practice of harnessing ideas, services or content from a large pool of unknown contributors has existed for centuries. For example, the Oxford English Dictionary was created through an open call to the community to identify all the words in the English language along with their usage; this call yielded 6 million submissions over 70 years! The Indian Government effectively used crowdsourcing to obtain entries for the symbol of the Indian Rupee, which finally led to the selection of the current symbol. On a lighter note, in India we see crowdsourcing all around us: a crowd of helpful volunteers trying to fix or push-start a broken-down automobile is a common sight here!

Continue reading "Crowd Testing" »

September 30, 2013

Testing-as-a-Service (TaaS): Take a Peek


In my last post, I wrote about how the market is adapting to TaaS. Here I talk about 'What is TaaS?' and 'Why do we need to evolve?'

What is TaaS?

Considering all the prevailing aspects of New Age QA, major QA service suppliers are now formalizing their own TaaS models for test outsourcing. Fundamentally, TaaS does not necessarily represent a true cloud service, but it incorporates aspects of the cloud. This paradigm shift in thinking has led every QA service provider to build and continuously mature a TaaS model. TaaS requires us to bundle together testing infrastructure, test tools, accelerators, the human effort of delivering testing services, delivery methodology, frameworks and best practices, and to apply them uniformly.


Continue reading "Testing-as-a-Service (TaaS): Take a Peek" »

September 27, 2013

Boons of Cloud Testing--Agility and Automation

In my previous blog I discussed how the traditional way of testing makes business less agile and less adaptable to automation. Here I will explain how cloud testing delivers manifold advantages, creating a testing model that is more agile and rich in automation and orchestration.


Continue reading "Boons of Cloud Testing--Agility and Automation " »

September 23, 2013

Testing-as-a-Service (TaaS) - Changing Times

In a recent third-party survey of Infosys clients, one message came through clearly: customers are moving away from the traditional staff-augmentation model towards a single custodian for their entire testing needs. There is an increased focus on making testing services predictable, accessible and available without compromising on quality or cost. Customers are looking for outcome-based sourcing and pricing when outsourcing testing services. We can see how New Age QA is evolving from a testing phase into Business Assurance. Customer loyalty has become very important, and one cannot compromise on application quality; hence, within application delivery, software testing remains an essential element of business operational efficiency and risk management.

Continue reading "Testing-as-a-Service (TaaS) - Changing Times" »

September 18, 2013

Retail industry challenges that demand a more mature Testing organization


We live in a digital world and a mobile society. This has changed our lives completely, and even more so for retailers: one bad experience, online or on a mobile device, and retail customers are gone forever. Unlike most industry domains, such as banking and financial services or healthcare, the retail industry is uniquely challenged by high operational overhead and lower profit margins. Overheads include maintaining a large number of retail stores, whereas the profit from selling a piece of merchandise can be as low as a few pennies. Yet retail IT systems are still expected to provide the same level of quality that the banking or healthcare industries require.


In this blog, I present challenges that specifically demand an efficient SDLC process, backed by faster software development as well as an efficient testing process.


Continue reading "Retail industry challenges that demand a more mature Testing organization " »