Testing Services provides a platform for QA professionals to discuss and gain insights into the business value delivered by testing, the best practices and processes that drive it, and the emergence of new technologies that will shape the future of this profession.


November 26, 2016

Predictive analytics - An emerging trend in QA

Author: Indumathi Devi G., Project Manager, Infosys Validation Solutions

As digital transformation rapidly changes business operations, quality assurance (QA) also has to change from traditional quality control to intelligent quality assurance. Nowadays, clients not only want to test software adequately, but also as early and thoroughly as possible. To accomplish these goals, it is important to opt for shift-left testing and predict failures even before applications are handed over for testing. Today's business dynamics require QA professionals to make critical decisions quickly. It is imperative to make use of the avenues at their disposal, such as customer feedback, defect data, and test results, to make prompt decisions.

What is Predictive analytics?

Predictive analytics is the practice of extracting useful information from data sets using statistical algorithms and machine learning in order to determine patterns and predict future outcomes and trends. It is a data-driven technique that can be leveraged to predict failure points in testing activities. It has the power to help optimize the use of project data and support proactive decisions. By applying statistical algorithms, predictive analytics identifies patterns in the data and provides a forecast of how the data will behave in the future.

Predictive analytics uses several algorithms to process the data. Some of them are as follows:

  • Regression algorithms
  • Time series analysis
  • Machine learning
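
As a minimal sketch of the first of these, a least-squares regression over historical defect counts can forecast the defect trend for an upcoming cycle. The sprint counts below are hypothetical illustration data:

```python
# Minimal sketch: fit a least-squares trend line over historical defect
# counts per sprint and extrapolate one sprint ahead.
# The defect counts are hypothetical illustration data.

def fit_trend(counts):
    """Fit y = slope * x + intercept over sprint indices 0..n-1."""
    n = len(counts)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(counts) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, counts))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

def forecast(counts, sprints_ahead=1):
    """Extrapolate the fitted line to a future sprint index."""
    slope, intercept = fit_trend(counts)
    return slope * (len(counts) - 1 + sprints_ahead) + intercept

defects_per_sprint = [42, 37, 35, 30, 28, 24]  # hypothetical history
print(round(forecast(defects_per_sprint), 1))  # predicted next-sprint count
```

A real QA forecast would of course use richer features (code churn, module history) and a proper time-series or ML model; this only illustrates the mechanics.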

Why analytics in QA?

Predictive analytics is widely used across most industries today. Testing has never been an easy activity; it involves many aspects that need to be managed efficiently for better results. The need of the hour is for software teams, and especially QA teams, to leverage analytics to streamline and seamlessly perform software testing activities.

Today, various technologies that we use and tasks we perform in software testing life cycle (STLC) generate enormous amounts of data. Storing that data, and analyzing it using state-of-the-art tools and analytic solutions in a timely manner will make that same data work for you instead of simply taking up space on a hard drive.

Predictive analytics is not a one-off activity. We need to continually analyze and infer insights and make adjustments in QA practice for better results. It is also important to have sufficient data to make reasonable predictions.

The customer is king and their feedback matters

It is business critical to listen and react to customer feedback. Sentiment analysis is extremely useful in social media monitoring as it allows us to understand customer feedback about specific products or applications. Sentiment analytics frameworks make that process quicker and easier than ever before.

Collect customer sentiment through proven means from all available sources and use analytics techniques to arrive at insights. This helps QA teams identify the areas they need to focus on, based on the issues customers report, such as compatibility, performance, or functional issues. Embrace customer centricity while strategizing QA to deliver better quality and an improved customer experience.
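
To make the idea concrete, here is a deliberately simple lexicon-based sentiment sketch. The word lists and feedback strings are hypothetical; a production setup would use a trained model or an NLP library rather than hand-picked words:

```python
# Minimal sketch of lexicon-based sentiment scoring for customer feedback.
# POSITIVE/NEGATIVE word lists and the feedback strings are hypothetical.
import re

POSITIVE = {"great", "fast", "love", "smooth", "reliable"}
NEGATIVE = {"crash", "slow", "broken", "fails", "freezes"}

def sentiment(text):
    """Score a feedback string by counting positive vs. negative words."""
    words = re.findall(r"[a-z]+", text.lower())
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

feedback = [
    "App crashes on checkout, payment fails every time",
    "Love the new search, fast and smooth",
]
for f in feedback:
    print(sentiment(f), "-", f)
```

Aggregating such labels per feature area (checkout, search, login) is one way QA teams can translate raw feedback into a focus list.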

Customer feedback analytics

  • Helps in identifying key issues of the customers when they use the digital channels
  • Provides insights to prioritize testing and increase QA efficiency

Make social analytics one of the key inputs when formulating the QA strategy. Data captured from social media gives insights into customer sentiment. It helps identify areas of focus from past performance based on negative sentiment and supports decision making. It provides a 360-degree view of application behavior in production as well as its impact on customer sentiment. It helps QA teams minimize risk, increase agility, and bring customer centricity to the QA approach.

Information is wealth; make the best use of it

Each and every task performed in the QA process generates enormous amounts of data. Every time you run a test, you are creating log files, logging defects, and compiling test results and reports. Defect logs, test results, production incidents, project documentation, and application log files carry a lot of detail and, when used intelligently, can work wonders.

Examine the defects identified in test and production environments and assess how they impact customer experience. Identify critical issue patterns and align test scenarios to ensure adequate coverage. Data, combined with predictive analytics algorithms, allows you to find patterns and make increasingly accurate predictions about future failures. For example, optimize the order-processing workflow of a retail website based on data showing at which step many customers abandon the site during a transaction. Root cause analysis of defect data reveals the hotspots of the application and supports risk-based testing. Analyzing defects helps QA teams prioritize and optimize testing, making QA faster and more focused.
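
The hotspot idea above can be sketched with nothing more than a frequency count over defect records. The modules and severities below are hypothetical:

```python
# Minimal sketch: surface defect "hotspots" by module and severity so that
# risk-based testing can focus on them. The defect records are hypothetical.
from collections import Counter

defects = [
    {"module": "checkout", "severity": "critical"},
    {"module": "checkout", "severity": "major"},
    {"module": "search",   "severity": "minor"},
    {"module": "checkout", "severity": "critical"},
    {"module": "login",    "severity": "major"},
]

hotspots = Counter(d["module"] for d in defects)
critical = Counter(d["module"] for d in defects if d["severity"] == "critical")

# Rank modules for risk-based testing: most defects first.
for module, count in hotspots.most_common():
    print(f"{module}: {count} defects ({critical[module]} critical)")
```

In practice the same counting, done over months of defect and incident data and weighted by severity, is what turns "gut feel" about risky modules into a defensible test prioritization.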

Apply machine learning algorithms to mine the test case repository, arrive at an optimized regression suite, and identify duplicate or redundant test cases. Using analytics on previous test results helps forecast the future pass rate and lets QA teams focus on the unstable modules of the application.
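
As a hedged sketch of the de-duplication step, near-duplicate test cases can be flagged by textual similarity. The test case titles are hypothetical, and a production approach would likely cluster on test steps and coverage data rather than titles alone:

```python
# Minimal sketch: flag near-duplicate test cases by textual similarity,
# a first step toward an optimized regression suite. Titles are hypothetical.
from difflib import SequenceMatcher
from itertools import combinations

test_cases = {
    "TC-101": "Verify login with valid credentials",
    "TC-102": "Verify login with a valid credential",
    "TC-203": "Validate order checkout with saved card",
}

def duplicates(cases, threshold=0.85):
    """Return ID pairs whose descriptions are suspiciously similar."""
    pairs = []
    for (id_a, a), (id_b, b) in combinations(cases.items(), 2):
        if SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold:
            pairs.append((id_a, id_b))
    return pairs

print(duplicates(test_cases))  # candidate redundant pairs for review
```

The flagged pairs are candidates for human review, not automatic deletion; two similar titles can still cover different data sets.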

QA teams should opt for tools that continuously monitor application log files and trigger the relevant test scripts in an unattended manner. This helps in early detection of potential failure areas so preventive action can be taken, thus reducing potential defects.
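
A minimal sketch of that log-to-test mapping is below. The error signatures and script names are hypothetical; a real setup would tail the live log and invoke an actual test runner:

```python
# Minimal sketch: map error signatures found in an application log to the
# test scripts that should be triggered, unattended. Patterns and script
# names are hypothetical placeholders.
import re

TRIGGERS = [
    (re.compile(r"TimeoutError.*payment", re.I), "run_payment_perf_suite"),
    (re.compile(r"NullPointerException.*cart", re.I), "run_cart_regression"),
]

def scripts_to_trigger(log_lines):
    """Scan log lines and return the (deduplicated) scripts to kick off."""
    triggered = []
    for line in log_lines:
        for pattern, script in TRIGGERS:
            if pattern.search(line) and script not in triggered:
                triggered.append(script)
    return triggered

log = [
    "2016-11-21 10:02:13 WARN  TimeoutError while calling payment gateway",
    "2016-11-21 10:02:14 ERROR NullPointerException in cart service",
]
print(scripts_to_trigger(log))
```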


Going beyond traditional QA methodologies and taking an analytics-based approach has become a key factor in the next generation of QA. Predictive analytics helps predict future failures from past data and take proactive measures.

Predictive analytics in QA

  • Gives insights on the customer sentiments that help in customer-centric QA
  • Provides insights to start, stop, and prioritize testing
  • Increases testing efficiency and predictability
  • Improves customer experience
  • Reduces overall cost due to early defect detection
  • Accelerates time to market

Predictive analytics helps improve the efficiency and effectiveness of QA operations and ultimately helps in better understanding the end user. I strongly recommend using predictive analytics to deliver beyond the reach of traditional QA practice.

We would love to discuss about the Infosys Predictive Analytics in QA solution and many other such solutions and service offerings with you. Infosys is a Silver sponsor of HPE Discover 2016. Do drop in at Booth #134 for a quick chat. More information on our participation is here.

November 22, 2016

A large brewer decodes social media with Infosys

Author: Surya Prakash G., Delivery Manager, Infosys Validation Solutions

Digitization has become the buzzword in every industry vertical, as end consumers have been swept away by the digital world. The advent of the Internet of Things (IoT) means that smart products, services, factories, and operations are replacing traditional ones. Data from social media is enhancing decision making and increasing revenues by bringing unstructured data into analysis. This is leading to an exponential increase in data analysis, interpretation, and the meaningful use of social media data. With this, the focus of data testing has to move beyond volume and variety to include the velocity and veracity of data.

Recently, Infosys validated data from social media channels, by using the Infosys 4D++ framework to achieve faster time to market and also reduce cost for a global brewing client.

This client, a leading beverage and brewing company, generates over US$1 billion in annual revenue and sells more than 200 brands. It has offices across 24 countries and a 25 percent global market share. We partnered with the client to enhance its focus on enriching relationships with customers and communities. Our big data implementation enriched customer data using third-party feeds collected from social networking sites. This enriched customer data, in turn, enabled targeted campaigning and business expansion. The objective of this complex testing endeavor was to validate huge volumes of unstructured data coming from different social networking sites and verify data quality and reports.

The following lists some of the key components of our testing approach:

  • End-to-end testing to ensure successful consolidation and implementation of huge data coming from numerous sources
  • Setting up a stable testing environment
  • Robust functional and user interface (UI) testing including look and feel testing
  • Consolidation and validation of data from multiple sources which are in different formats

To solve these challenges, Infosys developed an approach to validate 100 percent of the data by testing at each conversion stage, covering all data permutations with automated tools. End-to-end data validation spanned everything from data ingestion to data visualization (including mobile validation), performed with automated utilities developed for the different stages of data conversion.
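
One such stage-wise check can be sketched as reconciling row counts and an order-independent checksum between the source and target of a conversion stage. The record sets below are hypothetical; the actual utilities were proprietary:

```python
# Minimal sketch: reconcile one conversion stage by row count plus an
# order-independent digest of the rows. Record sets are hypothetical.
import hashlib

def fingerprint(rows):
    """Return (row count, digest) where the digest ignores row order."""
    digests = sorted(hashlib.md5(repr(r).encode()).hexdigest() for r in rows)
    combined = hashlib.md5("".join(digests).encode()).hexdigest()
    return len(rows), combined

source = [("u1", "like", "brandX"), ("u2", "share", "brandX")]
target = [("u2", "share", "brandX"), ("u1", "like", "brandX")]  # reordered

# Reordering during conversion is fine; dropped or altered rows are not.
assert fingerprint(source) == fingerprint(target)
print("stage reconciled:", fingerprint(source)[0], "rows")
```

Running such a check after every stage is what makes "100 percent validation at each conversion stage" tractable: a mismatch localizes the data issue to one stage instead of the whole pipeline.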

Here are the key benefits of our solution:

  • Reduction of cost of quality (COQ) by 10-15 percent through early detection of data issues at source during data ingestion
  • 15 percent reduction in validation effort due to automated data validation using utilities at various stages
  • 100 percent test coverage through automated approach for validation of all scenarios
  • Time optimization using query repository

We would love to discuss this and many other such exciting implementations on Big Data testing with you. Infosys is a Silver sponsor of HPE Discover 2016. Do drop in at Booth #134 for a quick chat. More information on our participation is here.

A.E.I.O.U of New Age Quality Engineering

Author: Srinivas Yeluripaty, Sr. Industry Principal & Head, IVS Consulting Services

In today's digital world, 'change' is the only constant, and organizations are grappling with ways to meet the ever-changing expectations of key stakeholders, especially ubiquitous consumers. With GDP transformed by the mobile economy, globalization leading to "Global One" customers, the payment industry moving from "Cashless" to "Card-less" to "Contactless" transactions, and an ever-growing emphasis on security and compliance, the expectations on IT are reshaping significantly. To achieve that pace and flexibility, organizations are increasingly adopting agile methods and DevOps principles.

The 2016 State of DevOps survey (https://puppet.com/resources/white-paper/2016-state-of-devops-report) provides some very interesting findings. High-performing organizations in the DevOps space deploy 200 times more frequently than low performers, recover from failures 24 times faster, lower their change failure rates, and shorten lead time for deployment.

So what does this mean for testing? If you speak to any 'Head of Testing' these days, the key questions in their mind are: Is testing really going to be important? How will my testing organization survive in the Agile / DevOps world? Where will the ownership for testing and release quality stay? Is my test organization ready to adopt DevOps?

The testing function, which certifies the quality of software and products, will definitely need to undergo changes, and will need to focus on creating better value and agility. With DevOps success criteria focusing on "Deployment frequency, Lead time for changes, Mean time to recover, Change failure rates" there is a greater emphasis on quality of the small chunks / features that move into production and hence greater need for testing, especially a quality engineering-centric testing approach.

Quality engineering brings in a significant shift in testing processes, roles, automation stack and testing engagement models, and helps achieve 'Engineered Quality'. Infosys' renewed approach to Quality Engineering-driven transformation is simple, with the acronym "A.E.I.O.U." which stands for: Automate, Eliminate, Integrate, Orchestrate and Uberize.

The quality engineering approach will accelerate adoption of emerging testing paradigms; below is a summary of those changes:

  • The traditional 'Quality Assurance' methodology of testing as a phase after the completion of the entire development process will change to continuous testing throughout the Agile / DevOps cycle. 'Engineered Quality' will become the key to testing success.
  • The ability to apply AI/ cognitive algorithms and testing techniques to gain actionable insights so that an enterprise can focus their testing efforts on the areas that matter most
  • With the adoption of shorter release cycles and continuous deployment, the time to test will reduce significantly. Hence, automation and optimization will be the highest priorities.
  • The automation test suite should be able to intelligently identify when to kickoff certain tests, when to stop and when to restart to achieve seamless CI-CD pipeline testing.
  • Dual Shift Strategy: Use shift left (QA working closely with development for early performance testing, security testing) and shift right (incorporating continuous quality across Dev & QA deployment)
  • Transformation of testing roles into full-stack testers capable of testing front-end, middle-tier, and back-end apps
  • Intelligently manage the data and environment dependencies
  • Shift from independent testing COEs to federated COEs and consumerize the ways of testing
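
The "intelligently identify when to kick off certain tests" point above can be sketched as change-based test selection in a CI-CD pipeline. The module-to-suite mapping and file names are hypothetical:

```python
# Minimal sketch of change-based test selection in a CI pipeline: only the
# suites mapped to changed modules are kicked off. The mapping and the
# changed file names are hypothetical.

MODULE_TESTS = {
    "payments/": ["payments_unit", "payments_contract", "checkout_e2e"],
    "search/":   ["search_unit", "search_e2e"],
    "ui/":       ["ui_smoke"],
}

def select_suites(changed_files):
    """Return the deduplicated list of suites impacted by the change set."""
    suites = []
    for prefix, tests in MODULE_TESTS.items():
        if any(f.startswith(prefix) for f in changed_files):
            for t in tests:
                if t not in suites:
                    suites.append(t)
    return suites

changed = ["payments/gateway.py", "ui/theme.css"]
print(select_suites(changed))
```

Real implementations derive the mapping from code coverage or dependency analysis rather than a hand-written table, but the pipeline decision, "run only what the change can break", is the same.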

IVS Consulting, a global group of test consultants, is driving this change for our customers across the globe. For more details, visit https://www.infosys.com/IT-services/validation-solutions/service-offerings/Pages/quality-engineering.aspx.

Infosys is a Silver sponsor of HPE Discover 2016. Do drop in at booth #134 and learn how Infosys experts can help in your digital transformation initiatives. More information on our participation is here.

November 21, 2016

SAP test automation approach using an HP UFT-based solution

Author: Kapil Saxena, Delivery Manager

Problem statement
Most businesses that run on SAP, or plan to implement it, must consider multiple factors, as their entire business runs on this backbone. Their major worries are testing effectiveness, preparedness, cost, and time to market. The Infosys SAP testing unit has answers to all four, which have been well implemented and proven, but I am reserving this blog for the last two.

Companies planning an SAP implementation often aspire to deliver in an Agile/DevOps model, but we have not seen this model work out frequently. The issues arise mostly on the system integration side, when SAP ERP Central Component (ECC) finally interacts with hundreds of legacy systems to make end-to-end flows work. I will focus on how we can make pure SAP implementations possible candidates for agile testing.

Approach to be taken

We chose HP Unified Functional Testing (UFT) to build our solution for the simple reason that it is open to code changes outside the tool itself. It gives our engineers the flexibility to make changes on the fly directly in the code rather than recording the changes again. The tool also provides the flexibility to create reusable functions that handle exceptions and draw intelligent conclusions based on data collected over years of testing SAP ECC and learning its behavior. As we all know, testing any system in reality is very different from testing it in an ideal working environment.

Solution: Our functional consultants gathered a list of the most frequently used transaction codes (t-codes) and their variants from hands-on experience across multiple engagements. This helped us create a repository of 1,500 reusable t-codes in a solution called Infosys Test Planning Accelerator (ITPA). These run uninterrupted in vanilla SAP systems. We took the next step and, using our proprietary automation framework, automated each t-code. The uniqueness of our solution ensures that we take a maximum of 15 minutes to customize a single t-code to fit any client landscape. With this solution, any company implementing SAP ECC gets a pre-automated repository of important t-codes that can be made ready to use in a very short interval of time. This comes in very handy when quick system testing is required. We propose the following approach to our clients in order to achieve close to agile testing:

  1. Developer makes changes in advanced business application programming (ABAP) code of a t-code and pushes it to the development environment
  2. A tester, who is associated with the developers, is notified about this. He/she runs the automated t-code in this environment. In case of failure, the tester quickly fixes the code and runs it to success within 15 minutes. This helps perform unit testing for as many data sets as possible, reducing the chance of defects in upstream integration. It also makes the t-code ready for later use
  3. Once all the t-codes are updated and unit testing is done, the solution is ready for system testing. In this phase, we arrange all the t-codes for each system, say for example, sales and distribution (SD), in a sequence and run them in batches to ensure that the overall system is working properly. These are again reusable from the previously automated t-codes; resulting in less rework and high reuse
  4. With this, we come to system integration testing. Once we have identified the end-to-end scenarios, our framework has another solution called Infosys Automation Orchestration Platform (IAOP), which stitches the t-codes together to create a complete scenario from different permutations and combinations of the same t-codes. HP Application Lifecycle Management (ALM) gives us the flexibility to stitch them seamlessly, so that we are ready to complete the system integration testing (SIT) in probably half the time as before.
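
The actual stitching in step 4 is done with HP UFT and ALM as described above. Purely as an illustration, the idea of composing reusable automated t-code steps into an end-to-end scenario can be sketched in plain Python; the scenario below assumes a standard SD order-to-cash sequence (VA01 create order, VL01N create delivery, VF01 create billing document):

```python
# Illustrative sketch only: the real solution stitches t-codes with UFT/ALM.
# Here, each automated t-code is a reusable step and a scenario is just an
# ordered composition of those steps. The scenario content is hypothetical.

def run_tcode(tcode, context):
    """Stand-in for executing one automated t-code against the SAP system."""
    context.setdefault("log", []).append(tcode)
    return context

SCENARIOS = {
    # order-to-cash: create sales order, outbound delivery, billing document
    "order_to_cash": ["VA01", "VL01N", "VF01"],
}

def run_scenario(name):
    """Run every t-code of a scenario in sequence, threading shared context."""
    context = {}
    for tcode in SCENARIOS[name]:
        context = run_tcode(tcode, context)
    return context["log"]

print(run_scenario("order_to_cash"))
```

The point of the composition is reuse: the same automated t-code steps serve unit testing, system testing, and SIT, which is where the rework savings come from.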

This process of testing SAP systems can be used in both waterfall and agile forms without any change in the scripts. The only change we request from our clients applies to the agile methodology, where we want to be closely associated with each developer and seek access to the code drop in the development environment along with master data.

Business benefits

With two solutions and a unique approach, we help our clients achieve the following:          

  1. Around 40 percent effort reduction on a greenfield implementation by using ITPA (Infosys Test Planning Accelerator), a repository of existing t-codes
  2. About 40 percent jumpstart in test automation with pre-automated t-code repository
  3. A process that serves as a base for any type of testing, waterfall or agile, and so fits both approaches without any change in process
  4. Considerable reduction in time to market. We have reduced the overall time to complete testing in certain cases by 75 percent, using significant automation and the way we plan and run them.

Know more about our offerings here.

November 18, 2016

Darwin and the world of digital transformation

Author: Shishank Gupta - Vice President and Delivery Head, Infosys Validation Solutions

When Charles Darwin proposed the theory of 'survival of the fittest', I wonder if he imagined its applicability beyond life forms. Since the advent of the internet, the bargaining power of consumers has been steadily increasing, and product and service providers often find themselves playing catch-up to provide the best product features bundled with the best consumer experience. What would Darwin's advice to product companies in today's digital world be?

There are many aspects that define the features and experiences consumers look for today.

Speed - It is usually believed that high speed is not a good thing, but it is different in the digital world: speed is probably what consumers seek most. Speed in the digital era has multiple connotations. It could stand for frequent, updated features being made available to consumers. It could be a measure of the performance of a web or mobile app, or it could signify the time to provide a fix for a problem reported by a consumer. Product companies therefore need to be equipped with tools and processes that help them deliver these different dimensions of speed, whether through agile and DevOps platforms, cloud hosting, or optimized and automated regression suites.

Seamless Consumer Experience - With the plethora of options available, consumers will choose the one that maximizes their gains and minimizes their losses: classic 'Prospect theory' by Kahneman at play. Consumers expect anything, anytime, anywhere, with the best experience. With multiple channels like web, mobile, and social media, and a series of smart connected devices and wearables, the associated technology complexities have amplified. There are various ways enterprises can ensure trust and a seamless consumer experience. One of the most vital is security. Enterprises cannot afford to ignore the impact of a security vulnerability in their products or offerings. The number of data security breaches the world has seen in the last few years emphasizes the importance of security. Security of data and transactions is critical from both the enterprise and consumer perspectives. Other aspects like performance, availability, accessibility, and usability are equally critical to ensuring a unified user experience. Extensive validations need to be performed on apps, websites, and devices to ensure there are no inadequacies in the parameters influencing consumer experience. In the digital world, enterprises also have multiple avenues to listen to their customers' sentiments and use that feedback and intelligence to build the desired experience.

Data - We have all been advised of the ill effects of a 'one size fits all' approach, and this could not be truer in the digital world. Customers look for what a brand has to offer them specifically, and data plays a crucial role in this journey. Data will determine supremacy in this vastly competitive world, and turning data into useful information is critical. Data transformation and business intelligence become the backbone of the right digital strategy. However, there is a humongous amount of data available from diverse sources and in various formats, and organizations need strategies to validate the sanity of the data and convert the right data into meaningful insights.

Digital is disruptive and not a one-time activity. Just as maintaining a healthy body weight requires a consistently healthy diet and exercise regime, and the results of shortcuts like crash diets do not last, success in digital transformation requires continuous focus on the key areas of speed, customer experience, and data analytics. Enterprises need to bet big on innovation and keep adapting and evolving to "survive" in the digital era.

Infosys is back at HPE Discover this year and is a Silver sponsor of this prestigious event. We will showcase our range of new-age service offerings and solutions in booth #134. I invite you to meet our seasoned practitioners, engage in exciting conversations and learn how we can help you in your digital transformation journey. More info on our participation is here.