Testing Services provides a platform for QA professionals to discuss and gain insights into the business value delivered by testing, the best practices and processes that drive it, and the emergence of new technologies that will shape the future of this profession.


September 30, 2016

Starwest 2016 - Infosys is a Platinum sponsor

Author: Pradeep Yadlapati - Global Head of Consulting, Marketing, Alliances, and Strategy 

Starwest 2016 - As the curtains rise, we are looking forward to another year of exciting conversations with customers, prospects, and partners. The event is a fantastic platform for us to stay connected with practitioners, evangelists, and product vendors.

We are a Platinum sponsor yet again and are excited to showcase some of the innovations and specializations we are creating around QA. One of our retail customers will be speaking at Starwest 2016 on Oct 5 about an end-to-end QA automation framework that delivered 85% automation and significantly reduced defect leakage. Learn more about it here.

We have observed a paradigm shift in QA processes in recent years; the traditional approach is now being questioned because of its limitations. The landscape has changed: CXOs are focused on achieving higher speed-to-market, accelerating enterprise automation, and delivering a better user experience, all while maintaining high quality. Needless to say, most conversations are gravitating towards digitization and the digital agenda.

Every customer we spoke to was trying to do something with Agile/DevOps. Their challenges ranged from industrializing approaches like TDD/BDD and dealing with a plethora of tools, to automating everything, deploying faster and cheaper, and improving the user experience.

The good news is that there are solutions and approaches we can help clients with. Over the last year, we have built and deployed several innovative QA solutions that address digital-age challenges by extensively adopting techniques such as artificial intelligence, machine learning, and software and hardware robots. These include optimization techniques, predicting the quality of your sprints, touchless execution of automation scripts, identifying defects during development cycles, and many more.

As the Platinum sponsor of the event last year, one of our key learnings was the nature of the topics discussed there: increased automation, QA efficiency, data management, managed services, and more.

One of our financial services clients also spoke about fostering innovation in every engagement through predictive analytics, a service delivery platform, a command center, and more. These innovations reduced requirements volatility, improved quality, lowered defect leakage, and raised levels of automation, transforming their QA organization. The session drew the highest attendance at the forum and provided a platform for several companies to connect with their customers and share best practices.

We look forward to more such conversations. Visit us at booth #59 (Disneyland Hotel, Anaheim, CA) to hear about these innovative techniques and how we are changing the face of QA. See you there. More info on our participation is here.

September 27, 2016

Trends impacting the test data management strategy

Author: Vipin Sagi, Principal Consultant

Test data management (TDM) as a practice is not new; it has been around for years. But only in the last decade has it evolved at a rapid pace, with mature IT organizations ensuring that it is integrated into the application development and testing lifecycles. The key driver for this has been technology disruption, compelling IT organizations to deliver software faster, with high quality, and at low cost.

TDM as a practice is being impacted by three major trends - the digital revolution, the increased adoption of data analytics, and pressure from the business to deliver software faster. Test data architects and consultants should take the measure of these changes while creating and implementing enterprise test data strategies. Key trends and solution patterns that must be considered while creating an effective test data strategy include agile, DevOps, big data, cloud, service virtualization with TDM, domain-specific solutions and accelerators, and automation to reduce time-to-market and improve efficiency. These trends are very visible in product vendors' roadmaps and are an expectation from IT organizations during their TDM journey. Many product vendors are investing heavily or acquiring niche players to offer capabilities and solutions around these trends, while service providers are focusing on building domain solutions and accelerators from their vast experience to deliver services faster and cheaper.

Agile

In agile delivery, software is delivered in short cycles (sprints) with high quality. A typical sprint is about 3-4 weeks long, of which testing gets approximately two weeks. In each sprint, test teams create, run, and automate tests, for which the test data management team may need to refresh data and provide the test data needed to execute both manual and automated tests. Test data architects have to ensure optimal data sets with maximum test coverage, along with accelerated solutions to refresh and provision test data in the environment so that testers can complete testing on time. Frequently, the test data strategy must be able to support multiple data refreshes and provisioning runs in parallel.
Computer Associates (CA) offers a wide range of solutions that can be integrated with CA Test Data Manager to support agile capabilities.

DevOps

The adoption of DevOps practices mandates tighter collaboration among the teams involved in the software delivery lifecycle, including development, testing, operations, and release management. Program and product managers have to deliver high-quality software faster and continuously through continuous integration.

Test data architects need to focus on the following factors to enable TDM with DevOps:

  • Leverage database virtualization in TDM for faster provisioning of non-production versions of databases virtually
  • Self-service capability for testers to refresh and provision test data from test data warehouses based on their need and thereby, increasing their productivity 
  • Solutions to integrate test automation tools with test data management to enable the association of test cases with the right test data and seamless execution of regression tests using automation after every build in a continuous integration model

Test data management solutions from Delphix and Actifio offer the creation of virtual databases with masking and self-service capabilities to enable continuous integration.
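As a simple, hedged illustration of the third factor above, a CI job can call a TDM self-service API to provision masked data just before the regression suite runs. The sketch below is purely illustrative: the endpoint, payload, and response fields are assumptions, since every TDM product exposes its own interface.

    # Sketch: requesting a masked test data set from a (hypothetical) TDM
    # self-service API before automated regression tests run in CI.
    import requests  # assumes the TDM tool exposes an HTTP endpoint

    TDM_API = "https://tdm.example.internal/api/v1"  # hypothetical endpoint

    def provision_test_data(test_suite: str, environment: str) -> str:
        """Ask the TDM service to refresh/provision data and return a data set id."""
        resp = requests.post(
            f"{TDM_API}/provision",
            json={"suite": test_suite, "environment": environment, "masked": True},
            timeout=300,
        )
        resp.raise_for_status()
        return resp.json()["dataset_id"]

    if __name__ == "__main__":
        dataset = provision_test_data("regression-nightly", "qa2")
        print(f"Provisioned data set {dataset}; regression tests can now run.")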

Big data

Big data is a visible, technology-driven movement whose strategic relevance is increasing day by day. As the technology supporting big data evolves rapidly and adoption increases, test data architects and consultants will need data privacy and security solutions, which is a challenge because these platforms deal with high-volume, heterogeneous data sets.

In fact, IT organizations have broadened their data privacy and regulatory compliance initiatives to include non-production environments (test and development) in scope, owing to the growth in breach cases and the complexity that big data applications introduce with their data variety and volume. In the current state, testing a big data application is itself a challenge due to the lack of proven test frameworks and tool sets. Hence, architects must address test data management and regulatory compliance in their test data strategy. Most organizations are creating small Hadoop clusters for development and testing, and the test data is managed using data integration platforms and custom solutions built with Sqoop, HiveQL, and Pig Latin scripts.

IBM and Informatica offer test data management solutions for Hive.
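To make test data handling on such clusters more concrete, the sketch below uses HiveQL (through PyHive) to build a masked test copy of a hypothetical table; the host, database, table, and column names are assumptions, and the masking functions shown (sha2, mask) depend on the Hive version available on the cluster.

    # Sketch: building a masked test copy of a production Hive table.
    from pyhive import hive

    conn = hive.Connection(host="hive-test.example.internal", port=10000,
                           database="tdm_gold_copy")
    cur = conn.cursor()

    # Copy a (hypothetical) patient table while masking sensitive identifiers.
    # sha2() and mask() are built into recent Hive versions; older clusters
    # may need custom UDFs instead.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS patient_masked AS
        SELECT sha2(patient_id, 256)  AS patient_id,
               mask(patient_name)     AS patient_name,
               diagnosis_code,
               admission_date
        FROM   patient_raw
    """)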

Cloud

The typical test data management objectives remain the same for applications hosted on cloud and on-premise. However, cloud may introduce additional challenges which test data architects must consider during their strategy creation or enhancement. These include:

  • Non-availability of TDM tool support for SaaS platforms / solutions
  • Network latency that needs to be accounted for during data provisioning for cloud applications

Test data architects need to take care of two scenarios: in-house applications hosted on the cloud, and third-party SaaS services. SaaS providers restrict direct access to the database layer, so TDM tools may not be of much help; the TDM team has to rely on the SaaS vendor's facilities to extract and import data, which needs to be accounted for while creating the test data strategy. For an in-house application hosted on the cloud, the TDM team may need to provision large volumes of data for load and stress testing. This introduces network traffic and latency challenges, which test data architects can address through custom solutions that run in the cloud.

Test data management product vendors such as Actifio, Delphix, Informatica, Solix, IBM, and CA offer product versions that run in the cloud as SaaS solutions. Cloud platform providers also offer on-demand solutions for test data refresh and provisioning, which can be considered as part of the test data strategy.

Test data management with service virtualization

Testing today's complex applications needs realistic, reliable, and complete test environments that can be provisioned quickly on demand. In the world of agile and DevOps, this need is greater while also being harder to meet. Testers have to run integration tests against different code versions and test data combinations, and against other applications (in-house or vendor) that may not yet be available. Hence, test data architects may need solutions that virtualize application behavior with predefined test data combinations (inputs and outputs) and orchestrate them with the application under test. Testers then create automated tests and run them as part of sprint execution or continuous integration. Many organizations leverage service virtualization with proper test data configuration to minimize costs in non-production environments. CA leads this category by offering an integrated platform of CA Service Virtualization with CA Test Data Manager.
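To make the idea concrete, here is a minimal sketch of a virtual service that returns canned responses for a dependency that is not yet available, so integration tests can still run; the paths and payloads are invented for illustration, and commercial tools such as CA Service Virtualization provide far richer record-and-replay and data-driven capabilities.

    # Sketch: a tiny virtual service with predefined test data combinations.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Canned request/response pairs acting as the predefined test data.
    CANNED = {
        "/customers/1001": {"id": 1001, "name": "Test Customer", "tier": "GOLD"},
        "/customers/1002": {"id": 1002, "name": "Blocked Customer", "tier": "NONE"},
    }

    class VirtualService(BaseHTTPRequestHandler):
        def do_GET(self):
            body = CANNED.get(self.path)
            self.send_response(200 if body else 404)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(json.dumps(body or {"error": "not stubbed"}).encode())

    if __name__ == "__main__":
        # The application under test is pointed at this endpoint instead of
        # the real (unavailable) dependency.
        HTTPServer(("0.0.0.0", 8080), VirtualService).serve_forever()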

Domain solutions and accelerators

Test data architects, in general, focus more on test data services such as data masking, data subsetting, data creation, and data provisioning, along with the tools and process aids needed to support the creation of an enterprise test data strategy. However, domain knowledge is key to determining efficient approaches for identifying data subset criteria, masking methods, and data creation techniques that preserve data integrity. Test data consultants should focus on building domain-specific solutions such as the following (a simple pattern-matching sketch follows the list):

  • Pre-defined patterns to identify sensitive fields based on the data privacy and regulatory compliance requirements of a specific industry (e.g., HIPAA regulations in healthcare)
  • Pre-delivered data packages for master data domains in a specific industry (patient, prescriber, pharmacy, etc. in healthcare) for masking and synthetic data generation, which can be integrated with TDM tools for faster and more efficient test data delivery
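As a small, hedged illustration of the first point, a pattern catalogue for sensitive-field discovery can be as simple as a set of named regular expressions run against sampled column values; the column names and patterns below are illustrative only, and real catalogues are far larger and reviewed against the applicable regulations.

    # Sketch: flagging potentially sensitive columns from sampled values.
    import re

    SENSITIVE_PATTERNS = {
        "us_ssn":       re.compile(r"^\d{3}-\d{2}-\d{4}$"),
        "phone":        re.compile(r"^\+?\d{10,15}$"),
        "email":        re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
        "npi_provider": re.compile(r"^\d{10}$"),  # US National Provider Identifier
    }

    def flag_sensitive_columns(sample_rows):
        """Return column names whose sampled values match any sensitive pattern."""
        flagged = set()
        for row in sample_rows:
            for column, value in row.items():
                if any(p.match(str(value)) for p in SENSITIVE_PATTERNS.values()):
                    flagged.add(column)
        return flagged

    print(flag_sensitive_columns([
        {"patient_name": "John Doe", "ssn": "123-45-6789", "email": "jd@example.com"},
    ]))  # -> {'ssn', 'email'}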

In fact, many service providers offer joint TDM solutions along with tool vendors. Test data architects should focus on building out-of-the-box TDM solutions for packaged products such as Oracle ERP and SAP, and for industry products and platforms such as Gin in insurance and Nasco and RxClaim in healthcare.

A few product vendors such as Oracle offer tools and packages which support and accelerate enterprise test data management strategy for Oracle products. Similarly, service providers - Infosys, Accenture, Cognizant, TCS, Wipro, and more - offer data masking solutions and accelerators as part of their implementation services.

In a nutshell, TDM solution architects and consultants must be aware of the impact of these trends - agile, DevOps, big data, and cloud - on their existing test data strategy. Armed with this knowledge, they must fine-tune their test data strategy with effective solutions such as service virtualization and domain-centric solutions and accelerators for products and platforms, incorporating a futuristic vision into their TDM strategy. The underlying objective is to deliver value to clients by increasing automation through efficient solution patterns and innovative methods across the TDM lifecycle.

September 26, 2016

Hadoop based gold copy approach: An emerging trend in Test Data Management

Author: Vikas Dewangan, Senior Technology Architect


The rapid growth in the volumes of both structured and unstructured data in today's enterprises is leading to new challenges. Production data is often required for the testing and development of software applications in order to simulate production-like scenarios.

In most cases, enterprises keep a gold copy of this data, where it is stored and masked before being moved to non-production environments. Having a gold copy makes it possible to mine this data and choose an optimal subset or specific records for migration to the target non-production development and testing environments. With increasingly large volumes of diverse data in production, the data stored in these gold copies is becoming correspondingly massive. As enterprise-class storage media (e.g., SAN disks) are quite expensive, IT departments are finding it more and more difficult to justify investments in additional non-production storage such as a gold copy; the business may regard these as non-critical investments. Enter Hadoop.

Hadoop benefits

Hadoop provides a highly scalable and cost-effective framework for distributed data storage and processing. It runs on commodity hardware rather than enterprise-class storage media, and various industry estimates peg the resulting disk storage savings at anywhere between 3 and 10 times. Given this saving, it makes a lot of sense to consider Hadoop as a solution. Further, Hadoop can provide a centralized gold copy for the enterprise's entire application portfolio. Another key consideration when selecting a gold copy platform is a future-proof architecture that can support various types of test data, including relational (RDBMS), semi-structured (e.g., flat files), and unstructured data, with high scalability. This is an area where Hadoop shines.

Mechanisms for data loading, refresh and mining

Some key requirements of a gold copy from a test data management perspective are: (1) the ability to ingest data from production to the gold copy, (2) the ability to carry out a full or selective refresh from the gold copy to the target data stores and (3) the ability to mine and identify required test data records against various test cases.

Unstructured and semi-structured data can be copied directly onto HDFS (using file copy operations), as there is generally little need to mine or query this data. For structured data, where mining and full or selective extraction (subsetting) are needed, we need an effective mechanism for storing the data on Hadoop. While several technological alternatives are available in the Hadoop ecosystem, let us look at a popular approach adopted by the industry: Hive. Hive enables users to write SQL-like statements to analyze and query a Hadoop data store using HiveQL (Hive Query Language). To load data into Hive from an RDBMS, we recommend leveraging Sqoop, which supports import and export for a large variety of RDBMS types. Using Sqoop, records can be selectively refreshed into the relevant test or development environments; alternatively, an optimal subset of data can be refreshed using appropriate filtering criteria. For refreshing unstructured and semi-structured data into the test and development environments from the Hadoop-based gold copy, HDFS file copy operations may be used.
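As a hedged sketch of the ingestion step, the snippet below drives a Sqoop import from a relational source into a Hive table that serves as the gold copy; the connection string, credentials handling, table names, and mapper count are assumptions for illustration, and in practice such imports are usually scheduled by a workflow engine.

    # Sketch: ingesting a production table into the Hadoop gold copy via Sqoop.
    import subprocess

    sqoop_import = [
        "sqoop", "import",
        "--connect", "jdbc:oracle:thin:@prod-db.example.internal:1521/ORCL",
        "--username", "tdm_reader",
        "--password-file", "hdfs:///user/tdm/.sqoop_pwd",  # avoid inline passwords
        "--table", "CUSTOMER_ORDERS",
        "--hive-import",                      # land the data as a Hive table
        "--hive-table", "gold_copy.customer_orders",
        "--num-mappers", "4",
    ]

    subprocess.run(sqoop_import, check=True)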

[Figure: conceptual view of a traditional gold copy data store alongside a Hadoop-based intermediate data store]


While Hadoop itself has been around for a while, leveraging it as a test data store is an emerging trend among enterprises today.

Implementation steps
The key steps in implementing a Hadoop based gold copy approach are: 

  1. Analyze: Includes understanding the current application and environment landscape in terms of type and volume of data, database technologies involved, test data needs, reusability of test data, refresh frequency required etc.
  2. Design: Define the high level solution architecture of the Hadoop based gold copy data store. This will include the mechanism for data ingestion, mining and refresh of the target environments. The design should consider all types of data including unstructured, semi-structured and structured data. Design needs to include how the data will be maintained and kept current.
  3. Setup and configure: Includes provisioning the hardware, setting up the Hadoop cluster, configuring the key aspects (like data ingestion using Sqoop and tables using Hive)
  4. Roll out: Involves the initial data load of the Hadoop cluster and provisioning data to the target environments (a provisioning sketch follows this list). It is advisable to start with a pilot of a few applications to iron out potential issues.
  5. Expansion: After the initial roll out, the solution can be expanded to other applications and portfolios across the enterprise. In addition, a strong focus should be kept on continuous improvement of the solution.
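As a hedged companion to the roll-out step, the sketch below provisions a filtered subset from the Hive-based gold copy into a target test database: a Hive query materializes the subset in HDFS, and a Sqoop export pushes it to the target schema. The query, directories, and connection details are illustrative assumptions.

    # Sketch: provisioning a subset from the gold copy into a test database.
    import subprocess

    # 1. Materialize the subset in HDFS with a Hive query (e.g. one region only).
    subset_hql = """
        INSERT OVERWRITE DIRECTORY '/user/tdm/exports/customer_orders_subset'
        ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
        SELECT * FROM gold_copy.customer_orders WHERE region = 'EMEA'
    """
    subprocess.run(["hive", "-e", subset_hql], check=True)

    # 2. Push the subset into the target test database with a Sqoop export.
    subprocess.run([
        "sqoop", "export",
        "--connect", "jdbc:oracle:thin:@qa-db.example.internal:1521/QA01",
        "--username", "tdm_writer",
        "--password-file", "hdfs:///user/tdm/.sqoop_pwd",
        "--table", "CUSTOMER_ORDERS",
        "--export-dir", "/user/tdm/exports/customer_orders_subset",
        "--input-fields-terminated-by", ",",
    ], check=True)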

Conclusion
In summary, this solution works by loading production data from diverse platforms into Hadoop, which serves as the gold copy for data storage. Identified records are migrated to the target test and development environments, which have the same technology platform as the production environments. The key benefits of this solution are that it is cost effective, highly scalable and can handle a wide variety of test data types. All the major solution components mentioned here including Hadoop, Sqoop and Hive are open source. Considering the benefits, several enterprises today are evincing keen interest in leveraging Hadoop as a test data store. There is undoubtedly a strong value proposition to embrace this emerging trend and move to a next generation test data management (TDM) solution.

September 22, 2016

Data validation nuances for unifying Omnichannel data

 Author: Naju D. Mohan, Delivery Manager

Gone are the days when collecting and processing data gave CIOs sleepless nights. Now, they are kept awake by another data challenge -- an abundance of data overflowing from multiple channels that must be analyzed to derive meaningful insights. Consolidating customer data from various sources to drive uniform communication across channels is an uphill task, as it involves merging online and offline data to track customer interactions and understand customers better, so as to provide them a seamless experience.

Omnichannel shopping experience for millennials and centennials

The millennials, who grew up with the internet at their fingertips, are definitely online and mobile-savvy customers. However, they also frequent stores (because who can resist the urge to touch and feel products before buying them?). They demand an effortless transition from smartphones to computers to physical stores so that they can have their best pick of products from retailers. The centennials, who have never known life without being connected to the internet, are still a mystery to most retailers. They are not a dominant earning group yet, but their spending trends show that they are visually led and that they flip across channels to make their final buy.

Big data insights and omnichannel shoppers

New-generation shoppers do not shy away from providing retailers a huge library of personal data. In fact, they regularly share their viewpoints, likes, and dislikes along with their shopping preferences. This gives retailers ample data to understand their behavior, personal choices, preferred mode of engagement, reasons for abandoning a purchase midway through the shopping journey, and so on. Retailers need to process this customer data to provide an enriching omnichannel shopping experience to these young shoppers, and they should not hesitate to make use of the advantages that big data analytics provides.

The latest trend among shoppers is that, although they are influenced by peers and online media and are open to change, they are willing to listen and keep coming back to a brand or a company once they like it. Retailers should thus derive insights from the vast amount of customer data and design personalized marketing campaigns and loyalty programs that are more visual and that customers can connect with.

Validation to guarantee quality insights for omnichannel user experience

Poor data quality is one of the primary reasons why retailers struggle to keep pace with consumers' omnichannel expectations. They most often fail to provide a consistent experience across channels due to inconsistent data. To address this, they need to move away from isolated systems that cater to individual channels' needs, towards a truly digital ecosystem that integrates all channels. Let us look at common data quality issues that retailers face and a validation strategy they can adopt to address them.

The test strategy should focus on three primary areas:

  • Address the volume of data for testing, through appropriate techniques
  • Verify the proper integration of data from various channels
  • Ensure integrity of data across all channels

The sheer volume of data collected across channels can make retailers feel a little lost. Additionally, the fact that this data multiplies every millisecond further challenges companies and can leave them wondering how to derive meaningful insights from it. The data variety could include online and offline marketing and sales data, customers' social media profiles and behavioral data, online browsing history, and so on. Identifying the right data is what matters here. Data filtering testing should concentrate on validating that relevant data is extracted from across the channels and stored for analysis.

The success of retail businesses depends on making relevant customer information available to in-store employees, call center operators, and ecommerce managers. This helps retailers provide personalized product recommendations that are more customer-centric and saleable. A proper customer data integration validation approach, focusing on match-and-merge and on updating the right customer attributes, helps the retailer maintain actionable customer records, which in turn increases sales opportunities, profitability, and customer loyalty.
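As a small, hedged illustration of a match-and-merge check, the sketch below unifies channel-specific records on a normalized email key and asserts that the merged record retains the loyalty attribute captured on any channel; the matching key, records, and survivorship rule are assumptions, and real implementations add fuzzy matching on name and address.

    # Sketch: validating a simple match-and-merge rule for customer records.
    from collections import defaultdict

    online_profiles = [
        {"email": "Jane.Doe@example.com", "channel": "web",    "loyalty_id": None},
        {"email": "jane.doe@example.com", "channel": "mobile", "loyalty_id": "L-981"},
    ]
    store_profiles = [
        {"email": "JANE.DOE@EXAMPLE.COM", "channel": "store",  "loyalty_id": "L-981"},
    ]

    def merge_by_email(*sources):
        merged = defaultdict(lambda: {"channels": set(), "loyalty_id": None})
        for source in sources:
            for rec in source:
                key = rec["email"].strip().lower()   # normalization rule
                merged[key]["channels"].add(rec["channel"])
                merged[key]["loyalty_id"] = merged[key]["loyalty_id"] or rec["loyalty_id"]
        return merged

    unified = merge_by_email(online_profiles, store_profiles)

    # Validation: each customer collapses to one unified record and keeps the
    # loyalty id captured on any channel.
    assert len(unified) == 1
    assert unified["jane.doe@example.com"]["channels"] == {"web", "mobile", "store"}
    assert unified["jane.doe@example.com"]["loyalty_id"] == "L-981"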


A holistic integration between internal operational systems and external customer-facing systems is a primary need in today's digital age. Ensuring this integration requires detailed testing of data integrity across systems from departments like marketing, sales, loyalty, human relations, and so on. This preserves brand integrity and confirms that all departments honor the brand promises made to customers. Data integrity validation has to be an ongoing process in order to achieve true brand integrity, credibility, and authenticity.

Conclusion


Omnichannel has huge potential for retailers. However, it can be properly exploited only if retailers are able to filter out the noise from the available data, enrich the customer data to provide a rich user experience across all channels, and integrate the data available across channels to provide a consistent shopping experience. This ultimately boils down to keeping the retailer's data in order.

September 13, 2016

Lift the curb on Performance Engineering with Innovation

Author: Sanjeeb Kumar Jena, Test Engineer

In my previous two blogs, I discussed bringing pragmatism and a cultural mindset change to performance engineering teams. In this blog, we will evaluate the outcome of these shifts during the transformation journey from performance testing (limited to the quality assessment phase of software applications) to performance engineering (covering the entire lifespan of software applications to ensure higher return on investment).

For performance engineering teams, the return on investment (ROI) is the long-term brand value (that is measured through revenues and lasting customer relationships) and end-user satisfaction (that is the result of higher performance at scale).

However, these metrics are not new to the regular performance testing (PT) approach that uses agile methodologies such as Scrum, Sprint planning, continuous reporting and feedback, etc., to achieve the required ROI.

Then, why is innovation required here?
What does innovation mean here?
How do we apply, or better, create a platform for continuous innovation in performance?

Why is innovation required?

From the early ages of written language to the era of the Internet of Everything (IoE), communication between two entities -- producer and consumer -- is the de facto enabler for any business. Producers (aka business owners) send their value (information on services / products offerings) to their consumers through communication channels. The higher the performance of the channel, the better the consumer growth, hence the greater the service and business growth.

Nowadays these communication channels have taken the form of consumer-driven applications that are available on every type of device. Consumers are using apps on desktops, mobile phones, wearable devices, and cloud-enabled physical appliances (IoT devices); anytime and from anywhere.

The exponential growth in computation power at a lower price-to-performance ratio, along with democratized low-cost manufacturing, has brought billions of online users to these applications.

Rapid implementation of open source technologies in application development and delivery has shifted the customer relationship paradigm towards a stack of diversified technologies.

To keep pace with this exponentially growing trend, in both customer accessibility and open source application development paradigms, the performance engineering market is also accelerating at a similar rate.

In the current knowledge-sharing and data-driven economy, performance is not only about continuously testing your application against an expected load (number of users) and updating it in an agile way; performance also means analyzing customer behavior (evaluating how people use the features) to ensure scalability and reliability. So, today's performance engineering tools include data analytics, machine learning, artificial intelligence (AI), code-level diagnosis, application performance management (APM), operations management, automation, and more.

That's why the available tools and technologies (open source and enterprise) in this landscape are growing at an exponential rate, and numerous vendors compete with tools that provide approximately similar functionality.

So, how does an industry choose the better one?

Surely, evaluating the performance capabilities of tools through existing customers' feedback or feature lists is not enough. What fits the bill is validating the tools against the application in the enterprise's own environment using a rapid experimentation approach. If no existing tool fits, enterprises can build a new solution on top of what's already available.

This is where innovation comes into place.

Innovation is not about inventing something new out of nothing; instead it's an experimentation approach where you try out different options, integrate the best ones, and build the framework / tool that works for you.

Innovation is the creative process, not the end result. As Steve Jobs mentioned, "Creativity is about connecting things," so innovation is about experimenting and finding the pieces to be connected.

It's a fact that the current approach towards performance engineering ensures a steady state in customer value. However, in order to keep pace with rapidly growing technologies and methodologies in the performance industry, it is advisable to create a platform for innovation to Be More.

How do we create a platform for innovation?

Clayton M. Christensen, in his book 'The Innovator's Dilemma,' noted that large companies face certain barriers to innovation that make it difficult to invest in disruptive or emerging technologies early on, because they have an established customer base to which they must stay accountable. These customers often ask for better versions of current products rather than opting for new technologies. Thus, companies want their resources focused on activities that address customers' needs, promise higher profits, are technologically feasible, and play in existing markets.

So, in order to enable innovation (an experimentation process) and resolve the innovator's dilemma, we need to engage in discovery-driven planning, in which we operate on the assumption that new markets cannot be analyzed up front, and instead rely on learning by doing (rapid prototyping / experimentation) and real-time adjustment of strategy and planning.

The same concept can be applied to bring the innovation spirit in performance engineering.

First, it requires an environment that gives teams the opportunity to experiment with many tools and technologies (open source or enterprise) in the PE industry. A managed cloud environment (Amazon Web Services, Microsoft Azure, or OpenStack) is a cost-effective and easy-to-set-up option that supports a try-fail-learn-try-again protocol in a rapid, iterative manner (rapid prototyping).

In coming blogs, we will discuss the process of setting up an online experimentation lab for performance engineering on the cloud.

The Infosys approach to innovation

Infosys is rolling out its innovation strategy, called the AiKiDo principle, across its delivery units. It is combining AI-based knowledge discovery, platform automation, and process empowerment with Design Thinking. The same AiKiDo principle can be used for innovation in performance engineering teams. It is driven by three main elements -- machine learning based predictive analytics, DevOps or automation, and Design Thinking.

First, machine learning based predictive analytics can be applied to the data generated by load testing tools and to application logs. Instead of targeting values of defined performance KPIs, this approach can build new predictive models based on newly found patterns. Knowledge discovery in performance is a competitive advantage over predefined reports.

Second, the DevOps or automation approach can make the validation process more effective and less time consuming. Automating server orchestration, data management, and log management, and integrating performance testing into the application's continuous integration (CI) / continuous delivery (CD) pipeline, can reduce time-to-market while improving performance management.
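As one hedged example of this automation element, a CI pipeline can run a load test and then gate the build on the measured latency and error rate. The sketch below assumes the load tool writes per-request results to a CSV with latency_ms and success columns; the column names and budgets are illustrative.

    # Sketch: a simple performance gate run as a CI step after a load test.
    import csv
    import statistics
    import sys

    P95_BUDGET_MS = 800   # hypothetical latency budget
    ERROR_BUDGET = 0.01   # at most 1% failed requests

    def evaluate(results_csv):
        latencies, errors, total = [], 0, 0
        with open(results_csv, newline="") as fh:
            for row in csv.DictReader(fh):      # expects columns: latency_ms, success
                total += 1
                latencies.append(float(row["latency_ms"]))
                errors += 0 if row["success"] == "true" else 1
        p95 = statistics.quantiles(latencies, n=100)[94]
        error_rate = errors / total
        print(f"p95={p95:.0f} ms, error rate={error_rate:.2%}")
        # A non-zero exit code fails the CI build if either budget is exceeded.
        return 1 if (p95 > P95_BUDGET_MS or error_rate > ERROR_BUDGET) else 0

    if __name__ == "__main__":
        sys.exit(evaluate(sys.argv[1]))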

Third, Design Thinking simply means a way of problem solving using a mindset of 'building for people' through empathy and collaboration. Performance engineering is more about collaboration amongst all stakeholders, who treat performance as a key goal from the early days of development through deployment.

Innovation happens when everyone participates in the problem-solving or solution-design process. When people are working on something exciting that can positively impact other lives, there is little need to promote innovation strategies, because a shared vision and a contribution mindset draw out the team's urge to innovate.

So it's really important to build a platform for innovation -- an online experimentation lab where engineers can try out many solutions (instead of just a few) in a rapid prototyping environment and choose the better solution, validated through facts and knowledge.