Testing Services provides a platform for QA professionals to discuss and gain insights into the business value delivered by testing, the best practices and processes that drive it, and the emergence of new technologies that will shape the future of this profession.


December 28, 2015

Elevating QA to CXO

Author: Harleen Bedi, Principal Consultant

With rapidly evolving and emerging technology trends such as Cloud, Mobile, Big Data, and Social, QA is now viewed as a key component of any CXO's modernization and optimization agenda. This is supported by the World Quality Report, which reveals that application quality assurance and testing now accounts for almost a quarter of IT spending.

However, the reality is that many organizations still fail to demonstrate the business value of QA to the wider business, as they mainly capture and report operational information such as:

  • Number of defects found (73 percent)
  • Cost per test case (55 percent)

Traditional QA drives application excellence through these steps:

  • Understand application requirements
  • Validate whether the application meets those requirements
  • Measure and report against the Quality, Cost, Time (QCT) goals of the QA function
  • Engage key stakeholders, mainly development teams and functional managers

This gives the CXO only a limited and narrow view of QA's contributions.

CXOs, on the other hand, are focused on ROI, value to the business, innovation, efficiency, and reduced risk, to name a few. The key to communicating the actual value of quality assurance is therefore to explain QA contributions in a way executives understand and can relate to the rest of the business.

How to Bridge the Gaps?

  1. Define a clear mission statement for QA
  2. Link QA and testing to business outcomes
  3. Include strategic metrics in QA reports

1. Define a clear mission statement for QA

Before QA activities start, define a mission statement that outlines the goals of QA for each project. Share the mission statement with management, deliberate on the likely key concerns of the CXO for the specific initiative or program, and consider those factors while defining QA goals. For example, "The goal of QA is to improve data validation quality to avoid regulatory penalties." CXOs can appreciate such a goal far more readily than one that simply promises a defect-free application.

2. Link QA and testing to business outcomes

A multi-dimensional change needs to be brought about in terms of governance, delivery model, and technology.

Governance: QA strategy with specific policies that are aligned to the business needs

Delivery Model: Moving from an allocation-based model to a charge-for-services model

Technology: A highly standardized technology foundation based on virtualization and cloud

Let QA integrate with all stages of the lifecycle to drive business goals:

  • QA feedback to assist in Business Process Optimization and Redesign
  • Early validation to integrate with development lifecycle
  • "Shift Left" strategy and "Test Early Fail Fast"   approach to reduce Time to market
  • Create/ Use Integrated tools and platforms to cover entire SDLC
  • Risk-based techniques to measure and minimize risk in releases (a small prioritization sketch follows this list)
  • QA assets (test scripts) to assist in production monitoring
  • QA dashboards to assist in release decisions
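
To make the risk-based point concrete, here is a minimal, purely illustrative sketch (not part of the original post; the test names and the 1-5 likelihood/impact scales are assumptions) that orders a regression suite by a simple risk score of failure likelihood multiplied by business impact, so the riskiest scenarios run first when release time is short.

```python
# Illustrative risk-based prioritization sketch; test names and the 1-5 scales
# are assumptions for demonstration only.
from dataclasses import dataclass

@dataclass
class TestCase:
    name: str
    failure_likelihood: int   # 1 (rare) to 5 (frequent), e.g. from defect history
    business_impact: int      # 1 (cosmetic) to 5 (revenue or regulatory impact)

    @property
    def risk_score(self) -> int:
        return self.failure_likelihood * self.business_impact

def prioritize(tests):
    """Return test cases ordered so the riskiest run first."""
    return sorted(tests, key=lambda t: t.risk_score, reverse=True)

suite = [
    TestCase("checkout_payment", failure_likelihood=4, business_impact=5),
    TestCase("profile_avatar_upload", failure_likelihood=2, business_impact=1),
    TestCase("regulatory_report_export", failure_likelihood=3, business_impact=5),
]
for tc in prioritize(suite):
    print(f"{tc.risk_score:>2}  {tc.name}")
```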

3. Include strategic metrics in QA reports

  • Cost effectiveness
  • Stability to the business
  • Reduced risk due to QA
  • Reduced time to market
  • Brand value

Don't talk technology; talk business outcomes in your QA reports. QA reports and communication should highlight the following.

Contribution of QA to reduced time to market

  • How involving QA early in the lifecycle uncovers defects earlier
  • How the automation effort has reduced cycle time

Cost savings by preventing defects

  • How requirement reviews, design reviews, and code reviews contribute to preventing defects
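
The cost-savings point above can be quantified with a simple cost-avoidance calculation. The stage multipliers, unit cost, and defect counts below are illustrative assumptions (not figures from this post); substitute your organization's own numbers.

```python
# Illustrative cost-avoidance sketch; the stage cost multipliers, unit cost and
# defect counts are assumptions for demonstration, not figures from the post.
COST_TO_FIX = {            # relative cost of fixing one defect, by stage found
    "requirements": 1,
    "design": 3,
    "coding": 10,
    "testing": 30,
    "production": 100,
}

def cost_avoided(defects_prevented, unit_cost=500):
    """Estimate savings when defects caught in early reviews would otherwise escape to production."""
    total = 0
    for stage, count in defects_prevented.items():
        saved_multiplier = COST_TO_FIX["production"] - COST_TO_FIX[stage]
        total += count * saved_multiplier * unit_cost
    return total

# e.g. 12 defects caught in requirement reviews and 8 in design reviews
print(cost_avoided({"requirements": 12, "design": 8}))
```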

Reduced business risk

  • How various kinds of validations ensure regulatory compliance
  • For business-critical applications, how losses are prevented due to QA efforts

Translate the technology value of QA into strategic business value, as shown in the few examples below:

  • Improvement in test cycle time for an enterprise customer intelligence data warehouse facilitated an increase in sales
  • Improved system availability of the order management process led to incremental revenues
  • Increased system availability through proactive performance testing in the Risk Management portfolio helped in reducing risk

The ultimate goal should be for QA to become a key constituent in driving business goals.

You can also access this blog on LinkedIn.

December 14, 2015

Ensure the Quality of Data Ingested for True Insights

Author: Naju D. Mohan, Delivery Manager

I sometimes wonder whether it is man's craze for collecting things that is driving organizations to pile up huge volumes of diverse data at unimaginable speeds. Amidst this rush to accumulate data, the inability to derive value from this heap of data is causing a fair amount of pain and a lot of stress on business and IT.

It is in this context that we need to take a step back and look at data ingestion, in particular the processes adopted for bringing in, importing, transferring, or loading data into the big data ecosystem. This simple but deceptive concept has to address a wide gamut of data formats, update frequencies, volumes, and ingestion mechanisms. The data ingestion process has to handle everything from batch to streaming ingestion tools and technologies, in addition to multi-destination ingestion.

Quality of incoming data and traditional data stores

In the world of the traditional data warehouse, we have mainly dealt with structured data. Most of the time, the data was extracted from operational systems and transformed to adhere to business rules. For analytical purposes or for further downstream processing, this data was loaded into a data warehouse. Terabytes of data were considered a huge volume in the data warehouse world. In enterprise data warehouses, operational data stores, and traditional data storage mechanisms, the emphasis on data quality checks started at the point of data entry. This was justified because the impact of bad data on the business was much more severe than the cost of cleansing the data during data acquisition.

Today, we are talking about zettabytes of data, and this volume doubles almost every 1.2 years. With this massive increase in the volume and velocity of data generation, the need to transform and cleanse the data at the point of import has been compromised. We have various business situations where we acquire data through batch processes, through real-time data acquisition mechanisms, or even by streaming data into the big data ecosystem.

Is incoming data quality still sacrosanct for modern big data systems?

Big data ecosystems do not necessarily require record-by-record data quality validation during data ingestion. Let us take a sneak peek at the three primary ways to ingest data into a big data ecosystem and the data quality checks to be executed for safeguarding the business value derived from data at rest as well as from data in motion. 

[Figure: Data ingestion into the big data ecosystem - batch, real-time, and streaming]

Batch data ingestion

Data from various source systems is typically available as files. These files can vary: text, binary, image, etc. Once the source files are available, they can be ingested into the big data system with or without transformation at the point of ingest. This is an efficient way of processing huge volumes of data accumulated over a period of time, which is typically the case in big data implementations.

Real-time data ingestion

Real-time data ingestion is not about storing and accumulating the data and later batch processing it to move it into the big data system. Rather, it deals with moving data into the big data systems as and when it arrives. Just as there is an argument that there is no such thing as totally unstructured data, a similar saying goes that there is no such thing as pure real-time data ingestion. Real-time data ingestion stresses the fact that the data is ingested in the present and not in the future. What counts as real time varies from an online retailer to a Wall Street broker to aircraft controls.

Streaming data ingestion

This form of data ingestion is very similar to real-time data processing, but the data is processed based on incoming data flows. The data flows in continuously, and insights are generated as it arrives. This is often necessitated by businesses that want to move away from the paradigm of merely reporting an incident, towards predicting events, and ultimately towards changing the outcomes.

Quality validation to avoid data ingestion becoming data indigestion

Each batch carries a huge volume of data; hence, any failure to ensure the quality of the incoming batch data would wreak havoc as volumes accumulate in future batches. I am listing a few recurring data quality issues that I have observed in batch data ingestion.

  • Failure of a few jobs in the incoming data flow could impact data quality. It has to be checked whether the entire batch has to be discarded or whether there are selective approaches to process the data.
  • Validate the methods adopted for data acquisition and storage, since the way the same data moves from a relational database to a file and finally into a NoSQL data store could cause data corruption (a minimal check sketch follows this list).
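
As a minimal illustration of such batch checks, the sketch below reconciles the record count of an incoming file against the count declared in its manifest and verifies the header schema before the batch is accepted. The file name, expected schema, and tolerance are assumptions for demonstration, not tooling referenced in this post.

```python
# Illustrative batch-ingestion checks; the file path, expected schema and
# tolerance are assumptions for demonstration only.
import csv

EXPECTED_COLUMNS = ["customer_id", "event_type", "amount", "event_date"]

def validate_batch(file_path, expected_row_count, tolerance=0.0):
    """Reconcile row counts against the source manifest and verify the header schema."""
    with open(file_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        rows = sum(1 for _ in reader)

    errors = []
    if header != EXPECTED_COLUMNS:
        errors.append(f"schema mismatch: {header}")
    if abs(rows - expected_row_count) > expected_row_count * tolerance:
        errors.append(f"row count {rows} differs from manifest count {expected_row_count}")
    return errors

# Example (the file name is hypothetical):
# errors = validate_batch("daily_transactions.csv", expected_row_count=1_000_000)
```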

Business often demands quick and smart data analytics. This necessitates real-time data ingestion, which often deals with massive volumes of data that degrade in value if not consumed quickly. Most of these situations demand data transformation and enrichment before the data is loaded into the big data systems. To avoid data quality degradation during real-time data ingestion, the following common data quality validations have to be performed:

  • Detailed traceability tests back to source systems become cumbersome and cost-ineffective due to data transformations. Instead, the necessary statistical validations need to be performed to ensure error-free data movement into the big data landscape.
  • Data duplication can happen due to the accumulation of data from various sources. It is necessary to ensure that data de-duplication validations have been performed (see the sketch after this list).
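
The de-duplication check can be illustrated with a small sketch. It is an assumed example (the business key fields and the window size are arbitrary, not prescribed by this post) that flags records whose key has already been seen within a bounded window of recent arrivals.

```python
# Illustrative real-time de-duplication check; key fields and window size are
# assumptions for demonstration only.
from collections import OrderedDict

class DedupWindow:
    """Track the most recent N record keys and flag duplicates as records arrive."""
    def __init__(self, max_keys=100_000):
        self.max_keys = max_keys
        self.seen = OrderedDict()

    def is_duplicate(self, record):
        key = (record["source"], record["transaction_id"])
        if key in self.seen:
            return True
        self.seen[key] = True
        if len(self.seen) > self.max_keys:
            self.seen.popitem(last=False)   # evict the oldest key
        return False

window = DedupWindow()
for rec in [{"source": "web", "transaction_id": 1},
            {"source": "web", "transaction_id": 1}]:
    print(window.is_duplicate(rec))   # False, then True
```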

Sometimes, business decisions are made based on log data from various sources or event-based data from various systems, which necessitates streaming data ingestion. The presence of even a minute error in the incoming stream of data would impact the real-time dashboards, related analytics, and operations. Commonly used data validation approaches to address the frequent data quality issues encountered in streaming data ingestion are listed below.

  • The format of incoming streaming data should be easily comparable with existing historical data for meaningful insights. Validate the data format of the various streams to avoid data misrepresentation.
  • Business rules, such as the calculation of running averages in the incoming data stream, have to be validated (a minimal sketch follows this list).
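
For the running-average rule, a minimal sketch could recompute the average over the incoming values and flag any record whose reported figure drifts from it. The field names and the tolerance below are assumptions for illustration only.

```python
# Illustrative streaming check: recompute a running average over incoming values
# and flag records whose reported average drifts from it. Field names and the
# tolerance are assumptions for demonstration.
def validate_running_average(stream, tolerance=0.01):
    total, count = 0.0, 0
    for record in stream:
        total += record["value"]
        count += 1
        expected = total / count
        if abs(record["reported_avg"] - expected) > tolerance * max(abs(expected), 1.0):
            yield ("MISMATCH", count, record["reported_avg"], round(expected, 4))

events = [{"value": 10.0, "reported_avg": 10.0},
          {"value": 20.0, "reported_avg": 15.0},
          {"value": 30.0, "reported_avg": 18.0}]   # the last average should be 20.0
print(list(validate_running_average(events)))
```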

Conclusion

We reap what we sow, and hence the acceptability of big data insights will primarily depend on the quality of the data used to derive those insights. We have to walk through the customer journey and decide the strategy to validate incoming data quality, along with the velocity at which to process the incoming data. In summary, we should collect and process only the data that drives action and leads us in the right direction.