Testing Services provides a platform for QA professionals to discuss and gain insights into the business value delivered by testing, the best practices and processes that drive it, and the emerging technologies that will shape the future of the profession.


January 27, 2012

STC conference by QAI

I recently attended the STC conference by QAI, held in Bangalore on December 1st and 2nd. It is one of the largest international testing conferences in India, and this year's theme was "Testing Enterprise 2.0 - Getting Ready for the Quantum Leap". I was particularly looking forward to the sessions by leading testing leaders and practitioners, and several stood out for the new perspectives they offered.

The conference began with a thought-provoking session on "The New Age QA delivery models - A Quantum leap in the way testing is Delivered and Measured" by Manish Tandon, Global Head - Independent Validation and Testing Solutions, Infosys.

The session highlighted the need for QA professionals to think out of the box and look beyond the traditional boundaries of QA. He opened with an interesting analogy: a testing practitioner needs to be charged like an electron, constantly progressing from one level to the next. The session focused on understanding the business value of QA and its alignment with an organization's business goals. He cited some relevant and captivating examples that demonstrated the importance of business value in testing and the need for specialization in testing. To read more about the session, please visit http://www.infosys.com/newsroom/events/pages/annual-international-STC-2011.aspx.

Another noteworthy session was by Sidharth Malik, Sr. Director - Developer Platform Business, Microsoft, who emphasized the need to integrate several testing tools into a single package for greater testing efficiency. His session also stressed that QA professionals need to provide detailed information on a defect, which facilitates much faster and more accurate fixes by developers. Manish Gupta's (Business Manager, HP) session on "Modern Application Delivery - Structuring a Testing Approach" emphasized the need to modernize the testing mindset and attitude and to integrate automation, which should be a default skill set for today's testing practitioners.

At the exhibitor stalls, NeoLoad, a performance testing product for the cloud, attracted a lot of attention. Another product that particularly interested me was Eggplant, a GUI test automation tool that supports mobile applications and offers many useful features.

January 20, 2012

Testing for cloud security - What is the data focus of QA teams (Part 2/3)

In my earlier blog post on testing for cloud security (http://www.infosysblogs.com/testing-services/2011/12/testing_for_cloud_security-_wh.html), I discussed the security concerns of cloud adoption from an infrastructure standpoint. Now, let us look at what the focus of cloud security testing should be from a data perspective. Enterprises are highly concerned about the security of their data in the cloud. They are well aware that any data security breach could lead to non-compliance, resulting in expensive lawsuits and long-term damage to the overall credibility of the organization.

From a data perspective, cloud testing needs to focus on access controls and privileges, ensuring there are no loopholes that allow accidental or intentional misuse of data. Because clouds are shared environments, one organization's data may reside alongside another organization's data; as a first step, it is therefore absolutely necessary to test that data residing on the cloud is properly isolated. Since compliance rules and data privacy regulations vary by region, this aspect also needs special attention. Data testing on the cloud must likewise cover scenarios where data interacts with legacy or existing non-cloud applications, as well as scenarios involving data migration. Finally, verifying that data is encrypted both in transit during migration and at rest on the cloud is essential to prevent any leakage or misuse of data.
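To make this concrete, here is a minimal, illustrative sketch of two data-focused security checks written as pytest-style tests in Python. It assumes a hypothetical multi-tenant REST API; the endpoint, tokens and record identifiers are placeholders introduced purely for illustration, not details of any real system.

```python
import requests

BASE_URL = "https://api.example-cloud.com/v1"      # hypothetical endpoint
TENANT_A_TOKEN = "token-for-tenant-a"              # placeholder credential for tenant A
TENANT_B_RECORD = "records/4711"                   # a record owned by tenant B

def test_data_isolation_between_tenants():
    """Tenant A must not be able to read tenant B's data in a shared cloud."""
    resp = requests.get(
        f"{BASE_URL}/{TENANT_B_RECORD}",
        headers={"Authorization": f"Bearer {TENANT_A_TOKEN}"},
        timeout=10,
    )
    assert resp.status_code in (403, 404), "cross-tenant read should be denied"

def test_transport_encryption_is_enforced():
    """Plain HTTP access should be refused or redirected to HTTPS."""
    resp = requests.get(
        BASE_URL.replace("https://", "http://"),
        timeout=10,
        allow_redirects=False,
    )
    assert resp.status_code in (301, 308, 403), "unencrypted transport should not serve data"
```

Checks like these would sit alongside, not replace, the access-control and data-migration scenarios described above.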

January 18, 2012

Testing Infrastructure - Ignorant or Innocent?

While thinking about what else I could do to improve testing efficiency and reduce cost, I asked myself a basic question: what are the fundamental elements that constitute a testing function? After giving myself a minute, I could quickly pen them down. The three key elements of a testing function are requirements (based on which the IT system is developed or modified), testers (who build test plans and execute them to verify conformance to requirements) and testing infrastructure (test servers, mainframes, middleware components, test data, test labs).

Over the last 10 years, testing organizations have focused intensely on improving every aspect of the testing lifecycle to gain efficiency by reducing overhead activities. This includes improving the requirements stability index (by introducing requirements management tools and processes) and testing effectiveness (productivity improvements, advanced test methods, etc.). But when it comes to testing infrastructure (TI), either the status quo has not been challenged much or the infrastructure is not aligned with the quality goals of the organization. Quality is in fact compromised due to insufficient TI. In many cases, TI is squeezed to the extent that it can cause more than 25% of the quality issues in production. You may have noticed one or more of the following in most organizations:

·         Test environments are limited and lack connectivity to many integration systems

·         A DBA is not available to support the test databases

·         Test data is not refreshed when needed, or test data is simply not available for testing

·         24/7 support is not available for test environments

When I asked myself what the possible reasons for this situation might be, I began listing them down; a few important ones are:

·         Lack of the necessary skills and a complete understanding of the infrastructure components within the testing organization

·         The need for extensive support from teams external to testing

·         Testing Infrastructure is expensive

With this background, in this post I have attempted a deeper dive to uncover TI challenges and to suggest strategies and recommendations for improving the effectiveness and efficiency of TI.

Understanding Test Infrastructure needs and current habits:

Testing organizations are supposed to perform at least the following functions related to testing infrastructure:

·         Testing Infrastructure Blueprint maintenance

·         Utilization Management

·         Downtime Management

·         Monitoring usage of testing tools and licenses; test data subsetting and provisioning

·         Monitoring testing Infrastructure support

To better understand how various testing organizations manage testing infrastructure, I collected answers to the questions below from colleagues working in more than 25 different testing organizations. You can use this questionnaire as a starting point to assess the state of your own organization's testing infrastructure.

1. In your testing organization, do you have a nominated role responsible for test environment management (a SPOC for test environment matters)?
Yes 75% / No 18% / NA 7%. Although 75% said yes, most respondents specifically highlighted the inactive role the testing organization plays in this capacity.

2. Testers and test leads have a good understanding of the testing infrastructure (hardware, configurations, connectivity, databases, etc.).
Yes 33% / No 67%. Most respondents felt that testers understand only their own scope and have little clarity about hardware or configurations.

3. All test environments (server, database, connectivity) have 24/7 support.
Yes 15% / No 85%. All respondents work in an offshore model.

4. Are you collecting environment downtime data for all test environments?
Yes 90% / No 10%. More than 50% of respondents highlighted that no action has been taken on the data collected.

5. If downtime data is being collected, has action been taken to reduce the downtime?
Yes 43% / No 57%. This is a quick-win initiative.

6. Does your test environment have end-to-end testing capability to simulate 80% of production scenarios?
Yes 18% / No 82%. Most respondents do not believe this gap results in quality issues.

7. Does your testing organization conduct test environment audits to check for deviations after every three releases?
Yes 4% / No 60% / NA 36%. Most respondents were not aware of any audits.

8. Do you have a test data refresh strategy for all test environments?
Yes 21% / No 79%. Several respondents highlighted initiatives currently in progress to implement a data refresh strategy.

9. Do you currently use virtualization or any other environment optimization techniques?
Yes 12% / No 33% / NA 55%. Most respondents were not sure what environment virtualization initiatives, if any, had been undertaken.

10. Are there testing infrastructure improvement programs currently in progress?
Yes 30% / No 60% / NA 10%. Although 30% of respondents said yes, fewer than 6% had clarity on the initiatives being undertaken.

Note: The data was collected through a questionnaire; no interviews were conducted. The results above are intended to provide clarity on various TI topics, and the accuracy of the data has been evaluated only by the author.

Testing Infrastructure - Key Functions and Recommendations:

1.       Testing Infrastructure Blueprint maintenance:

The TI blueprint is the map of all test environments used by the testing organization. The blueprint provides a two-dimensional view of the various testing hardware, application interfacing points, middleware components, databases, third-party application connectivity, and so on. The advantage of creating and maintaining this blueprint is a clear understanding of how many test environments you have per application and how robust the TI of the entire organization is. It should also be used for understanding failure points, test data flow patterns for data refresh and data masking, TI support needs, capacity planning and isolating TI bottlenecks for testing activities.

The recommendation is to audit and update the TI blueprint every three months. This activity calls for collaboration between the testing and IT infrastructure support teams.
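As a minimal sketch of what such an audit could look like in practice, the Python snippet below keeps the blueprint as structured data and flags environments that lack connectivity to required integration points. The environment names, hardware sizes and interface names are illustrative assumptions, not details from any real blueprint.

```python
# The blueprint is kept as plain structured data; an audit flags environments
# that lack connectivity to integration points the application needs.
REQUIRED_INTERFACES = {"payments-gateway", "customer-db", "mq-broker"}   # assumed integration points

blueprint = {
    "SIT-1": {"hardware": "4 vCPU / 16 GB", "interfaces": {"customer-db", "mq-broker"}},
    "UAT-1": {"hardware": "8 vCPU / 32 GB", "interfaces": {"payments-gateway", "customer-db", "mq-broker"}},
}

def audit_blueprint(blueprint, required):
    """Return, per environment, the required integration points it is missing."""
    gaps = {}
    for env, details in blueprint.items():
        missing = required - details["interfaces"]
        if missing:
            gaps[env] = sorted(missing)
    return gaps

if __name__ == "__main__":
    for env, missing in audit_blueprint(blueprint, REQUIRED_INTERFACES).items():
        print(f"{env}: missing connectivity to {', '.join(missing)}")
```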

2.      Utilization Management

Testing infrastructure utilization management is the function of maximizing TI usage at reduced cost while ensuring improved application quality. If you ask your test managers how effectively TI utilization is being managed for their applications, 99% of them will say it needs significant improvement. I have listed a few reasons for this below:

·         The most important aspect of utilization management is striking the right balance between TI optimization and the quality of application performance in production. This correlation is missing today.

·         The correlation can be established by forming a cross-functional team that includes the infrastructure support team and the testing organization. The objective of this team is to study test coverage issues, production issues resulting from TI, data provisioning issues, support issues and third-party application issues that result in quality problems in the production environment. This function is rarely performed.

·         ROI analysis of testing tools at the enterprise level is rarely performed. Many tools are bought and remain on the shelf, suboptimally utilized.

 

3.      Downtime Management

This is the simplest metric any organization can collect. As the majority of testing organizations work 24/7, downtime has a direct impact on the testing schedule and testing productivity. While most organizations try to reduce unscheduled downtime, it is important to reduce planned downtime as well. Downtime data provides a good view of infrastructure robustness and of the competence of the support function. A small sketch of how such a metric can be computed is shown below.
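Here is a minimal, illustrative Python sketch that computes an environment's downtime percentage from logged outage windows, separating planned from unscheduled downtime. The environment name and the incident records are made-up examples, not data from the survey above.

```python
from datetime import datetime, timedelta

# (environment, outage start, outage end, planned?) -- illustrative records only
incidents = [
    ("SIT-1", datetime(2012, 1, 3, 22, 0), datetime(2012, 1, 4, 2, 0), True),
    ("SIT-1", datetime(2012, 1, 9, 10, 0), datetime(2012, 1, 9, 13, 30), False),
]

def downtime_percent(incidents, env, window_start, window_end, planned=None):
    """Percentage of the reporting window the environment was down.

    Pass planned=True/False to restrict the figure to planned or unscheduled outages.
    """
    window = window_end - window_start
    down = sum(
        (min(end, window_end) - max(start, window_start)
         for e, start, end, p in incidents
         if e == env and (planned is None or p == planned)),
        timedelta(),
    )
    return 100.0 * down / window

start, end = datetime(2012, 1, 1), datetime(2012, 2, 1)
print(f"SIT-1 total downtime:       {downtime_percent(incidents, 'SIT-1', start, end):.2f}%")
print(f"SIT-1 unscheduled downtime: {downtime_percent(incidents, 'SIT-1', start, end, planned=False):.2f}%")
```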

4.      TI Support, Data and Testing Tools

Infrastructure support is the backbone of utilization and downtime management. In the majority of situations, testers are heavily dependent on developers and DBAs for TI support. Based on the challenges faced in your testing organization, it is recommended to encourage your testing community to gain a better understanding of the application testing infrastructure so that they can perform first-level assessment of TI issues.

Your ability to increase test coverage depends on how good and relevant the data you test with is. While there is no silver bullet for data provisioning issues, it is worth attempting a solution that combines standard tools with home-grown, customized macros that support your application's needs. Investments in test data management should be linked to increases in test coverage and improvements in testing productivity.
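As an illustration of the kind of home-grown utility this might involve, the sketch below subsets recent rows from a source database and masks personally identifiable columns before loading them into a test copy. The table name, column names and the use of SQLite are assumptions made purely for the example.

```python
import hashlib
import sqlite3

def mask(value: str) -> str:
    """Deterministically mask a sensitive value while keeping it unique."""
    return hashlib.sha256(value.encode()).hexdigest()[:12]

def subset_and_mask(src_db: str, dst_db: str, since: str) -> None:
    """Copy customer rows created on or after `since`, masking PII columns on the way."""
    src, dst = sqlite3.connect(src_db), sqlite3.connect(dst_db)
    dst.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER, name TEXT, email TEXT, created TEXT)")
    rows = src.execute(
        "SELECT id, name, email, created FROM customers WHERE created >= ?", (since,)
    )
    dst.executemany(
        "INSERT INTO customers VALUES (?, ?, ?, ?)",
        ((cid, mask(name), f"{mask(email)}@test.example", created)
         for cid, name, email, created in rows),
    )
    dst.commit()
    src.close()
    dst.close()

# Example: provision a masked subset of recent data into a test database.
# subset_and_mask("production_copy.db", "test_environment.db", since="2012-01-01")
```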

Testing tools are almost always underutilized, both in capacity and in features. Lack of ownership and lack of enterprise-level direction on testing tools lead to a proliferation of tools, excessive rework and skill gaps, all of which minimize the ROI.

Recommendations:

1.       Establish testing infrastructure governance that takes complete ownership of the above parameters and devises improvement programs that objectively measure efficiency improvements release after release.

2.       This governance function should attempt to collect the following metrics:

·         % savings in testing infrastructure cost (annual improvement goal)

·         % reduction in environment downtime (annual improvement goal)

·         % reduction in TI-related production defects (trend by release)

·         % improvement in ROI from investments in tools (annual improvement goal)

 

 Conclusion

Now, let me ask the question again about testing infrastructure: are testing organizations ignorant or innocent? After reading this post, you may agree that we are both. Due to a lack of skills and understanding, and its limited ability to influence change in TI, the testing organization remains innocent; but it is ignorant when it comes to utilization and downtime management. While the quest to improve testing efficiency and effectiveness continues, testing infrastructure improvements will contribute significantly over the next 10 years. As this post shows, the depth and breadth of testing infrastructure is so diverse that it offers significant opportunities to reduce the cost of quality. Hence it is necessary for testing organizations to identify a single owner who can start strengthening the foundation for TI improvements. This foundation is necessary to take full advantage of the many technology solutions that have evolved in the last two years, including cloud-based test infrastructure, test data management (TDM) tools, mainframe hosting on the Windows platform, application performance monitoring and many more. Lack of governance and of this foundation will be a mammoth impediment and will challenge your test organization's future competitiveness.

 

 

January 12, 2012

Best Practices in Data Warehouse Validation - Test Planning Phase

The complexity and criticality of data warehouse testing projects are growing rapidly. Data warehouses need to be validated for functionality, quality, integrity, availability, scalability and security, based on the business requirements defined by the organization. Based on my experience as a testing practitioner, I believe the following best practices in the test planning phase can contribute significantly to successfully validating a data warehouse.

 

1.    Comprehensively Understand the Data Model

The data architecture and model are the blueprint of any data warehouse, and understanding them helps you comprehend the bigger picture. It is also important to understand the methods used for the key relationships between the major and critical data sources. The relationship hierarchies and the depth of the data throw light on the complexity of the transformation rules. The quality of the data and the size of the data warehouse determine the functional and non-functional testing requirements for validating it.

 

2.    Understand the Business Requirements Clearly

It is important to understand the complete business context of the need for, and implications of, data warehouse testing. Mapping the business drivers to the source systems helps increase testing quality, effectiveness and coverage. Getting the test data early, during the test planning stage itself, decreases risk and increases the predictability of testing. The data transformation mapping document is the heart of any data warehouse testing project. Understanding the business rules, the transformation mapping document and the run books early in the testing lifecycle therefore helps the testing team reduce rework and delays in the test preparation and execution phases.
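To illustrate how the transformation mapping document can drive validation, here is a minimal Python sketch that reconciles source (staging) and target (warehouse) values for each mapping rule. The staging and fact table names, the columns and the use of SQLite are assumptions made purely for the example.

```python
import sqlite3

# One entry per rule in the transformation mapping document: a source-side
# expression and the target-side expression it must reconcile with.
MAPPINGS = [
    {"rule": "every staged order must load into the fact table",
     "source_sql": "SELECT COUNT(*) FROM stg_orders",
     "target_sql": "SELECT COUNT(*) FROM dw_fact_orders"},
    {"rule": "order amounts must reconcile after transformation",
     "source_sql": "SELECT ROUND(SUM(amount), 2) FROM stg_orders",
     "target_sql": "SELECT ROUND(SUM(order_amount), 2) FROM dw_fact_orders"},
]

def validate(conn):
    """Run each mapping rule and return the ones where source and target disagree."""
    failures = []
    for m in MAPPINGS:
        source_value = conn.execute(m["source_sql"]).fetchone()[0]
        target_value = conn.execute(m["target_sql"]).fetchone()[0]
        if source_value != target_value:
            failures.append(f"{m['rule']}: source={source_value}, target={target_value}")
    return failures

# Example usage against a database holding both staging and warehouse tables:
# conn = sqlite3.connect("warehouse.db")
# for failure in validate(conn):
#     print("FAILED:", failure)
```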

 

3.    Plan Early for Data Warehouse Testing Environment

Planning test environments based on the quality, size, criticality and complexity of the data warehouse helps shorten the testing cycle. The option of shared environments for the testing teams can also be explored to help reduce costs. Planning test environments from the following perspectives helps decrease the likelihood of potential defects:

 

·        Reverse planning from the release date and preparing a high-level test execution plan

·        Understanding and documenting requirements, and mitigating the risks and constraints of preparing and running tests from specific geographical locations, such as time zones, availability of environments or systems, and access constraints

·        Planning for the different types of functional and non-functional test requirements and their test data. Test data plays a vital role in data warehouse testing, and planning and preparing it early in the lifecycle helps avoid cascading delays during execution.

These best practices can contribute significantly to a successful data warehouse validation. I shall be blogging more on best practices for the data warehouse test preparation and execution phases in the coming weeks. I look forward to your thoughts, inputs and best practice ideas.

January 9, 2012

The Future of Testing in Financial Industry

The financial services industry in the US and Europe is undergoing rapid change driven by technological advancement, digital convergence, and ever cheaper and newer channels of communication. A few decades ago, a typical financial institution simply aspired to be the best deposit, savings and loan organization in a particular geography. Today, however, financial institutions rely heavily on technology for growth and to extend their reach beyond geographical boundaries through new communication channels. Many financial firms are engaging in strategic mergers and acquisitions to diversify their product and service portfolios and to increase their global footprint. Increasing market share through innovative product offerings is the path adopted by most firms in this industry today. The current business environment mandates that they keep pace with technological advancements (mobile platforms, browser standards and tablets) so that they can meet the industry's growing business demands.

 

These global trends have opened up new challenges for QA practitioners: complex applications, higher end-user expectations, pressure for higher ROI, heterogeneous layers, compliance requirements, and mergers and acquisitions. The role of QA has matured, and organizations and projects now seek multi-skilled QA professionals with expertise in areas such as SOA automation, e-commerce, performance, data warehouse performance, mobile network performance and information security testing.

 

How a multi-dimensional QA model can help address some of these challenges in building tomorrow's financial services enterprise is spelled out in my latest point of view at http://www.infosys.com/IT-services/independent-validation-testing-services/Pages/financial-services-testing.aspx. Please share your comments, inputs and feedback; I look forward to them.