Testing Services provides a platform for QA professionals to discuss and gain insights into the business value delivered by testing, the best practices and processes that drive it, and the emergence of new technologies that will shape the future of the profession.


September 28, 2012

Program managing multi-vendor QA teams - It is an Art!

In today's QA organizations, the presence of multiple vendors/partners is a given. While this may benefit the organization from a risk-mitigation and cost-efficiency perspective, it poses significant challenges from a vendor/partner management perspective. The challenge is further aggravated by the increasingly complex nature of IT programs and the short turnaround time demanded by business.

In such time-constrained and high-stress environments, managing multi-vendor QA teams to achieve the common goal of zero defect leakage to User Acceptance Testing [UAT] and Production becomes harder and harder. In most cases, client QA organizations end up spending significant and valuable management time resolving interpersonal conflicts, rectifying communication gaps between vendors, addressing innuendos of favoritism, or explaining to business stakeholders why QA productivity is low.

So, given these challenges, should QA organizations look for a single-vendor strategy? Not at all. Multi-vendor/partner situations are here to stay. What organizations need to find is the best collaborative platform to ensure that they get the maximum from the partner setup they have created.

From my experience leading multi-vendor QA teams at several organizations, I have noted down a few best practices that I believe can help you in this journey of maximizing value from your partner ecosystem. In my view, these are simple, common-sense guidelines, and they have worked like a charm wherever I have implemented or adopted them.

Setting up a structure and process for the QA team:

Approach: Having a foundation for how the QA team should work internally and interact with other stakeholders such as Development, Build, Interfacing and Business teams is vital. Think through all likely interaction scenarios and draw simple swim-lane diagrams. Prepare a self-explanatory slide deck and share with the QA team and stakeholders to set expectations upfront. Hold lessons learnt sessions and implement action items from these sessions to foster continuous improvement. Ensure that you engage with client management, as well as leadership teams of all the vendors, to gain buy-in on the common process and structure, highlighting benefits to the program and thereby to each team member.

Benefits: Publishing the structure and process for the QA team sets expectations and standard operating procedures, such as stage-gate criteria, clearly with stakeholders. This sends a common message to the QA team, irrespective of vendor affiliation.


Keeping program objective as the primary goal for the QA team:

Approach: Document the Program Test Plan with defined objectives for the QA team, such as ensuring all requirements are covered by test cases, executing all test cases prepared, and retesting and closing all critical/major defects detected during test execution. Also cover all dependencies the QA team has, such as test environment, access needs, and test data. Publish the plan and procure approval from stakeholders.

Benefits: Upfront planning, documentation and stakeholder approval of a Program Test Plan ensure all stakeholders are on the same page and clear about entry and exit criteria, risks, dependencies and the objectives the QA team will be measured on. This ensures that team members from multiple vendors march towards the common program objectives for the QA team.


Motivating teams through mutual respect:

Approach: Back to basics! Treat each team member with respect as an individual; respect can go a long way towards motivating team members. Make data-based decisions, avoid prejudice and favoritism, and set aside preconceptions when interacting with team members from multiple vendors. Display fairness and sensitivity in all interactions, and build trust with team members.

Benefits: Treating every team member with respect and fostering a culture of mutual respect motivates the team, aligns it to the QA team's objectives, and builds a cohesive team. A cohesive team makes it easier to cross-train members and improve productivity.


In conclusion, the above tenets help prevent the chaos and friction associated with a multi-vendor QA team. If an organization leads from the front in applying these best practices, I am sure it will be well on its way to achieving the program's objectives and satisfying business stakeholders.


September 24, 2012

Warehouse Management System (WMS) "System test" automation

A warehouse management system, or WMS, is a key part of the supply chain. It primarily controls the movement and storage of materials within a warehouse and processes the associated transactions, including shipping, receiving, putaway and picking. Most retailers use third-party products such as Manhattan or RedPrairie to handle WMS business process flows.

Automation is a key activity to consider: it reduces the regression testing effort and improves test coverage, thereby improving quality.

The well-known automation approaches are data-driven, keyword-driven and hybrid. In a traditional data-driven approach, the input data file contains standard data consumed by the automation scripts, either environment configuration parameters or standard look-up parameters.
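To make the data-driven idea concrete, here is a minimal Python sketch of a harness that reads rows from a data sheet and feeds each one to a scenario function. The scenario function, column names and data layout are all hypothetical, not taken from any specific WMS product:

```python
import csv
import io

# Hypothetical input data sheet: each row drives one run of the scenario.
DATA_SHEET = """sku,quantity,reason_code
SKU-1001,5,DAMAGE
SKU-1002,12,CYCLE_COUNT
"""

def execute_adjustment_scenario(sku, quantity, reason_code):
    """Stand-in for an automation step that would drive the WMS UI or API.

    A real script would log in, navigate to the inventory-adjustment
    screen, enter the values and verify the result.
    """
    return {"sku": sku, "adjusted_by": int(quantity), "reason": reason_code}

def run_data_driven(sheet_text):
    """Run the scenario once per data row and collect the results."""
    results = []
    for row in csv.DictReader(io.StringIO(sheet_text)):
        results.append(execute_adjustment_scenario(
            row["sku"], row["quantity"], row["reason_code"]))
    return results

if __name__ == "__main__":
    for result in run_data_driven(DATA_SHEET):
        print(result)
```

The point of the pattern is that adding a new test condition means adding a data row, not writing a new script.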

WMS third-party products have a plethora of application-configurable parameters, and these keep changing; hence the need to ensure that the automation scripts do not fail frequently and that the cost of maintenance stays low.

Key considerations for building automation:

1. Selecting an Automation Framework:

· Approach: Selecting the right automation framework is key. Use a modular approach: divide each test scenario into multiple components and functions, and use descriptive programming to address frequently changing objects.

· How it helps: Only the component or function pertaining to a change needs to be modified, after which the automation scripts are ready to execute. Object properties are resolved at run time, which minimizes the impact of object changes.
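The modular idea and runtime object identification can be sketched as follows. This is an illustrative Python outline, not the API of any real automation tool; the object map, actions and component names are all assumptions:

```python
# Objects are described by properties resolved at run time ("descriptive"
# identification), so a renamed control only requires updating this map,
# not every script that touches it. All names here are illustrative.
OBJECT_MAP = {
    "receive_button": {"type": "button", "label": "Receive"},
    "putaway_menu":   {"type": "menu",   "label": "Putaway"},
}

def click(object_name, log):
    """Reusable low-level action shared by all components."""
    props = OBJECT_MAP[object_name]  # properties looked up at run time
    log.append(f"click {props['type']} '{props['label']}'")

# Each business step is a small component function: a change to one flow
# means editing one component, and the scenarios are ready to re-run.
def receive_stock(log):
    click("receive_button", log)

def putaway_stock(log):
    click("putaway_menu", log)

def receiving_scenario():
    """A scenario is just a sequence of components."""
    log = []
    receive_stock(log)
    putaway_stock(log)
    return log
```

Scenarios composed this way isolate change: when the "Receive" button is relabeled, only `OBJECT_MAP` changes.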

2. Configuring the Automated WMS Application Parameters: Application configuration parameters, such as the inventory adjustment reason code, the policy for BR code and the policy for RR code, change frequently within a DC (distribution center) or across DCs. The automation scripts must not fail when a parameter value supplied as input no longer exists in the application.

· Approach: Maintain the configuration parameters required for an automation script in an input file. Automate the retrieval of configuration parameters from the WMS system and compare them with the input file. Suppose an application configuration parameter is updated such that one of the options available in its dropdown is changed or removed; the script then refreshes the configuration parameters and updates the data sheet with appropriate test data. A scenario executed after this refresh runs successfully.

· How it helps: Reduces automation script failures and improves availability for frequent runs.
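The refresh-before-run idea can be illustrated with a short Python sketch: compare the data sheet's configuration values against the options the WMS currently allows, and replace any value that no longer exists. The parameter names and the in-memory "live config" are hypothetical stand-ins for a real query against the WMS:

```python
def fetch_current_options(parameter):
    """Stand-in for querying the WMS for a parameter's valid options.

    A real implementation would read the dropdown contents or the
    configuration tables from the live system.
    """
    live_config = {
        # "SHRINK" was removed from the application configuration.
        "adjustment_reason_code": ["DAMAGE", "CYCLE_COUNT"],
    }
    return live_config[parameter]

def refresh_data_sheet(data_sheet):
    """Replace stale parameter values so the scripted run does not fail."""
    refreshed = {}
    for parameter, value in data_sheet.items():
        options = fetch_current_options(parameter)
        # Keep the value if it is still valid; otherwise fall back to a
        # currently valid option and (in a real tool) log the substitution.
        refreshed[parameter] = value if value in options else options[0]
    return refreshed
```

Running the refresh as a pre-step of every execution is what keeps the scripts alive as the configuration churns.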

3. Automated Test Data Management Approach:

· Approach: Use test data sheets as data inputs to the automation scripts. Automate the retrieval of the master and transactional data required by the scripts by connecting directly to the WMS system database before each run.

· How it helps: Speeds up transactional test data creation, avoids failures due to incorrect test data and reduces the dependency on data architects.

· Last but not least, domain knowledge is critical for an automation expert; alternatively, bring an automation expert and a domain expert together for successful WMS system test automation.
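As a small illustration of pulling test data straight from the database before a run, the sketch below uses an in-memory SQLite database as a stand-in for the WMS database; the table, columns and query are invented for the example:

```python
import sqlite3

def build_sample_db():
    """Create a stand-in for the WMS database with a tiny item master."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE item_master (sku TEXT, status TEXT)")
    conn.executemany(
        "INSERT INTO item_master VALUES (?, ?)",
        [("SKU-1001", "ACTIVE"), ("SKU-9999", "INACTIVE")],
    )
    return conn

def fetch_test_data(conn):
    """Retrieve only data the scripts can actually use (active SKUs),
    refreshed immediately before each automation run."""
    rows = conn.execute(
        "SELECT sku FROM item_master WHERE status = 'ACTIVE'"
    ).fetchall()
    return [sku for (sku,) in rows]

if __name__ == "__main__":
    conn = build_sample_db()
    print(fetch_test_data(conn))
```

Because the data sheet is regenerated from the system of record before every run, scripts never start with SKUs, locations or orders that have since been deactivated.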

In conclusion, a well-defined WMS automation framework that incorporates automated test data management and automated handling of application configuration parameters not only prevents script failures due to data inconsistencies but also reduces maintenance cost and improves test coverage.