Discuss, debate and exchange ideas on the latest trends and opportunities in the Business Process Outsourcing (BPO) landscape. Deliberate on adding "business value" for clients, vendors, employees and other stakeholders to enhance customer satisfaction and sustain long-term partnerships.


June 24, 2014

Demystifying Master Data Management

Critical master data management capabilities and processes are required to produce data and information of high quality and consistency to make all enterprise applications work together.


Many years ago, a company launched a new product and wanted to showcase a preview to its customers. The excited CEO decided to send a letter to all key customers telling them about the upcoming product. He wrote the letter and asked his executive team to "make it happen". So they went to the CRM, ERP and billing systems to pull a list of customers. To their surprise, each application returned a different list, and no single system had a true view of the entire customer base. The CEO learned about this and was understandably irate. What kind of company does not know its own customers?

Unfortunately, many companies face the same issue and do not have precise information about their products, suppliers, inventories and customers. Every time an organization adds a new application for a different business unit, data integration issues between its legacy systems increase. As a result, the concept of a centralized data management framework is of growing importance. A recent survey confirmed that more than 80% of organizations are now working toward a centralized master data management framework.

In a survey conducted in 2013, 21% of organizations rated their data quality as "high" or better, up from 15% in 2008; most rated it "fair". The number of organizations with no data quality technology at all fell from a third to 15%. This suggests that at least some progress is being made in the uphill battle with data quality in large organizations. The proportion recognizing that poor data quality is costing them at least $1 million annually has also doubled over the five-year period. However, even today fully one third of the survey respondents do not measure the cost of poor master data, and some 14% believed that bad data was costing them more than $10 million annually, up from 10% in 2008.

These numbers clearly point to the need for a centralized and robust Master Data Management framework.

So what is Master Data Management?

Master Data Management is an organization-wide approach that provides access to master data in an integrated, unified view, available in a timely manner to all users and applications, thereby serving as a "trusted source of data". MDM also maintains a consistent "record of records" for all master data sets throughout the organization, and includes the policies and procedures necessary to maintain the integrity and sensitivity of master data.
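
To make the idea of a "trusted source of data" concrete, here is a minimal sketch, in Python, of how an MDM hub might merge duplicate customer records from the CRM and ERP systems in our earlier story into a single golden record. The field names and the survivorship rule (keep the most recently updated value) are illustrative assumptions, not a description of any particular MDM product.

```python
from dataclasses import dataclass

@dataclass
class CustomerRecord:
    source: str        # originating system, e.g. "CRM" (illustrative)
    customer_key: str  # shared business key used to match duplicates
    name: str
    email: str
    updated: str       # ISO date of the last update in the source system

def golden_record(matched: list) -> CustomerRecord:
    """Merge records that refer to the same customer, keeping values
    from the most recently updated source. This is a deliberately
    simple survivorship rule; real MDM hubs apply richer, per-field rules."""
    latest = max(matched, key=lambda r: r.updated)
    return CustomerRecord("MDM", latest.customer_key,
                          latest.name, latest.email, latest.updated)

# The same customer as seen by two systems, with conflicting names
crm = CustomerRecord("CRM", "C-1001", "Acme Corp.", "ap@acme.com", "2014-05-01")
erp = CustomerRecord("ERP", "C-1001", "ACME Corporation", "ap@acme.com", "2014-06-10")

print(golden_record([crm, erp]))  # one trusted record instead of two conflicting ones
```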

Clients are increasingly pressing for a unified MDM structure. Listed below are some of the benefits of a robust MDM framework:

  1. Universal truth: a single reference for master data
  2. Enhanced operational efficiencies
    a. Reduced data redundancy
    b. Predictable & more accurate data flow
    c. Unified framework for acquiring, processing and managing data across the organization
    d. Reduced costs in data maintenance
  3. Increased Compliance
    a. Well defined roles & responsibilities
    b. Well defined process maps and frameworks
    c. Documented standards, ready to be applied & enforced

Master Data Management (MDM) is a framework of three key tenets, i.e. people, process and technology, which work together to ensure that master data is coordinated across the organization.

[Figure: The three tenets of MDM: people, process and technology]

 "Data" as we know is the main backbone of any business. Master Data Management also treats "Data" as the primary asset for any organization and defines six key areas for it

[Figure: The six key areas of Master Data Management]

When the organization institutionalizes the objective of "treating data as its primary asset", success will be evident in the following ways:

  1. Data Quality
    a. The quality and value of the data can be measured and demonstrated through key performance metrics (see the sketch after this list)
    b. Data defects are easily detected and proactively corrected
    c. Redundancy is removed and data is correctly normalized
    d. Improved data quality contributes to increased business profitability, e.g. better data usage, improved business opportunities etc.
  2. Data Standards
    a. Data standards are based on relevant data categorization frameworks, e.g. UNSPSC, SIC etc.
    b. All stakeholders talk about the same structure thus making adoption easy
    c. Data standards are incorporated into applications using the data
    d. Adoption of product and service catalogues can be increased using a standard data framework
  3. Data Processes
    a. Efficient processes and frameworks are in place for data maintenance
    b. Common master data processes are used across the organization
    c. Defined roles and responsibilities for all stakeholders
    d. Well-documented processes and guidelines that can be communicated to end-user groups across all business units
  4. Data Integrity
    a. Data security policies and procedures are defined and aligned with business needs and outcomes
    b. Regular security audits are conducted to maintain data integrity
    c. Defined roles and responsibilities for the creators and maintainers of master data and transactional data
  5. Data Engineering
    a. Innovative tools are used to manage the large volume of data across the organization
    b. Data flows across the enterprise application architecture are documented, understood and available for use
    c. A robust data architecture enables increased flexibility, speed and agility of the enterprise
  6. Data Governance
    a. Clear governance structure following a top-down approach
    b. A centralized group for end-to-end management of master data
    c. Well-defined training plan for end users at all levels and for all formats of master data (item master, vendor master etc.)
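
As a concrete illustration of the key performance metrics mentioned under Data Quality above, the sketch below computes two common measures, completeness and duplicate rate, over a toy item master. The field names and sample rows are assumptions made for the example, not any particular standard.

```python
# Sample item master rows; field names are illustrative assumptions
records = [
    {"item_id": "IT-001", "description": "Hex bolt M8", "uom": "EA"},
    {"item_id": "IT-002", "description": "",            "uom": "EA"},  # missing description
    {"item_id": "IT-001", "description": "Hex bolt M8", "uom": "EA"},  # duplicate key
]

def completeness(rows, field):
    """Share of records with a non-empty value in the given field."""
    return sum(1 for r in rows if r.get(field)) / len(rows)

def duplicate_rate(rows, key):
    """Share of records whose key value has already been seen."""
    seen, dupes = set(), 0
    for r in rows:
        dupes += r[key] in seen
        seen.add(r[key])
    return dupes / len(rows)

print(f"description completeness: {completeness(records, 'description'):.0%}")  # 67%
print(f"duplicate rate on item_id: {duplicate_rate(records, 'item_id'):.0%}")   # 33%
```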

Master data management is not a new problem. However, the introduction of compliance regulations like Sarbanes-Oxley in the US and Basel II in Europe, together with organizations' growing interest in performance-driven data frameworks, has given it a new kick start. Unstructured data has to be tagged so that organizations can make sense of it and start connecting the dots between otherwise unrelated pieces of unstructured and structured data. While companies will spend billions renovating enterprise applications through service-oriented architecture, they also need to understand that critical master data management capabilities and processes are required to produce data and information of high quality and consistency to make all enterprise applications work together. Without these capabilities and processes, those investments will not enable the best business decisions or deliver maximum benefits and business value.

 

Related Reading: White paper on Realizing the Business Value of Master Data Management

June 5, 2014

How the Infosys Process Progression Model Addresses Clients' Expectations on Transformation

PPM's strength lies in defining a transformation roadmap aligned with the client's core long-term business strategy and providing detailed levers for its realization.


What clients expect from transformation

I have seen a tremendous shift in European clients' expectations of outsourcing, and in their overall expectations of transformation and business partnering. Clients expect service providers to be their transformation partners so that they can focus on their core business.

As business process providers, we have the ambition to deliver value to our clients that has an impact on their end customers. For many of our European clients, we have realized value as a financial result, e.g. improved sales, better cash flow or reduced cost of operations; value can also show up in the speed and quality of services. Qualitative measures are indicated by improved customer satisfaction or market perception.

A BPO organization acting as a transformation partner should be able to influence its customers' business decisions. One of the biggest opportunities, in this part of the continent, is the current trend of Big Data analysis. It is the mandate of BPO organizations to use all possible sources of data to identify opportunities for improvement or more detailed business insight.

PPM's role in addressing transformation

The Infosys PPM framework addresses the transformation needs of our clients. It is our commitment that we will deliver business value to our customers through our partnership contracts; we realize that only this can ensure true satisfaction and a long-lasting relationship. PPM helps us guide this transformation journey. It is a methodical and transparent framework that directs us and our clients to develop a customized transformation plan for each engagement and enables us to use the right transformation lever at the right stage.

We treat the P1 stage of PPM (Noiseless Process) as the must-have element, which in itself is the prerequisite for further progression. Our aim is to strengthen this most basic stage of noiseless delivery by putting operations into auto-pilot mode and focusing the saved effort on improvements and transformation.

From the early stages of the relationship, we try to identify process gaps and opportunities to increase maturity. Infosys has a suite of methodologies used to assess the current state as well as to drive change. The Change Framework leveraged in the PPM model ensures change is properly planned and executed, with clear and transparent progress monitoring. Infosys makes continuous, significant investments to ensure that the most advanced developments are used to transform our clients' processes. Within the five core themes of PPM (Service Delivery, People, Knowledge Management, Technology, and Risk and Compliance), we propose best-in-class, up-to-date practices and solutions to achieve the best value for money from outsourced operations.

Apart from leveraging best practices, harmonization techniques, automation possibilities and the Lean and Six Sigma approach to bring process efficiency, we dive deep into data from different sources and perform advanced analytics. To add value, we provide market intelligence, domain trends and in-depth market research that can shape our clients' business strategy. Internal and external benchmarking is used to document the tangible benefits and the impact of changes on clients' operating and financial metrics.

As described above, PPM's strength lies in defining a transformation roadmap aligned with the client's core long-term business strategy and providing detailed levers for its realization, enabling progress monitoring and value delivery.

Related Reading: Learn more about the PPM Model.

June 3, 2014

Pre or Post Spend Classification?

Many organizations do their spend classification periodically, after the spend has concluded. However, an analysis of the pre vs. post classification RoI will show if the game is on or not...


We have all heard the old adage "a stitch in time saves nine". When we talk about spend visibility, which enables an effective supply management strategy, it is often observed that many organizations do their spend classification periodically, after the spend has concluded, even though we have the technology, methodology and taxonomy to know the category classification before the transaction.

Consider two scenarios. In the first, we collate the spend data and start the classification process after the fact. As an illustration, suppose the 20th of each month is the cutoff: the 120,000 lines of data up to the cutoff date are classified over the course of the following month, and the results become available in the month after that for the spend cubes. This means our spend estimates are off by over a month.

In the second scenario, we classify the data immediately, either when the purchase requisition is raised or when payment to the supplier is requested through a non-PO invoice. With the technology in place, the cycle time of such an algorithmic classification adds only a few minutes (see the sketch below). Alternatively, we can classify the data in batches, sized by the volume of transactions in each period, and complete the classification exercise as soon as the request is placed. In either case, the transaction value becomes available for spend visibility far earlier than in the first scenario.
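
As a rough illustration of how such inline classification might work, the sketch below applies simple keyword rules to a requisition line at the moment it is raised. The rules and the UNSPSC-style codes are illustrative assumptions; a production classifier would use a full taxonomy and fuzzier matching.

```python
# Illustrative keyword-to-category rules; the codes mimic UNSPSC-style
# commodity codes but are assumptions, not a complete taxonomy
CATEGORY_RULES = {
    "laptop": "43211503",   # notebook computers
    "toner": "44103103",    # printer toner
    "freight": "78101800",  # road cargo transport
}

def classify(line_description: str) -> str:
    """Return the first matching category code, or flag for manual review."""
    text = line_description.lower()
    for keyword, code in CATEGORY_RULES.items():
        if keyword in text:
            return code
    return "UNCLASSIFIED"  # route to a manual review queue

# Called inline when the PR is saved, so the spend cube sees the
# category immediately instead of a month or more later
print(classify("Laptop docking station for finance team"))    # 43211503
print(classify("Quarterly freight invoice - road transport")) # 78101800
```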

The key deciding factor shall be the business value this adds through an improved 'expected value of perfect information':
a)  when we consider opportunity assessment and leveraging spend during negotiations
b)  when it helps manage working capital better by reflecting the potential cash flow for each category

An analysis of the pre vs. post classification RoI will show if the game is on or not...
