This blog is for sourcing and procurement professionals to discuss and share perspectives, points of view, ideas and best practices around key sourcing and procurement topics.


August 22, 2012

Quest for an Optimum Commercial Model for S&P Platforms

Am I paying the right price?

In a world where labor cost arbitrage has become commoditized, companies are progressively looking at technological transformations to help streamline business processes while keeping an eagle's eye on managing risks.

Business Platforms offer businesses a prominent dual benefit:

·         Manpower reduction

·         Better quality through automation and hence minimal human intervention


Business Platforms are expected to bring greater compliance and thus support better governance, allowing the management team to concentrate on core business issues.

The big question now is: will all of this come at an affordable cost? An offshoot of this question: does technology really contribute meaningfully to the bottom line?

To answer the above questions, multiple models have been propounded by various market participants.

Some of the prevalent models are:

OpEx model / On Demand / Pay-as-you-go model: This is the most popular model in the industry today. Clients are charged based on usage.

E.g.: charges based on the number of professional users for Sourcing, and on a percentage of Spend for Procurement

CapEx model / Hosted model / On Premise model: Some companies choose to go this route when they're on the lookout for heavy customizations to suit the specific requirements of their organization.

Fixed Price model: A price or Total Contract Value (TCV) is frozen for the deal term, leaving no scope for fluctuations in the price. Thus, clients have a clear view of the projected cash flow for the complete term of the contract.
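As a rough sketch of the charging formulas above, the OpEx and Fixed Price approaches might be compared like this (the rates, user counts and spend figures are invented purely for illustration):

```python
# Hypothetical illustration of the OpEx and Fixed Price charging models;
# every rate and volume below is an assumption, not real pricing.

def opex_charge(pro_users, managed_spend,
                price_per_user=1200.0, pct_of_spend=0.001):
    """Pay-as-you-go: per-user fee for Sourcing plus a % of Spend for Procurement."""
    return pro_users * price_per_user + managed_spend * pct_of_spend

def fixed_price_annual(tcv, term_years):
    """Fixed Price: the Total Contract Value is frozen and spread evenly."""
    return tcv / term_years

year1_opex = opex_charge(pro_users=50, managed_spend=40_000_000)
year1_fixed = fixed_price_annual(tcv=600_000, term_years=5)
print(year1_opex, year1_fixed)  # 100000.0 120000.0
```

Under the usage-based model the charge moves with users and spend; under Fixed Price the yearly outflow is known in advance for the whole term.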

Some of the other prominent models are:

·         Metered Pricing

·         Perpetual Licensing

·         Term licensing and AMCs

·         Self-funded deals

·         Co-creation

Let's look at what these models should aim to do.

Avoiding sub-zero cash flow in Year 1:

A CFO expects any project or activity to maintain a positive net flow of cash in every year, with this rule applying particularly to the first year of the contract. First-year cash flow generally suffers due to the bulky investments necessitated by the initiation of a transformational project.

An ideal platform pricing model should hence reduce capital expenditure by

1.     Offering to charge on a transactional or usage basis

2.     Amortizing the one-time cost over the deal term

3.     Offering to adjust the payment with productivity benefits, etc.
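Point 2 above can be sketched numerically; in this toy illustration, all the benefit and cost figures are hypothetical:

```python
# Hypothetical numbers showing why amortizing a one-time deployment cost
# over the deal term keeps Year 1 cash flow positive.

def yearly_cash_flows(benefit_per_year, one_time_cost, term_years,
                      amortize=False):
    """Net cash flow per year, with the one-time cost paid upfront or spread."""
    flows = []
    for year in range(1, term_years + 1):
        cost = (one_time_cost / term_years) if amortize else \
               (one_time_cost if year == 1 else 0.0)
        flows.append(benefit_per_year - cost)
    return flows

upfront = yearly_cash_flows(100_000, 300_000, 5)          # Year 1 goes negative
amortized = yearly_cash_flows(100_000, 300_000, 5, True)  # every year positive
print(upfront[0], amortized[0])  # -200000 40000.0
```

The same total is paid either way; amortization simply reshapes the cash flow so the first year is not sub-zero.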

Flexibility in dealing with volume fluctuations

The pricing proposal should offer a fair degree of flexibility in dealing with fluctuations in volumes, to avoid constantly revisiting the pricing proposal.

Some of the prominent ways to counter this are to have:

1.     A tiered pricing model, where a different price point is offered for each tier of volume

2.     Additional Resource Charge (ARC) and Reduced Resource Credit (RRC) methodology

3.     Minimum volume commitment
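The tiered model in point 1 can be sketched as follows; the tier boundaries and price points here are invented for the example:

```python
# Illustrative tiered pricing: each slice of volume is charged at its own
# tier's price point. Tier caps and prices are hypothetical.

TIERS = [                 # (volume up to, price per transaction)
    (1_000, 5.00),
    (10_000, 4.00),
    (float("inf"), 3.00),
]

def tiered_charge(volume):
    """Charge each slice of volume at the price point of its tier."""
    total, prev_cap = 0.0, 0
    for cap, price in TIERS:
        slice_vol = min(volume, cap) - prev_cap
        if slice_vol <= 0:
            break
        total += slice_vol * price
        prev_cap = cap
    return total

print(tiered_charge(12_000))  # 1000*5 + 9000*4 + 2000*3 = 47000.0
```

Because each tier prices only its own slice, volume spikes or dips move the bill smoothly instead of forcing a renegotiation of the whole proposal.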

Payouts linked to Milestones and SLAs

There are generally two main phases in the life of any Platform deal.

·         Deployment or Transition Phase

·         Steady State Phase

Milestone-based payments, especially during the deployment/transition phase, are an optimal model, as payment can be linked to the vendor delivering in accordance with a predetermined plan. Delays in delivery or non-compliance with deadlines are thus bound to attract penalties. This ensures superior, time-bound quality of service.

For the Steady State Phase, creating an 'At Risk' amount with an overall pool percentage and metric-wise pool percentages goes a long way towards ensuring quality delivery within the specified time limits.
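A minimal sketch of that 'At Risk' mechanism might look like this; the pool percentage, metric names, weights and miss flags are all hypothetical:

```python
# Hypothetical 'At Risk' pool: a share of the fee is at risk overall, and
# each SLA metric owns a slice of that pool.

def sla_penalty(monthly_fee, at_risk_pct, metric_weights, missed):
    """Penalty = at-risk pool * total weight of the metrics that were missed.

    metric_weights: metric name -> share of the at-risk pool (sums to 1.0)
    missed:         set of metric names that breached their SLA this period
    """
    pool = monthly_fee * at_risk_pct
    return pool * sum(w for m, w in metric_weights.items() if m in missed)

weights = {"on_time_delivery": 0.5, "first_pass_yield": 0.3, "uptime": 0.2}
penalty = sla_penalty(monthly_fee=100_000, at_risk_pct=0.10,
                      metric_weights=weights, missed={"uptime"})
print(penalty)  # 10,000 pool * 0.2 uptime weight
```

The overall pool percentage caps the vendor's downside, while the metric-wise weights steer attention to the SLAs the client values most.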

Productivity Benefits and Gain Share Model

Since a Platform offering is a packaged solution of technology and services, productivity benefits and gain share models can be applied on the services part.

There are various ways of calculating productivity benefits and Gain share options and these calculations are scenario and company specific.


While there may be other factors that affect profitability, the above are the most important aspects and should hence be given the utmost priority during commercial negotiations.

Let's look at each of these aspects in detail in the subsequent posts.

August 13, 2012

Semantic Source to Pay System - The Holistic Approach to Enterprise Software

The foundation of the data model is one of the most critical factors contributing to the success of any enterprise software system. A well-defined, extensible data model greatly improves the evolution and maintainability of the system.


The most prevalent technology for enterprise software data design is the combination of an RDBMS for the under-the-hood data model and XML for integration. For all the advantages of relational DB schemas and XML schemas, two of the biggest problems that come with such models are:

1.     Tight coupling of the data model with the application

2.     Tremendous data redundancy in storage and integration

Consider a typical Source-to-Pay software suite: the data model of the contract lifecycle management (CLM) application is so coupled with the application itself that it can never be reused by the procurement application, even though most procurement applications need exactly the same contract data for their business processes. Often a similar, redundant table structure is duplicated into the procurement application. To make things worse, a sync-up process between the two produces yet more redundancy and wasted resources. Another example is transactional business data: in each purchase order, change order, order response and invoice, the buying/selling/billing parties' addresses, communication details and identifiers are repeatedly transported and stored between the procurement application and the backend ERP system, even though 99% of the time they are purely duplicate, redundant data. The root of the problem is exactly what I pointed out above: a rigid data model and schema force the data itself to become unfortunately intertwined with the process (application).

Someone might say: I get your point, but that is the price we have to pay to trade flexibility for efficiency. The answer is yes and no. Yes, that was the price a system had to pay when there was no mature technology to replace the intrinsically rigid nature of the relational data model and schema. No, because nowadays we have a "secret" weapon in our arsenal to tackle the above problems of undesired coupling between data and application and unnecessary data redundancy between applications.

That holy grail is the semantic data approach. The semantic model is a simple but very powerful way to represent data; it consists of three parts: subject-verb-object. Virtually all data and relationships can be reduced to these three elements: Infosys has a master agreement A; master agreement A is for Secure ID; Secure ID is produced by RSA; RSA is a US company; RSA's HQ address is in MA; RSA's plant address is in TX; the PO has the number 123; PO number 123 contains 100 RSA Secure IDs.
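Those statements can be held in a bare-bones triple store; a library such as rdflib would be the natural choice in practice, but plain tuples keep this sketch self-contained (the entity names mirror the examples above):

```python
# A minimal subject-verb-object store sketching the statements above.
# None in a query pattern acts as a wildcard.

triples = {
    ("Infosys", "hasMasterAgreement", "AgreementA"),
    ("AgreementA", "isFor", "SecureID"),
    ("SecureID", "isProducedBy", "RSA"),
    ("RSA", "isCompanyOf", "US"),
    ("PO123", "hasNumber", "123"),
    ("PO123", "containsQuantity", "100 SecureID"),
}

def query(subject=None, verb=None, obj=None):
    """Return every triple matching the (subject, verb, object) pattern."""
    return {(s, v, o) for (s, v, o) in triples
            if subject in (None, s) and verb in (None, v)
            and obj in (None, o)}

# Who produces the item that AgreementA is for?
item = next(o for _, _, o in query("AgreementA", "isFor"))
maker = next(o for _, _, o in query(item, "isProducedBy"))
print(maker)  # RSA
```

Note that the two-step lookup crosses what would be a CLM/procurement application boundary in a conventional design, with no schema shared beyond the triples themselves.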

As you can see, any complex relationship or transactional data can be decomposed into semantic statements, which frees the data model completely from the application. An advanced inference engine with a properly defined ontology can then understand and harvest information from the raw semantic data; by taking advantage of powerful 64-bit operating systems and in-memory databases, semantic-data-anchored applications are finally ready for prime time. Take the enterprise Source-to-Pay domain: imagine a common data store backed by a semantic data model (RDF and an S2P ontology). There would be no extra data transformation and integration between the spend analytics and procurement applications, since all data is stored in semantic form and the spend analytics application just needs to refer to the S2P ontology and pre-defined rules to extract and analyze. The only information needed for the flow of a contract from strategic sourcing to procurement is the master agreement/contract ID; the procurement system would be intelligent enough to extract the related contract information from the semantic data on the fly, whenever its workflow requires it. Finally, there would be no more redundant data in POs, COs, PORs, invoices and goods receipts being passed around; all any application needs now is the true delta. With the help of the inference engine and the ontology, each application would extract precisely the information its process requires, just in time.
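The "true delta" idea can be illustrated with a toy example: a purchase order carries only identifiers, and the parties' details are resolved from the shared semantic store instead of being copied into every document (the addresses and field names here are invented for the sketch):

```python
# Hypothetical illustration of passing only the delta: the PO holds IDs,
# and party attributes live once in a shared triple store.

store = {
    ("RSA", "hqAddress", "MA"),
    ("RSA", "plantAddress", "TX"),
    ("Infosys", "billingAddress", "Bangalore"),
}

purchase_order = {        # the "delta": only identifiers, no copied addresses
    "number": "123",
    "buyer": "Infosys",
    "supplier": "RSA",
    "quantity": 100,
}

def resolve(entity, attribute):
    """Look up one attribute of an entity from the shared triple store."""
    return next((o for s, v, o in store
                 if s == entity and v == attribute), None)

ship_from = resolve(purchase_order["supplier"], "plantAddress")
print(ship_from)  # TX
```

If RSA's plant address changes, it changes in one place; every document that references the supplier ID resolves the new value just in time, with no sync-up process.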

Not only would the efficiency and resource savings be exponential compared to current mainstream systems, but we would also establish the foundation of a computing system much closer to the way the human brain stores and handles information, prepared for the next revolution. Combining big data with the semantic data model, the smarter enterprise is finally not a maybe but an indeed.


