Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.


November 9, 2018

Importance of CPQ for Industrial Manufacturing


Most industrial manufacturing organizations deal with constantly changing market dynamics and variables that make configuring accurate sales quotes a complex process. Existing pricing and quoting tools and processes across industries are disparate, stand-alone custom applications, with repeated manual data entry in spreadsheets and formula-based price estimates, resulting in data duplication, inconsistent BOMs, and inaccurate estimates with no historical insights, reporting, or analytics. This repetitive, manual, and tedious work for sales teams to capture product requirements, price them right, and present them to customers often takes weeks or months before converting to an order, which results in lost opportunities and impacts sales efficiency.

A robust Configure-Price-Quote (CPQ) system is thus essential in today's dynamic and competitive markets: it provides faster and consistently accurate estimates to your customers and improves sales efficiency by automating the quoting process, capturing and processing all product configurations and pricing details in a single system.

The most important cog wheels to achieve a unified and harmonized CPQ process are:

  • Configurator: Industries today want to cater to the specific business needs of their customers and thus need to provide their sales teams with highly configurable products that generate a unique sales BOM. The configurator must support complex catalog hierarchies and guided product selection, bound by rules that ensure an accurate sales BOM is generated automatically. It must also support logical bundling of products, associated components, and services, so that different selling models can be readily adopted by sales teams, distributors, resellers, and partners.
  • Price Engine: A competitive and dynamic pricing strategy with pre-defined rules based on product attributes, service plans, price contracts or agreements, etc., improves the win probability and provides an edge over the competition. It also ensures that discounts and margins stay in control and go through systematic reviews and approvals, creating accurate and winning quotes for end customers.
  • Quote Life-cycle Management: A complete opportunity-to-quote-to-order (Quote to Cash) process manages and tracks quotes and contracts all the way to order creation and fulfillment through ERP systems, fully integrated end-to-end with an upstream CRM system for opportunity management. The system must be able to automatically generate quotes from pre-defined dynamic templates that can be tailored to present quote information according to the currency and language preferences of different regions of the world.
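As a toy illustration of the pricing-rule and discount-approval idea described above (all product names, prices, and thresholds are hypothetical, not from any actual CPQ product):

```python
from dataclasses import dataclass

@dataclass
class QuoteLine:
    product: str
    list_price: float
    quantity: int
    discount_pct: float  # discount requested by the sales rep

# Hypothetical rule: discounts above 20% must go through manager approval.
APPROVAL_THRESHOLD = 20.0

def price_line(line: QuoteLine):
    """Apply the discount rule and flag lines that need approval."""
    net_unit = line.list_price * (1 - line.discount_pct / 100)
    total = net_unit * line.quantity
    needs_approval = line.discount_pct > APPROVAL_THRESHOLD
    return total, needs_approval

# A 25% discount exceeds the threshold, so this line is routed for approval.
total, needs_approval = price_line(QuoteLine("PUMP-100", 500.0, 10, 25.0))
```

A real price engine would layer many such rules (attribute-based pricing, contract prices, tiered discounts) and route approvals through a workflow, but the systematic-review idea is the same.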

Benefits of CPQ for Industrial Manufacturers

  • Improved Time to Market: Product catalogs and bundles are constantly changing to keep pace with customer demands. CPQ enables you to logically organize, expand, and update product catalogs and complex configuration and pricing rules, ensuring your sales teams are always building accurate proposals and closing deals on time.
  • Multiple Sales Channels: CPQ creates a more valuable customer experience through guided selling and dynamic pricing, and brings internal sales teams, partners, distributors, and resellers onto a common selling platform, improving margins by identifying more upsell and cross-sell opportunities.
  • Improved Sales Efficiency: CPQ equips sales teams with product knowledge and engineering expertise, eliminating rework and other costs associated with an inability to deliver as promised. It also removes the need to manually enter data into multiple systems, freeing sales reps to concentrate on selling products and services to customers.
  • Maximize Revenues with Improved Margins: CPQ ensures optimal pricing and lets you better manage promotions and discounts, achieving maximum revenue and margin on every deal. It also provides an easier and faster buying experience for your customers, with shorter turnaround times and improved sales velocity.

In the digital era, where customer experience (CX) is a top priority for industrial manufacturing organizations, CPQ technology is a must-have to drive sales productivity and customer engagement while reducing costs, helping you edge ahead of the competition in disruptive markets.



August 27, 2018

Oracle Cloud R13- One Time Payment Request

Oracle Fusion Financials Cloud R13 offers functionality to import invoice and payment details as a Payables Payment Request from external systems using a predefined FBDI template.

Supplier for the request

  • Not an existing supplier in Oracle
  • Entered as a party
  • Cannot be queried on the Manage Suppliers page
  • Cannot be reused for a standard invoice
  • Bank details (account number, bank, branch) must be entered in the import data

FBDI Template - Payables Payment Request Import

  • The 18A template can be downloaded from the link below:

Payables Payment Request Import

  • Key Template Columns

Column Name - Description

  • Transaction Identifier - Invoice identifier to uniquely identify the OTP request
  • Party Name - Name of the supplier/party
  • Party Original System Reference - Reference information for the party from the source system
  • Party Type - Supplier type; can only be Person, Organization, Group, or Relationship
  • Location Original System Reference - Source system reference for the location
  • Country - Country of the party address
  • Address Line 1, 2, 3, 4 - Address lines 1-4 of the party address
  • City, State, Postal Code - City, state, and postal code of the party address
  • Supplier Bank Details - Account country, currency, account number, type, bank, and branch number
  • Business Unit & LE - Business unit and legal entity
  • Source - One Time Payments (defined at Payables Lookups, Type = Source)
  • Invoice Details - Invoice number, date, currency, description, pay group, and payment term
  • Payment Method - Mode of payment
  • Liability Combination - Liability account code combination; if left blank, the value defaults from the setup


Mandatory Setups

1. Enable the One Time Supplier feature for the instance

2. Add the OTP code as a Payables source

3. Add the source at Trading Community Source System

4. Enter default liability and expense accounts at 'Manage Common Options for Payables and Procurement'

5. Enter a default location at 'Manage Business Unit'

Creating Payment Request Invoice and Payment

1. Prepare the FBDI template with the Payables Payment Request data

2. Generate the csv and zip files from the template

3. Upload the zip file to the UCM server (account: fin/payables/import)

4. Run the 'Load Interface File for Import' process to load the data into the interface table

5. Run the 'Import Payables Payment Request' process with Source: OTP

6. The invoice is created

7. Payment can be made by selecting 'Pay in Full' from the invoice actions or by creating a new payment at Manage Payments
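Step 2 above (generating the csv and zip files from the template) can be sketched in Python. The row values and the csv file name inside the zip are illustrative only; the actual FBDI template defines the exact column order and file naming:

```python
import csv
import io
import zipfile

# Illustrative payment-request rows; the real FBDI template fixes the layout.
rows = [
    ["OTP", "TXN-1001", "John Smith", "SRC-REF-77", "Person",
     "INV-2018-01", "2018-07-15", "USD", "250.00"],
]

# Write the rows to an in-memory csv.
csv_buf = io.StringIO()
csv.writer(csv_buf).writerows(rows)

# FBDI loads expect a zip archive containing the generated csv.
zip_bytes = io.BytesIO()
with zipfile.ZipFile(zip_bytes, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("ApPaymentRequestImport.csv", csv_buf.getvalue())

# zip_bytes.getvalue() would then be saved and uploaded to the UCM server.
```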

1099 Reporting for Payables Payment Requests

1099 reporting is not supported for one-time payments. The assumption is that the source application generating the one-time payments handles any tax requirements. If payments handled within Oracle Cloud Financials require 1099 reporting, then the supplier needs to be created in Oracle and paid by invoice.


August 6, 2018

Oracle Data Visualization (DVD/DVCS) Implementation for Advanced Analytics and Machine Learning

Oracle Data Visualization Desktop (DVD) and Data Visualization Cloud Service (DVCS) are intuitive tools that help every business user in the organization create quick and effective analytics easily. People at all levels can blend and analyse data in just a few clicks, helping the organization make informed decisions based on actionable insights. Oracle DVD is a Tableau-like interactive tool that helps create analyses on the fly using any type of data from any platform, be it on-premises or Cloud. The main benefits of Oracle DVD are:

  • A personal single-user desktop tool, or a SaaS cloud service, which can be leveraged by any business user in the organization
  • Enables the desktop user to work even offline
  • Completely private analysis of heterogeneous data
  • Business users have full control over datasets and connections
  • Direct access to on-premises or cloud data sources
  • Administration tasks are removed completely
  • No remote server infrastructure to manage

Oracle DVD/DVCS enables business users to perform analysis using traditional methodologies, and it also provides the capability to perform Advanced Analytics using R and to create predictive models with machine learning algorithms using Python.

This simple and intuitive tool enables Advanced Analytics once the required packages are installed. DVML (Data Visualization Machine Learning library) helps you install all the packages required for implementing machine learning algorithms for predictive analysis in one go.

The Install Advanced Analytics (R) utility helps you install all the R packages required to perform Advanced Analytics functions like regression, clustering, trend lines, etc. However, to run either utility on your system or server, you need administrative access, access to the internet, and permission to automatically download the required packages.

Below, we discuss how to leverage Advanced Analytics and machine learning functions to provide predictive analytics for the organization.

To create a trend line graph, we need to enable Advanced Analytics and then pull the required columns into the analysis.

Trend line Function: This function takes 3 parameters to visualize the data in a trending format.

Syntax: TRENDLINE(numeric_expr, ([series]) BY ([partitionBy]), model_type, result_type)

Example: TRENDLINE(revenue, (calendar_year, calendar_quarter, calendar_month) BY (product), 'LINEAR', 'VALUE')

We need to create various canvases and combine them into one storyline, providing a description for each canvas. While creating a trend line visualization, we need to provide the confidence level; by default it is 95%, meaning the trend is computed with a 95% confidence interval.
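Conceptually, the 'LINEAR'/'VALUE' trend line above is an ordinary least-squares straight-line fit over the series. A rough numpy sketch of that idea, with illustrative revenue data (this is not the TRENDLINE implementation, only the underlying math):

```python
import numpy as np

# Quarterly revenue for one product (illustrative numbers).
revenue = np.array([120.0, 135.0, 128.0, 150.0, 161.0, 158.0, 175.0, 190.0])
x = np.arange(len(revenue))

# 'LINEAR' model: fit a degree-1 polynomial (straight line) by least squares.
slope, intercept = np.polyfit(x, revenue, 1)

# 'VALUE' result: the fitted trend values at each point of the series.
trend = slope * x + intercept
```

The confidence level mentioned above would additionally widen the line into an interval around these fitted values.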


April 28, 2018

Platformization at Telcos

Telecom service providers operate in a highly competitive environment. There has been continuous erosion of profitability around their core services related to connectivity and content. Platformization is a means for the Telcos to capitalize on new revenue generation opportunities by leveraging their networks, infrastructure, IT systems and customer data.


April 17, 2018

Enabling CA Signed Certificate in Oracle JCS and On-Premise Weblogic


Tools: keytool, OpenSSL (optional)


By default, the Oracle JCS server uses a self-signed certificate for SSL/TLS. For enhanced security and trust, we have to use CA-signed certificates. This document can be used for both on-premises WebLogic servers and Oracle JCS-based WebLogic servers. Implementing a CA-signed certificate can prevent attacks such as man-in-the-middle. Using CA-signed certificates, internal and external communications between services can be secured, as can access to the environment itself.

Key Features covered in Document

1) A brief overview of CA-signed certificates and how certificate chains are maintained.

2) How to implement a CA-signed certificate chain on WebLogic admin and managed servers.
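A typical keytool flow for building the identity and trust keystores is sketched below. The aliases, hostnames, and file names are illustrative, and keystore passwords are omitted; your CA's chain (root, intermediate) must be imported in order before the signed server certificate:

```shell
# 1. Create an identity keystore with a new RSA key pair
keytool -genkeypair -alias myserver -keyalg RSA -keysize 2048 \
        -dname "CN=myhost.example.com" -keystore identity.jks

# 2. Generate a certificate signing request (CSR) to send to the CA
keytool -certreq -alias myserver -file myserver.csr -keystore identity.jks

# 3. Import the CA chain (root first, then intermediate), then the signed cert
keytool -importcert -alias rootca   -file root.crt         -keystore identity.jks -trustcacerts
keytool -importcert -alias interca  -file intermediate.crt -keystore identity.jks -trustcacerts
keytool -importcert -alias myserver -file myserver.crt     -keystore identity.jks

# 4. Build a separate trust keystore containing the CA certificates
keytool -importcert -alias rootca -file root.crt -keystore trust.jks -trustcacerts
```

The resulting identity.jks and trust.jks are then configured as the custom identity and trust keystores in the WebLogic console for the admin and managed servers.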


March 31, 2018

Blockchain & Finance - An Introduction for the CFO

Have you heard about blockchain? Even if you have not, you have surely heard about Bitcoin. Bitcoin is not blockchain, but Bitcoin uses blockchain technology.

Why should a CFO care about blockchain technology?

Blockchain technology is a big game changer. It can be used to solve many business problems. While some industries are hugely impacted, others may see only minor impact. Also, since the technology is still evolving and maturing, new impacts are discovered every day. Ignoring the technology could mean loss of competitive advantage and inefficient processes, impacting shareholder value. As the guardian of shareholder value, the CFO must understand the technology in general and its impact on the finance function in particular.

Before we discuss how blockchain impacts the finance function, let us understand what blockchain is, what its unique features are, and what its benefits are.

What's the name?

Blockchain technology is also sometimes referred to as DLT, i.e. Distributed Ledger Technology. While there are minor differences between the two, to keep things simple we can assume both are the same.

What is blockchain / DLT (Distributed Ledger Technology)?

As the name indicates, the technology uses blocks and chains, is distributed (i.e. decentralized), and stores ledgers (lists of data). DLT uses blocks to store data, and the blocks are linked (chained) to each other using cryptography. Apart from data storage and linkage, in DLT the complete data set is replicated (distributed) across participants. Data is added to the blockchain based on a 'consensus' rule, and a blockchain might also host smart contracts, which execute when certain criteria are met.
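The block-linking idea can be sketched in a few lines of Python. This is a toy illustration of the hash-chaining concept only, not how Oracle Blockchain Cloud Service (or any real DLT platform) is implemented:

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Link a block to its predecessor by hashing its data plus the previous hash."""
    body = {"data": data, "prev_hash": prev_hash}
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    return body

# Each business event becomes a block chained to the one before it.
genesis = make_block({"event": "genesis"}, "0")
received = make_block({"event": "goods received", "po": "PO-1"}, genesis["hash"])
invoiced = make_block({"event": "invoice verified", "inv": "INV-1"}, received["hash"])

def chain_is_valid(chain):
    """Any break in the prev_hash linkage reveals tampering."""
    return all(curr["prev_hash"] == prev["hash"]
               for prev, curr in zip(chain, chain[1:]))
```

Changing any earlier block changes its hash and breaks every link after it, which is what makes the ledger tamper-evident; a real DLT adds replication across nodes and a consensus rule on top of this.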

Blockchain / DLT (Distributed Ledger Technology) - How does it help?

Because of the above characteristics, a blockchain can help businesses:

  • Speed up business processes - transactions taking days can be done in seconds.

  • Reduce costs - as it will enable direct peer-to-peer interaction without the need for intermediaries.

  • Reduce risks - as the transactions are immutable and cannot be changed once created.

  • Enforces and builds trust - all data is transparent and additions are through a consensus mechanism.

The above discussion may seem technical, so let me describe a finance use case for a better understanding of the technology and its benefits.

Trade Finance - Use Case - Using Oracle Cloud, Oracle Blockchain Cloud Service

Trade finance is one of the areas where blockchain technology is already in use. Let us imagine a typical bill discounting scenario. The scenario has the following participants: a buyer (say 'ABC Electronics'), a seller (say 'LG Electronics'), and a financing bank (say HSBC). Assume we are the buyer, using Oracle Cloud applications.

ABC Electronics buys goods from LG. On receipt of the goods and the invoice from LG, the details (physical copies of the invoice) are sent to HSBC. HSBC verifies the data and then releases funds to LG based on the due date.

Note that the above process:

  • Might take 3-5 days, probably more.

  • Gives the participants no visibility of its status: Have the goods been received by ABC Electronics? Has the invoice been received by ABC Electronics? Has HSBC received the documents? Has HSBC verified them?

  • Risks the invoices getting damaged, lost, or tampered with as they move between the different parties.

How can Oracle Blockchain Cloud Service help here?

With blockchain, we can now build a solution whereby:

  • The business process of sending goods, receiving goods, receiving invoices, verification of receipts and invoices by the buyer, and sending the invoice to the bank can be captured and shared on the blockchain.

  • Transactions, on consensus, get added to the blockchain and cannot be tampered with (immutable).

  • Additions to the blockchain can be made by automatic or manual processes. Oracle Blockchain Cloud Service offers REST APIs to automatically integrate Oracle Cloud applications with the service.

  • New data can be added based on an agreed consensus mechanism, which can be built using Oracle Blockchain Cloud Service.

  • Oracle Blockchain Cloud Service also offers a front-end application that helps the participants view the status of the transactions (data transparency).

  • Physical invoices need not be sent to the bank; the bank can directly connect via the REST APIs offered by Oracle Blockchain Cloud Service to verify the invoices captured by the buyer (easing and speeding up the process).

  • With Oracle Blockchain Cloud Service, smart contracts can be built to automatically transfer amounts to the seller on due verification of the invoices (process automation).

After each business event, a new block gets added to each node based on consensus between all participants, and the same view is available to all participants.

With the above solution

  • The data is visible to all participants and is consistent across all participants.

  • Physical invoices need not be sent to the bank.

  • The correct invoice details are confirmed by all parties and cannot be tampered with (immutable). This is only possible thanks to blockchain technology.

  • Smart contracts executed automatically to initiate supplier payments.

  • The payment to the seller can be processed in a few minutes instead of days.
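The smart-contract behaviour described above can be sketched as a toy Python stub. Real chaincode runs on the blockchain platform itself, and all names here are hypothetical; the point is only the trigger logic: funds are released automatically once every required party has verified the invoice:

```python
def release_payment(invoice, verifications, required_parties):
    """Release funds only once every required party has verified the invoice.

    Illustrative stub: real smart contracts execute on the blockchain network,
    with the verifications themselves recorded as consensus-approved blocks.
    """
    if required_parties.issubset(verifications):
        return {"status": "PAID", "amount": invoice["amount"]}
    return {"status": "PENDING", "amount": 0}

invoice = {"id": "INV-1", "amount": 10000}
parties = {"buyer", "seller", "bank"}

# The bank has not yet verified, so the contract does not fire.
result = release_payment(invoice, {"buyer", "seller"}, parties)
```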

Are there other Use cases - Impacts on finance function?

While there is a big impact on the financial services industry and on crypto-currencies, the focus of this note is the impact on the finance function, at a more micro level.

There are many other use cases. As the technology matures, the way it is implemented is also evolving and new use cases are getting discovered.

Oracle, while releasing the Blockchain Cloud Service solution (at Oracle OpenWorld 2017), listed a good set of questions that help determine possible use cases for blockchain. Businesses can check the questions below to discover potential use cases:

  • Is my business process predominantly cross-departmental / cross-organizational? (think of intercompany reconciliations, interparty reconciliations)

  • Is there a trust issue among transacting parties? (think of trade finance scenarios)

  • Does it involve intermediaries, possibly corruptible?

  • Does it require periodic reconciliations? (think of intercompany reconciliations, interparty reconciliations)

  • Is there a need to improve traceability or audit trails? (think of bank confirmation letters, third-party balance confirmation letters needed by auditors)

  • Do we need real-time visibility of the current state of transactions? (think of publishing reports to various stakeholders)

  • Can I improve the business process by automating certain steps in it? (think of automatic payment based on inspections by a third party)

From the above, we can see numerous opportunities for improving the finance function. Let me list possible use cases by critical finance functions.

1. Financial Management (Strategic Planning; Annual Planning; Rolling Forecasting (quarterly/monthly); Working Capital Management; Forex Management)

Possible impacts:

  • An internal, permissioned blockchain can be built to get consensus on the plan, transparent to all participants and immutable.

  • A permissioned blockchain can be set up to speed up the funds disbursement process for trade finance.

2. Financial Reporting and Analysis (Statutory and External Reporting (GAAP / IFRS / VAT etc.); Management Reporting (scorecards, dashboards); Strategic Finance (scenario planning, M&A); Customer and Product Profitability Analysis; Balance Sheet, P&L, Cash Flows)

Possible impacts:

  • A permissioned blockchain can be set up for communication of reports that is secure, tamperproof, and quick to publish.

3. Governance, Risk and Compliance (Financial Policies & Procedures (business rules management); Tax Strategies and Compliance; Tax Accounting; Audit, Controls and SOX Compliance; Enterprise and Operational Risk Management; System Security and Controls)

Possible impacts:

  • Secure communication of reports to government authorities.

  • A permissioned blockchain can be built to get consensus on account balances for audit purposes.

4. Finance Transactions and Operations (General Accounting; Managerial Accounting; Accounts Payable; Credit and Collections)

Possible impacts:

  • A permissioned blockchain, transparent, immutable, and consensus-based, can be built to capture customer promises for cash collections.

5. Financial Consolidation (Period-end Book Closure (monthly, quarterly, yearly); Currency Translation and Trial Balances; Intra- and Inter-company Transaction Accounting; System of Records Close (COA, GL, Sub-ledgers))

Possible impacts:

  • A permissioned blockchain can be built to share and agree on intercompany balances.


Any pitfalls? What should you check?

There are many potential uses of this technology. As the technology matures and more proof-of-concept projects get executed, new use cases are discovered and old use cases are dropped. As per the Gartner Hype Cycle, blockchain technology has passed the 'Peak of Inflated Expectations' phase and is likely to enter the 'Trough of Disillusionment' phase as POCs start failing, before entering the 'Slope of Enlightenment' phase.

Considering the hype, there is a risk of trying to force-fit blockchain into scenarios where simpler, cheaper, faster options might work better. While blockchains are immutable and highly secure, there are a few exceptions, and special attention is needed to ensure the exceptions are understood and managed. Government regulation of blockchain contracts also needs to evolve. There are also concerns with data transparency, which might not always be a good thing.


Blockchain is a big game changer. Its impact on the finance function is inevitable. As the technology matures, it will help the CFO automate and speed up processes and build internal controls, even with third parties outside the organization. The CFO organization should start discussions on discovering use cases. It is likely that new ways of running processes will be developed, in ways never imagined before.

The intention of this article is to give an introduction to blockchain, its impact on the finance function, and how Oracle Blockchain Cloud Service can help build a blockchain quickly.


March 15, 2018

POV: FDMEE vs Cloud Data Management




As more organizations begin to embrace Oracle's software-as-a-service (SaaS) Enterprise Performance Management (EPM) Cloud offerings, there is an often overlooked but important decision that needs to be made early in the adoption cycle: what toolset will be used to integrate data into EPM Cloud Service products such as Planning and Budgeting Cloud Service (PBCS), Financial Close and Consolidation Cloud Service (FCCS), or Profitability and Cost Management Cloud Service (PCMCS)?


Herein we are going to explore the two primary data integration options that are available to customers and the pros and cons of each. The conclusion provides a recommendation that can be applied to organizations of all industries and sizes as they plan their journey into the Cloud.


The Encounter


Oracle continues to grow its Cloud service offerings both in terms of customer volume and functionality. The changing landscape of software and infrastructure has led a number of organizations to adopt a cloud strategy for one or more business processes. The benefits of the Cloud are hard to refute: the software is regularly improved, the hardware is owned and maintained by Oracle, and application upgrades become a thing of the past. While the shift to the Cloud is a broad topic with many considerations, our focus here is the data integration toolset and, more broadly, the integration strategy.


When customers shift to the Cloud, they are often told that the Cloud service contains a module named Cloud Data Management which can address all of the data integration requirements for the Cloud service. Honestly, this is an overly optimistic view of the capabilities of Cloud Data Management. Data integration requirements can drive solutions that range from very simple to incredibly complex, and this large spectrum demands a more holistic assessment of the integration options. Since it is impractical to assess thorough solution requirements during a software sales cycle, the important question that every organization should ask when considering its data integration plan for EPM Cloud Services is: what are my options?

Cloud Service Data Integration Options

As with any software offering, there are numerous potential solutions to a given requirement. When assessing software choices, options are normally grouped into two classes: buy versus build. A 'buy' choice means acquiring a packaged software offering; the Oracle EPM Cloud Services are an instance of a buy decision. In addition to prebuilt functionality, an important advantage of a packaged subscription is maintenance of the solution, including future version releases. A 'build' decision means creating a custom solution specific to a single organization. The latter is usually unsupported by a software vendor, and both its range of capabilities and its evolution depend on the skillset of the individual or team that developed it.

Herein we focus on packaged solutions, as those align more closely with the often-expressed goal of adopting a Cloud approach: streamlining the solution and its ownership. While choices such as ODI or the REST API are all valid, these are considered build options in the build-versus-buy decision and are thereby excluded from this analysis.

Considering packaged solutions for integration with Oracle EPM Cloud Services, the two main options available to customers are FDMEE and Data Management. FDMEE is a separate on-premises solution, whereas Data Management is an embedded component within each of the Oracle EPM Cloud Services. Before comparing these products, it is necessary to highlight the purpose and capabilities of each.


Financial Data Quality Management, Enterprise Edition (FDMEE) is a purpose-built application for integrating data into the Oracle EPM suite of products. The application includes predefined logic for loading data to the on-premises EPM applications HFM, ARM, Hyperion Planning, and Essbase. Additionally, FDMEE (as of the release) can integrate data directly to the EPM Cloud Services FCCS, PBCS, PCMCS, and Enterprise Planning and Budgeting Cloud Service (EPBCS).

FDMEE as an ETL: FDMEE can loosely be described as an ETL-type application. ETL is an integration process for extracting, transforming, and loading data. FDMEE is not a true ETL tool because it is not intended to handle extremely large volumes of data (millions of records in a single execution); for such volumes, a pure ETL tool such as Informatica or ODI would theoretically be a better fit. Nonetheless, FDMEE offers many of the core capabilities of ETL tools: it can extract data from a variety of sources, transform the data to the EPM dimensionality, and load the resulting data to EPM applications.

FDMEE differs from pure ETL tools because it was designed with the business user in mind. ETL solutions are generally owned and operated by the IT department: ETL executions are scheduled, and any deviation from the defined process or timeline often requires coordination between the business user requesting the off-cycle execution and the IT owner of the ETL solution. FDMEE, by contrast, is usually managed and maintained by business users. FDMEE users can update transformation logic through a web interface with little to no coding knowledge required. Users can schedule FDMEE jobs or execute them in an ad-hoc fashion as data is needed. The end-to-end process is completely in the hands of the business users.

FDMEE adaptors: FDMEE offers strong extract capabilities, including prebuilt adaptors to source data from Oracle EBS GL, PeopleSoft GL, HANA, J.D. Edwards EnterpriseOne GL, SAP GL, HCM, and Business Warehouse. These adaptors provide the logic and code required to source data and remove the need for organizations to define and maintain custom extract queries, which is a significant value-add of FDMEE. Additionally, FDMEE can source data from any relational source as well as any flat-file format. These three methods (prebuilt connectors, relational sources, and flat files) ensure that FDMEE can consume nearly any data source required to support the EPM systems to which it loads data.

Data Conversion in FDMEE: The transformation capabilities, known as mapping, are another main aspect of FDMEE. Often the transformation that occurs during a standard ETL process is accomplished through SQL queries that must be designed from scratch. FDMEE uses SQL to execute the conversion in the background, but the mapping logic is entered in a web interface that looks and feels very much like an Excel worksheet: source system values are aligned to target system values in a columnar grid format. FDMEE maps support multiple mapping methods, including Explicit (one to one), Between (a continuous range mapping to a single value), In (a non-continuous range mapping to a single value), Like, and multi-dimensional maps where multiple source system segments are used to determine an EPM target dimension value.
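The mapping methods just described can be illustrated with a small Python sketch. This is a toy rule evaluator with made-up account codes, not FDMEE's actual internals or evaluation order:

```python
import fnmatch

# Illustrative mapping rules in the spirit of FDMEE's Explicit / Between / In / Like.
rules = [
    {"type": "explicit", "source": "1100", "target": "Cash"},
    {"type": "between", "low": "4000", "high": "4999", "target": "Revenue"},
    {"type": "in", "values": {"5100", "5300"}, "target": "Travel"},
    {"type": "like", "pattern": "6*", "target": "Opex"},
]

def map_account(source, rules):
    """Return the first target whose rule matches the source value."""
    for rule in rules:
        kind = rule["type"]
        if kind == "explicit" and source == rule["source"]:
            return rule["target"]
        if kind == "between" and rule["low"] <= source <= rule["high"]:
            return rule["target"]
        if kind == "in" and source in rule["values"]:
            return rule["target"]
        if kind == "like" and fnmatch.fnmatch(source, rule["pattern"]):
            return rule["target"]
    return None  # unmapped source values would be flagged for review
```

In FDMEE the same rules are maintained in the web grid rather than in code, and multi-dimensional maps can additionally look at several source segments at once.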

Data Loading in FDMEE: FDMEE further differentiates itself from standard ETL solutions in its load process, which is purpose-built for integration into the Oracle EPM product suite. Not only does this ensure a seamless load of data to the target EPM application, it also includes prebuilt logic that improves the data load process. For example, without any additional build effort, FDMEE can trigger calculations in the target EPM application to perform tasks such as currency translation, data clearing, or aggregation. While standard ETL tools can certainly achieve this, FDMEE offers this capability natively and needs no build effort beyond configuring the integration to perform these actions.

FDMEE Value-Add Features

Audit Controls in FDMEE: Because FDMEE stores its transformation logic within the application, users can investigate the data conversion that was applied, to better understand how a source system data point was converted to a target system intersection. FDMEE can track changes to the transformation logic: which user altered it and when, and the logic before and after the change, so the effect of the change is understood. Finally, FDMEE provides a tremendous amount of activity-based logging. The application captures each execution of the ETL (workflow) process with detailed information such as the user performing the process, start and end times, and in-depth technical actions that allow not only debugging but also performance and process tracking. Internal or external auditors frequently ask for evidence that the data in a reporting application is current, accurate, and complete. As the data is routinely transformed throughout an ETL process, the preconfigured FDMEE reports and user interface can be used to easily validate the transformation effect. As well, a number of reports are available to audit the overall process execution: when it was run and by whom. These powerful tools can be used to demonstrate the validity of data inside the EPM application.

Drill Options in FDMEE: FDMEE provides two related capabilities known as drill back and drill through.

Drill back is the action of moving from the EPM application back into FDMEE to investigate the source records that make up the balance from which the drill back was initiated. Drill back is native to any EPM system to which FDMEE has loaded data. The main constraint is that drill back must be initiated from an input-level intersection to which data was loaded; it cannot be initiated from parent levels within any of the hierarchies. This is an area where the community would like to see FDMEE drill back enhanced, although appropriate training - drill down to an input-level intersection first, then drill back - can usually overcome this apparent limitation.

Drill through, by contrast, is not native to every source system from which FDMEE can extract data. It is provided natively with the preconfigured adaptors for the Oracle-branded general ledgers as well as SAP R/3 and ECC. For non-Oracle or non-SAP data sources, drill through depends on the capabilities of the source system: FDMEE sends a drill-through request in the form of an HTTP request, so the source system must have a handler for that web request. Any system that can accept the web request could, in theory, be configured to support drill through from FDMEE.
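Conceptually, a drill-through request is just a parameterized HTTP GET against a handler exposed by the source system. The sketch below builds such a URL; the endpoint and parameter names are hypothetical, chosen purely to illustrate the mechanism, and are not an Oracle API.

```python
from urllib.parse import urlencode

def build_drill_through_url(base_url, entity, account, period):
    """Build a hypothetical drill-through request URL.

    FDMEE sends drill through as an HTTP request; the source system
    must expose a handler that accepts the intersection parameters.
    Parameter names here are illustrative only.
    """
    params = {"entity": entity, "account": account, "period": period}
    return f"{base_url}?{urlencode(params)}"

# Example: drill through on a single intersection (all values made up).
url = build_drill_through_url(
    "https://gl.example.com/drill", "US_East", "4110", "Jan-2018")
```

Any system that can route such a request to a query over its transactional detail could, in principle, serve as a drill-through target.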

Cloud Data Management

Cloud Data Management is intended to allow an organization to adopt a pure Cloud solution for Oracle EPM deployments. Cloud Data Management is a module within the Oracle EPM Cloud Services, built on the same code line as on-premises FDMEE. Cloud Data Management can integrate flat files and includes all of the on-premises FDMEE transformation capabilities, including SQL mapping, which can accommodate complex transformations. It includes the prebuilt logic to natively load data to each of the Oracle EPM Cloud Service offerings. Cloud Data Management can also integrate with other Oracle Cloud Service offerings, including the ability to source data from and load data back to Fusion G/L Cloud, and it can source data from other Oracle EPM Cloud Services.

Differences between FDMEE and Cloud Data Management

While the main transformation and load capabilities of FDMEE are available within Data Management, some of the key features of an on-premises deployment of FDMEE have been disabled.

The table below highlights the availability of FDMEE features in Cloud Data Management:

  Feature                                           Available in Cloud Data Management
  ------------------------------------------------  ----------------------------------
  Pre-built connection to Oracle branded ledgers    No
  Pre-built connection to SAP ERP and BW            No
  Direct connection to relational data sources      No
  Data Synchronization (Hybrid Mode)                No
  Import, Custom and Event Scripting                No
  Custom Reports                                    No
  Flat File Processing                              Yes
  Pre-built connection to Oracle Fusion GL          Yes
  Multi-Period Processing                           Yes
  Data Synchronization (Full Cloud Mode)            Yes
  Drill Through                                     Yes

A key feature that is not available in Cloud Data Management is the ability to connect to on-premises systems. This applies both to the systems for which Oracle has created adaptors and to those that require additional application configuration. For example, if an organization uses Oracle E-Business Suite (EBS) or SAP General Ledger, Cloud Data Management cannot connect to either of those systems. To integrate data from on-premises systems into Oracle Cloud Service products such as EPBCS or FCCS using Cloud Data Management, a process needs to be developed to transfer a flat file to the Cloud instance and then invoke a Cloud Data Management process. While this is certainly achievable using Oracle's EPM Automate command line utility, many organizations prefer to avoid flat file integrations when possible.

Role of EPM Automate: Cloud Data Management is most often used in conjunction with a lightweight on-premises command line utility known as EPM Automate. At a minimum, EPM Automate is required to transfer data files to the Cloud Service instance in which Cloud Data Management is deployed; however, multiple EPM Automate commands can be chained together to create a lights-out, end-to-end data integration process. A data integration task flow may contain the following steps:

1. Log in to the Oracle EPM Cloud Service instance

2. Upload a data file

3. Initialize a Cloud Data Management routine to process the data file

4. Execute calculations in the EPM Cloud Service product (e.g. EPBCS)

5. Execute a Financial Reports (FR) report in the EPM Cloud Service

6. Download the report output to an on-premises location

7. Log out of the Oracle EPM Cloud Service instance

8. Generate and send an email with the FR report attached
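The task flow above can be sketched as a sequence of EPM Automate invocations. The commands used below (login, uploadfile, rundatarule, runbusinessrule, downloadfile, logout) are real EPM Automate commands, but every user name, URL, rule name, and file name is a hypothetical placeholder, and steps 5 and 8 (report execution and email) are omitted for brevity since they involve tooling beyond EPM Automate itself.

```python
# Assemble the EPM Automate calls as argument lists suitable for
# subprocess.run(). Nothing is executed here; this just shows the
# shape of a lights-out flow. All argument values are illustrative.
def build_task_flow(user, password, url, data_file, rule, report):
    return [
        ["epmautomate", "login", user, password, url],           # 1. log in
        ["epmautomate", "uploadfile", data_file],                # 2. upload data
        ["epmautomate", "rundatarule", rule, "Jan-18", "Jan-18",
         "REPLACE", "STORE_DATA", data_file],                    # 3. run DM rule
        ["epmautomate", "runbusinessrule", "AggAll"],            # 4. run calc
        ["epmautomate", "downloadfile", report],                 # 6. download output
        ["epmautomate", "logout"],                               # 7. log out
    ]

flow = build_task_flow("admin", "secret", "https://epm.example.com",
                       "actuals.csv", "DM_ACTUALS", "StatusReport.pdf")
# Each entry could then be run with subprocess.run(cmd, check=True),
# aborting the flow on the first non-zero return code.
```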

EPM Automate helps Cloud Data Management function as a more fully integrated solution for the Oracle EPM Cloud Services. The utility is not limited to Cloud Data Management; it can be, and often is, used in on-premises FDMEE deployments as well.

Further Comparisons:

Scripting: Another feature of FDMEE that is not available in Cloud Data Management is scripting. Scripting allows the FDMEE application to be extended beyond the standard out-of-the-box features, enabling common functions such as connecting to a relational repository to extract data or generating email status messages for lights-out data processes. FDMEE scripting uses either Visual Basic or Jython. Both languages can interact with the operating system, including that of the FDMEE application server, which creates a significant risk in a Cloud-based deployment: a malformed or malicious script could cripple an entire Cloud deployment. Because the potentially harmful functionality cannot be selectively removed from either language, Oracle has simply disabled all scripting in Cloud Data Management.

The inability to use scripting reduces the capabilities of Cloud Data Management. Data integration is often an overlooked part of an Oracle EPM project despite its criticality. Integrations can be complex, as data can be sourced from numerous systems in a variety of formats. While many systems can produce data files that are easily consumed by Cloud Data Management, others produce files that require additional processing before they can be consumed. This is generally achieved through scripting. Since Cloud Data Management does not support scripting, any additional manipulation of a source system extract must be accomplished either by another application or process or through manual intervention. Manual intervention is suboptimal and generally avoided by most organizations due to the added complexity and data risk; the scripting capabilities of FDMEE help to eliminate this risk.

Custom Reports: Another feature of on-premises FDMEE that is not available in Cloud Data Management is the ability to create custom reports. FDMEE and Cloud Data Management are among the few products in the Oracle EPM stack that come with a number of preconfigured reports. These reports provide access not only to the financial data processed through FDMEE/Data Management but also to critical process and audit information, including map modifications. While Oracle has done a remarkable job delivering reports with significant value, there are instances where the reports need enhancement, or where a new report is needed to address a specific requirement. Unfortunately, Cloud Data Management does not provide the ability to change or author reports.


Unfortunately, there is some level of misunderstanding or misinformation in the marketplace about the capabilities of FDMEE and Cloud Data Management. One of the biggest misconceptions concerns the drill-through-to-source-system capability. The common belief is that drill through to the source system is only available for data that was loaded through on-premises FDMEE using one of the source system adaptors for E-Business Suite (EBS), PeopleSoft, J.D. Edwards, or SAP.

This is incorrect on two fronts. First, the adaptors are used solely to extract data from the source system; the use of an adaptor is entirely separate from the capability to drill through. Drill through is available for any source system that supports a web request for data, even systems from which FDMEE or Cloud Data Management has not sourced data. Second, and in concert with the first point, drill through is absolutely available from Cloud Data Management to on-premises systems, even though Cloud Data Management does not support the use of source system adaptors to integrate data from on-premises systems. While there are certainly design and configuration requirements to support drill through from Cloud Data Management, it is available and supported by Oracle.

Cloud Adoption Considerations for Multi-Product EPM Organizations

The inability of Cloud Data Management to connect to on-premises systems also applies to Oracle EPM applications such as Hyperion Financial Management (HFM) and Hyperion Planning. Many organizations are 'walking' their EPM landscape to the Cloud; in other words, organizations with multiple Oracle EPM products adopt the Cloud in a stepped fashion. For example, an organization may transition its Hyperion Planning deployment to EPBCS before transitioning its HFM application to FCCS. Many organizations have been comfortable adopting PBCS while waiting for FCCS to continue maturing.

Another reason organizations take a stepped approach to the Cloud is that transitioning both financial close and planning processes at the same time creates risk. This risk can manifest itself in any of the three project levers:

Scope: Migrating multiple processes at the same time can make for a long, complex project. While the mantra that the Cloud is faster and easier is often promoted, the reality is that a Cloud project can have just as many complexities as an on-premises deployment. Moreover, the prebuilt value-add content of the Cloud services can often mean an adjustment to the existing business processes used in the Oracle EPM applications.

Timeline: Moving multiple processes to the Cloud simultaneously adds timeline risk. Moving a single process, such as financial planning, allows an organization's resources to stay focused on a single project work stream. The key team members involved in an EPM project tend to overlap, at least in some fashion, even for processes as distinct as financial close and financial planning. Undertaking a project to move multiple processes to the Cloud concurrently requires these resources to spread their efforts across more than one work stream, which can lengthen the overall project timeline and add the risk that project milestones are missed due to resource constraints.

Budget: Transitioning multiple processes to the Cloud concurrently can be more expensive than a multi-phased transition. One might argue that the costs should be equivalent or even lower when multiple streams are migrated concurrently, since certain project tasks such as testing and migration would benefit from being performed once rather than across multiple projects. However, as noted above in relation to timeline risk, projects that migrate multiple business processes to the Cloud generally leverage more external resources (consultants) because of internal resource constraints, and this heavier reliance on external resources tends to outweigh the savings from concurrency.

As a result of these risks, organizations often find it advantageous to split the move to the Cloud into multiple projects that do not run in parallel. This means that an organization will operate in a hybrid mode - a mix of on-premises and Cloud applications. Often there is a need to exchange data between the close solution and the financial planning solution. Cloud Data Management enables the exchange of data between Oracle EPM Cloud Services (e.g., FCCS to EPBCS); however, it does not provide native functionality to achieve this data exchange in a hybrid mode. By contrast, on-premises FDMEE natively provides the ability to exchange data between on-premises Oracle EPM systems and Oracle EPM Cloud Service products. While processes can certainly be created to exchange data between on-premises EPM systems and EPM Cloud Service applications, doing so requires custom development and often in-depth knowledge of the application programming interface (API).

Decision Factors

For an organization facing the choice between on-premises FDMEE and Cloud Data Management, there are a variety of factors that contribute to the final decision.

Software Ownership

First, an organization needs to determine whether additional software must be procured. Organizations that currently have the Oracle EPM stack deployed on-premises, especially HFM, quite likely already own FDM Classic (the prior generation of FDMEE). Any organization that has licensed FDM Classic is entitled to FDMEE without additional licensing expense. FDMEE licensing is fairly straightforward: the software is licensed on a per-user basis by specific functionality. Every organization that procures FDMEE pays for the core software itself; this portion of the licensing provides the software but does not include the functionality to connect to source systems using the Oracle preconfigured adaptors or to the Oracle EPM products. Each of these is licensed separately. In addition to the core software license, there are three licensing options in the FDMEE functionality tier:

1. HFM Adaptor: Provides the logic needed to natively integrate with HFM on-premises deployments

2. Adaptor Suite: Provides the logic for multiple integrations. First, it includes all of the Oracle source system preconfigured adaptors (EBS, PeopleSoft, J.D. Edwards E1). Second, it includes the ability to integrate with on-premises Hyperion Planning and Essbase applications. Finally, the adaptor suite provides the ability to integrate from an on-premises FDMEE deployment to the Oracle EPM Cloud Service products that leverage an Essbase data repository, i.e., PBCS, FCCS, and PCMCS; TRCS is also on the roadmap. While ARCS does not leverage an Essbase data repository, on-premises FDMEE can integrate with it using scripting. The adaptor suite must be licensed to integrate with the Oracle EPM Cloud Services.

3. SAP Adaptor: Provides the ability to source data from SAP General Ledger (ECC or R/3) as well as SAP Business Warehouse (BW). This source adaptor is licensed separately because Oracle pays royalties to the partner (BristleCone) that developed and maintains the adaptor. The need to procure FDMEE can certainly be restrictive, especially for organizations attempting to adopt a pure Cloud model.

There are, however, numerous factors that should be weighed before abandoning the idea of procuring on-premises FDMEE.

Number of Cloud Services

Another important decision point in the choice between on-premises FDMEE and Cloud Data Management is the number of Oracle EPM Cloud Services to which data needs to be integrated. This concern is part of an organization's overall EPM integration strategy. For organizations that have, or anticipate having, multiple Cloud services (e.g., PCMCS and PBCS), on-premises FDMEE is worth exploring. Integrations in Cloud Data Management are specific to the instance to which Data Management will load data. For example, if Cloud Data Management within the PBCS instance is used to load data to PBCS, that Cloud Data Management application cannot be used to load data to the PCMCS instance; a completely discrete Cloud Data Management application within the PCMCS instance would need to be created to load data to PCMCS.

An extra layer of complexity arises when data needs to move between instances, for example, loading actual consolidated results from FCCS to EPBCS. The Data Management application within the FCCS instance cannot push data to the EPBCS instance; data must be pulled from FCCS using the Cloud Data Management application within EPBCS. Conversely, if budget or forecast data needs to be loaded to FCCS, the pull of data from EPBCS must be initiated from the Cloud Data Management application within the FCCS instance. This need to exchange data between applications highlights a current shortcoming of Cloud Data Management: the user initiating the action must always determine from which instance the integration needs to be performed and log in accordingly. While this is not a show stopper, it is definitely a training issue and could be viewed as a suboptimal end user experience.

                        Lastly, it should be noted that the inability to share core metadata (including security) objects across Oracle EPM Cloud Service instances results in duplicative maintenance of those items across the multiple Cloud Data Management applications. On-premises FDMEE by contrast has the ability to connect to multiple Cloud instances as well as on-premises EPM environments from a single FDMEE application. This allows data to be loaded to and exchanged between applications using a single, common application. Since on-premises FDMEE is a single application, core application elements including security can be shared across different integrations.

Complexity of Integrations

                        Below are some questions that need to be considered when deciding between on-premises FDMEE and Cloud Data Management:

• Will my integrations need to be fully automated, including email status alerts? If so, on-premises FDMEE will be preferred.

• Are my data files in a consistent format - that is, can the raw data file be imported into Excel with each field populated consistently in every data row? If so, Cloud Data Management will very likely be able to process the file; if not, FDMEE scripting may be required to properly process the data.

• Does the organization expect to need custom reports? This is hard to know with a high level of certainty. For large organizations with rigorous audit requirements and users with highly specific reporting requests across other EPM products, the ability to generate custom reports will likely be necessary, and on-premises FDMEE would be required. That said, the majority (80%+) of organizations find the out-of-the-box FDMEE/Data Management reports sufficient for their needs.
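The "consistent format" question above can be checked mechanically: every data row in the file should carry the same number of delimited fields. A minimal, generic sketch (the sample data and delimiter are arbitrary, not tied to any particular source system):

```python
import csv
import io

def is_consistent(text, delimiter=","):
    """Return True if every non-empty row has the same field count.

    A file that passes this check can usually be consumed by Cloud
    Data Management without pre-processing; one that fails likely
    needs scripting or an upstream fix.
    """
    rows = [r for r in csv.reader(io.StringIO(text), delimiter=delimiter) if r]
    return len({len(r) for r in rows}) <= 1

# A well-formed extract vs. one with ragged rows.
good = "Entity,Account,Amount\nUS,4110,100\nUK,4110,200\n"
bad = "Entity,Account,Amount\nUS,4110\nUK,4110,200,extra\n"
```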

True Cost of Ownership

To properly calculate the true cost of ownership of FDMEE or Cloud Data Management, we need to consider not only the financial expenditures but also the human capital expense associated with owning and maintaining a solution. Cloud Data Management is often presented as having a lower total cost of ownership; from a financial expenditure perspective, this is very likely true.

First and foremost, Cloud Data Management does not introduce any incremental software cost: the Data Management module is included in the subscription price for the Oracle EPM Cloud Service instance. While the Cloud subscription is a recurring cost similar to annual maintenance, it will be paid regardless of the integration approach used, so the initial software cost and ongoing annual maintenance cost of on-premises FDMEE must be considered. Second, an on-premises FDMEE deployment requires additional hardware, physical or virtual. It can sit within the organization's data center or be installed on hardware owned by an Infrastructure as a Service (IaaS) provider such as Oracle. Depending on the hardware deployment method, there is capital and depreciation expense or operating expense that must be incurred.

Finally, the total cost of ownership analysis often includes a discussion of the cost of future upgrades, which can be a somewhat flawed analysis. While the prior two components are certainly valid and factor into the upgrade discussion, the upgrade point is somewhat distorted. Oracle has stated that there will be one more major release of the on-premises software (version 11.2) that will be supported through December 2030, and any on-premises FDMEE deployment that integrates with the Oracle EPM Cloud Services must be on the most current version. This means there would be only one significant version upgrade of an on-premises FDMEE deployment. While patch set updates will be released at certain intervals, organizations can defer an upgrade or patch set update until a time that aligns with their needs and business processes.

Conversely, the Cloud is patched every month, with new functionality being introduced. These updates do not always apply to the Data Management module, but when they do, a certain level of regression testing is warranted to ensure the patch did not impact any functionality currently in use. Moreover, a full version upgrade of the Cloud has yet to occur; when such a major upgrade does occur in the EPM Cloud Services, an organization will need to perform the same level of testing as with an on-premises deployment. The core argument against Cloud upgrade costs being lower is that the Cloud must be tested more frequently: each time the Cloud is patched or updated, time is required to evaluate whether testing is needed and, if so, to actually execute the tests. An organization cannot defer Cloud updates for more than one or two cycles and so will consistently need to test. In contrast, an on-premises FDMEE application can be upgraded on a cycle defined by the organization; if no new functionality or bug fix is required, an upgrade can be deferred indefinitely. As a result of the more frequent and mandatory testing cycles, the true cost of upgrades in the Cloud is higher because administrators undertake testing activities more often.

Software and hardware costs are an important part of the TCO analysis; however, the analysis should also include the human cost. As previously noted, Cloud Data Management lacks certain features: direct connections to on-premises systems, automated email alerts, and bi-directional hybrid or cross-Cloud-Service data movement. The lack of these features often means more manual maintenance and workflow executions for administrators and end users, which erodes productivity and certainly has a cost associated with it. Moreover, confusion about which Cloud Data Management application to use for data movement between the EPM Cloud Services can frustrate end users and administrators. This suboptimal experience can affect the perceived value of the EPM Cloud Services and should also be considered in the TCO analysis.

Existing ETL Standards

Some organizations have a corporate ETL standard, usually a system such as Informatica or ODI. Prior to the arrival of the Cloud, these standards were sometimes not applied to the Oracle EPM suite of products - in particular Hyperion Financial Management (HFM) - since FDM Classic and later FDMEE were purpose-built to allow end users to integrate data into these systems. The on-demand need for data throughout a financial close cycle was often better served by FDM/FDMEE. With the introduction of the Oracle EPM Cloud Services, this decision must be assessed again, particularly as it relates to maintaining on-premises software. The requirements that drove the use of FDM Classic or FDMEE - drill back, on-demand execution, end user maintenance of transformation logic (mapping) - are certainly still valid; however, the features that Cloud Data Management lacks may be supplemented by an existing ETL tool.

Consider this example: an organization needs to integrate data from an on-premises Oracle PeopleSoft general ledger to PBCS. One option would be to utilize on-premises FDMEE and its native connection to PeopleSoft to source and load the data to PBCS. Using batch automation and scripting, an email status report would be generated, and the process could be initiated on demand or scheduled to run at set intervals. But if the organization has not procured on-premises FDMEE licenses, or is looking to retire that licensing, then the ETL tool on which the organization has standardized may be used in concert with EPM Automate and Cloud Data Management to achieve an FDMEE-like integration.

A solution using the latter may look something like the following:

1. ETL tool executes a procedure to query data from PeopleSoft and generate a flat file.

2. ETL tool uploads the output to the Cloud Service instance using EPM Automate.

3. ETL tool initializes a Cloud Data Management workflow process using EPM Automate.

4. ETL tool executes a Cloud Data Management report to determine workflow status using EPM Automate.

5. ETL tool downloads the report output using EPM Automate.

6. ETL tool emails status report to required recipients.

The latter option is nearly on par with an on-premises deployment of FDMEE, with a couple of caveats. First, the sourcing of data from PeopleSoft is provided out of the box with FDMEE, whereas in an ETL/Cloud Data Management approach, the extract query must be defined and maintained by the organization. Second, the email status generated by an ETL tool would not be as detailed or robust as that of an on-premises FDMEE solution, for multiple reasons.

First, EPM Automate has limited return codes. The execution of a Cloud Data Management workflow process returns a success or a failure, like most command line executions. However, success in this instance does not indicate that the workflow process completed without error; it simply means that the initialization of the workflow process was successful. To determine the status of the workflow execution, a Cloud Data Management report must be run. Second, the workflow status report provides workflow status but does not include detailed error information in the event of a failure. That information is housed in the Cloud Data Management relational repository, but reports exposing it are not currently available, and, as previously noted, custom reports cannot be created for Cloud Data Management. In contrast, an on-premises FDMEE deployment provides both the ability to create custom reports and access to the relational repository in which the detailed error information is contained; FDMEE scripting or custom reports can be used to deliver this more detailed and actionable information to the appropriate users.
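The return-code limitation above can be made concrete: automation must treat a zero return code as "workflow initialized" only, and read the real outcome from the downloaded status report. A hedged sketch follows; the report layout parsed here is a hypothetical illustration, not an actual Oracle report format.

```python
def workflow_succeeded(init_returncode, report_text):
    """Interpret Cloud Data Management status correctly.

    A zero return code from the workflow kickoff only means the
    workflow was initialized; the real outcome must be read from a
    downloaded status report. The 'Workflow Status:' line format
    below is invented for illustration.
    """
    if init_returncode != 0:
        return False  # workflow never started
    # Scan the (hypothetical) report for the final workflow status line.
    for line in report_text.splitlines():
        if line.startswith("Workflow Status:"):
            return line.split(":", 1)[1].strip() == "SUCCESS"
    return False  # status line missing: treat as failure

# A zero return code alone is not enough; the report decides.
ok = workflow_succeeded(0, "Run ID: 42\nWorkflow Status: SUCCESS\n")
bad = workflow_succeeded(0, "Run ID: 43\nWorkflow Status: FAILED\n")
```

An ETL tool orchestrating Cloud Data Management would apply a check like this before emailing a success message to users.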


Data integration within the Oracle EPM Cloud Services is an extremely important topic. We explored the on-premises and Cloud-based options, including the features and functionality of both, as well as important considerations such as the total cost of ownership. The last remaining question is: which tool should my organization use? Unfortunately, there is no single answer.

On-premises FDMEE functions as a data hub within the EPM environment. Data can flow seamlessly between on-premises applications (ERPs, data warehouses, on-premises EPM systems) and the Oracle EPM Cloud Services. Integrations can be fully automated and centralized to a single touch point. This functionality, however, comes at an additional cost to the organization. Conversely, Cloud Data Management has no additional software licensing costs or infrastructure overhead; that said, there is a human capital cost of ownership associated with its reduced features and functionality.

Organizations with straightforward integration requirements - no automated alerting, consistent file formats, no integration with on-premises systems, loading into a single Cloud service - may find the built-in nature of Cloud Data Management compelling. Organizations that have a multitude of integrations with varying degrees of complexity, a need to integrate from on-premises systems (including Oracle EPM products), a need to streamline integrations through advanced automation, a need to integrate data into multiple Oracle EPM Cloud Services, or any combination of these factors will certainly favor an on-premises deployment of FDMEE.

                        Any organization facing the decision of which integration toolset best addresses their needs should consider each of the factors highlighted above and weigh them against the financial and human costs of each potential solution.


February 19, 2018

Chatting with Bots - More a Necessity than Science Fiction

In an age where multiple applications are involved in the supply chain process, knowledge about customer orders is distributed. Keeping the customer updated on the progress of their order has become a tightrope walk between the value of the information and the cost of providing it via a customer service team or a complex BI solution. This blog explores a cost-effective and lightweight alternative: the 'Chatbot'.

The IT landscape involves multiple applications to fulfil every single order, due to the nature of the business, the way the organization has evolved, the number of business entities involved, or the specialization of the applications. Below is an example for a manufacturing and retail organization.

Pic 1 - Typical IT landscape

In this complex matrix, the traditional methods to keep the customer updated about the progress of their orders are as follows:

  • Send text message or email about the status
  • Set up a customer service team to handle customer requests via call, text, email or chat

The drawback of these conventional methods is that no single system holds the moment of truth about every order. To avoid the customer service team juggling between applications, complex BI reports are built to oversee all the applications, resulting in an even more complex IT landscape.

An alternative solution is the 'Chatbot'. According to Wikipedia, a chatbot is a computer program which conducts a conversation via auditory or textual methods. Customers can chat with the Chatbot to get information about their orders. Let's see why the Chatbot solution is attractive.

Implementing the Chatbot:


There are 2 main functionalities of Chatbots:

  • Receive and understand what the customer is saying, and
  • Retrieve the Customer information required

To receive and understand what the customer is saying via chat, the Chatbot uses natural language processing (NLP). Through artificial intelligence and machine learning, the Chatbot is trained to understand the customer's requests better. Numerous cloud-based chatbot development platforms can be leveraged to design, build and train Chatbots; Oracle Cloud Platform and IBM Watson are examples of such Platform as a Service (PaaS) solutions.

Pic 3 - Example of a chat conversation in mobile
For retrieving the required information, the Chatbot uses web services to connect with each application. For example, Order Management Cloud has an order import web service which can be invoked using the retail order number, and similar order information web services can be created for the other applications. The Chatbot invokes these web services, determines the latest status across all the applications, and publishes it to the customer.
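The aggregation step just described can be sketched in a few lines. This is a hypothetical illustration, not Oracle's API: the service functions, order number, and statuses below are invented stand-ins for the per-application order-status web services.

```python
# Illustrative sketch: each application exposes an order-status lookup,
# stubbed here; a real Chatbot would call the actual web services.
def order_management_status(order_no):
    return {"system": "Order Management", "status": "Shipped", "updated": "2018-11-02"}

def warehouse_status(order_no):
    return {"system": "Warehouse", "status": "Dispatched", "updated": "2018-11-03"}

def aggregate_order_status(order_no, services):
    """Invoke each application's service and return the most recent status."""
    results = [svc(order_no) for svc in services]
    # The most recently updated system holds the status worth telling the customer.
    return max(results, key=lambda r: r["updated"])

latest = aggregate_order_status("RET-1001", [order_management_status, warehouse_status])
print(f"Your order is {latest['status']} (as of {latest['updated']})")
```

The NLP layer would map the customer's free-text question to this lookup; the aggregation itself stays this simple.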

With NLP and web services, implementing a Chatbot solution is easier than ever.

Chatbots are not as bulky and intrusive as traditional BI solutions. They occupy little space on a server and can easily be hosted in the cloud as well.

Customer Experience:

Customer Experience, in short CX, is a major focus area for organizations. With referral customers giving more business than new customers, organizations want every customer to be handled with care. The Chatbot gives customers an experience much like chatting with a human.

The Chatbot can chat in different languages as preferred by the customer. In addition, the Chatbot can be trained to reply to text or voice commands as well.

The Chatbot can be used on a computer, tablet, or mobile phone, giving the customer excellent convenience.

Capex, What Capex?

Setting up a multi-language customer service team running 24 x 7, or implementing a complex BI solution, is far more costly for the organization. The cost and time to implement a Chatbot are far less compared to the traditional methods. Ready-made Chatbots are available which are already designed and built to a general extent; the implementation is then limited to invoking the order information web services from the various applications and training the Chatbot.


Chatbots can also be used to expedite an order if the customer requires it. The Chatbot can send mails to the production team in the manufacturing facility with the chat history to ensure that the order is expedited.

With the technical advancements, Chatbots are even helping patients who suffer from Alzheimer's disease and insomnia.

To summarize, Chatbots are easy, simple, and lightweight applications that solve the major problem of keeping the customer engaged. So if you are chatting on a website to know the status of your order, you may be chatting with a robot already!

January 22, 2018

Vroom Vroom... with the Infosys Automotive Solution

The Automotive industry has largely been ahead of the innovation curve, bringing more technology into the vehicle to meet the needs of the market. But all this while, it has been challenged by its own archaic systems. Good customer experience does not just mean good client-facing applications; the entire supply chain has to be customer oriented. Each of the supply chain elements needs to be integrated to get the part or vehicle to the right place at the right time.

At the risk of sounding cliché, an automotive supply chain has its own complexities which are not always intuitive to anyone who does not live and breathe this industry. This is where the Infosys Automotive Solution, crafted and perfected over the years, caters to such specific supply chain challenges.

1.       Supersessions: This is where the rubber hits the road. Almost every leading ERP product in the market has functionality to define supersessions, but is it integrated with the entire process? The answer will be "No". The complexity does not end with ensuring we always sell the oldest part in the supersession chain; we must also buy the latest part in the chain, and ensure that the end-of-life and forecasting processes for the product chain are tied together. Even from a pricing perspective, how is the solution going to align prices along the chain, or create incentives to drive buying behavior from dealers?

2.       Referrals: Referral is a concept beyond promising. How does one ensure we refer to the next nearest warehouse to meet the demand, so that customer experience does not take a hit? While doing this, how do we keep the logistics cost minimal? How do we ensure we follow milk-run routes, or do rate shopping in real time? How do we ensure routes are combined together? Traditionally these problems are solved through transport integrations, but many have solved them too far downstream.

3.       Fair share: When the entire network is in a back-order situation and there are continuing supply constraints, how do we ensure that incoming supplies are transferred and fair-shared across all distribution centers? Should the allocation be based on FIFO, customer priority, etc.? These are problems that applications have continued to ignore, dismissing them as execution problems.

4.       Slotting: Warehouse space is real estate; how do we ensure that the fastest-moving goods are always picked fastest? And will the fastest always remain the fastest, or will there be seasonality and trends which we have to cater to? Slotting has to be a continuous, proactive process.

5.       Dealer incentives: This is an important part of the supply chain, often ignored. Supply chains are like humans: unless we build in incentives, we won't be able to drive the required behavior from the supply chain constituents. The big question is what to stock in client-facing dealer inventories versus middle-level warehouses versus central warehouses. At the end of the day, inventory budget and customer service levels will drive the decisions, but a dealer would only be concerned about their own profitability.
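One possible fair-share policy for point 3, allocation proportional to each distribution center's open back-orders, can be sketched as follows. The DC names, quantities, and the remainder rule are invented for illustration.

```python
# Hypothetical proportional fair-share allocation of a constrained supply.
def fair_share(supply, backorders):
    """Allocate `supply` units across DCs in proportion to their back-orders."""
    total = sum(backorders.values())
    alloc = {dc: (supply * qty) // total for dc, qty in backorders.items()}
    # Hand out any rounding remainder one unit at a time, largest back-orders first.
    remainder = supply - sum(alloc.values())
    for dc in sorted(backorders, key=backorders.get, reverse=True)[:remainder]:
        alloc[dc] += 1
    return alloc

print(fair_share(100, {"DC-East": 120, "DC-West": 60, "DC-South": 20}))
# {'DC-East': 60, 'DC-West': 30, 'DC-South': 10} - each DC gets half its back-order
```

A FIFO- or priority-based policy would simply replace the proportional weights with order age or customer tier.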

While we covered some of the nuances of the automotive spare-part supply chain, there are many more such niche challenges which are unique and have been built into the Infosys Automotive Supply Chain Solution. The solution not only covers the spare-part supply chain but also caters to the vehicle business, and we have a solution flavor catering to Tier-1 suppliers as well. To know more, reach out to us at the Oracle Modern Supply Chain event at San Jose (@OracleMSCE @Infy) from 29-31 January 2018.

December 29, 2017

Comparative Study Between Oracle Big Data Cloud Service and Compute Engine




1.             Oracle Big Data Cloud Service: Gives us access to the resources of a preinstalled Oracle Big Data environment. It also comes with a full installation of the Cloudera Distribution Including Apache Hadoop, with open-source Apache Hadoop and Apache Spark. It can be used to analyze data generated from social media feeds, e-mail, smart meters, etc.

OBDCS contains:

·         A 3-60 node cluster; 3 is the minimum number of cluster nodes (OCPUs) available to start with. The processing power and secondary memory of the cluster can be extended by adding compute nodes ("bursting").

·         Linux Operating System Provided by Oracle

·         Cloudera Distribution with Apache Hadoop (CDH):

-          File System: HDFS to store different types of files

-          MapReduce engine (YARN is the default for resource management)

-          Administrative framework; Cloudera Manager is the default

-          Apache Projects e.g. Zookeeper, Oozie, Pig, Hive, Ambari

-          Cloudera applications: Cloudera Enterprise Data Hub Edition, Impala, Search, and Navigator


·         Built-in Utilities for managing data and resource

·         Oracle Big Data Spatial and Graph

·         Oracle Big Data Connectors:

-          Oracle SQL Connector for HDFS

-          Oracle Loader for Hadoop

-          Oracle XQuery for Hadoop

-          Oracle R Advanced Analytics for Hadoop

-          Oracle Data Integrator Enterprise Edition


Typical Workflow of OBDCS: Purchase a subscription -> Create and manage users and their roles -> Create a service instance -> Create an SSH key pair -> Create a cluster -> Control network access to services -> Access and work with your cluster -> Add permanent nodes to a cluster -> Add temporary compute nodes to a cluster (bursting) -> Patch a cluster -> Manage storage providers and copy data

odiff (Oracle Distributed Diff) is an Oracle-developed tool to compare huge, sparsely stored data sets using a Spark application; it is compatible with CDH 5.7.x. The maximum file/directory size it can compare is 2 GB.
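odiff itself runs as a Spark application, but the underlying idea can be sketched in plain Python: hash fixed-size partitions of both datasets, then compare digests so that only mismatched partitions need a record-level comparison. The records and partition size below are invented for illustration.

```python
import hashlib

# Illustrative sketch of a partition-hash diff (the principle behind a
# distributed diff, here single-process rather than on Spark).
def partition_digests(records, partition_size=2):
    digests = []
    for i in range(0, len(records), partition_size):
        chunk = "\n".join(records[i:i + partition_size]).encode()
        digests.append(hashlib.sha256(chunk).hexdigest())
    return digests

def diff_partitions(a, b, partition_size=2):
    """Return indices of partitions whose contents differ."""
    da, db = (partition_digests(r, partition_size) for r in (a, b))
    return [i for i, (x, y) in enumerate(zip(da, db)) if x != y]

left  = ["r1", "r2", "r3", "r4"]
right = ["r1", "r2", "r3", "rX"]
print(diff_partitions(left, right))  # only the second partition differs
```

On Spark, each partition's digest would be computed in parallel on the executors, which is what makes the approach scale to huge data sets.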



Continue reading " Comparative Study Between Oracle Big Data Cloud Service and Compute Engine " »

November 9, 2017

Transaction matching of around two million records in under 5 minutes in ARCS

Oracle Account Reconciliation Cloud Service (ARCS) with Transaction Matching is a cloud-based reconciliation platform with pre-built configurations and adherence to industry best practices; a recommended solution to cater to your reconciliation and matching needs.

Transaction Matching is a module within ARCS which inherits the features that facilitate preparation and review of reconciliations.

  • Additionally, Transaction matching adds efficient automation of the detailed comparison of transactions in two or more data sources
  • The calculation engine allows for intuitive "netting" of transactions within each data source to provide output which is easy to work with
  • Flexibility in the setup and timing of the process allows you to minimize the effort during "crunch time" and reduce risk


Transaction Matching Use Cases

Typical Transaction Matching Use Cases are shown below.


Use Cases.jpg

Often clients need to match more than a million records between two source systems with complex match-set rules. We have seen clients spend hours trying to match them manually in Excel, or use solutions like an Access database or Oracle tables, which can be very time consuming and prone to data quality issues. We will share our experience and some insights on how we successfully loaded and matched two source files with around 2 million records in less than 5 minutes using the Transaction Matching feature of ARCS for one of our e-commerce clients.

Idea Inception

The client wanted to match up to 2 million records from their point-of-sale (POS) system against the details obtained from the merchant transaction system. They were using an Access database for this activity, which was giving them results in hours, and they reached out to Infosys to help them streamline this time-consuming and frustrating process.


Solution and Approach

Source Files.

1. Point of Sale transaction file.

    The POS file had 9 columns and was provided in txt format (a PDF report converted into a text file). Below is a snapshot of the same.


2. Merchant system transaction file

           The merchant system transaction file had 21 columns and was in csv format. Below is a snapshot of the file.


Matching rules

The client wanted the matching rules based on the condition that the card number and amount from the POS transaction file match the cardholder number and amount from the merchant transaction file, with the stipulation of a many-to-one transaction match: many transactions from the point-of-sale system match a single batch (grouped by amount) transaction from the merchant system file.
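The many-to-one rule can be sketched as follows. This is a simplified illustration, not ARCS match-rule syntax; the card numbers and amounts are invented.

```python
from collections import defaultdict

# Hypothetical sketch of the many-to-one rule: POS transactions are grouped
# by card number, and their summed amount is compared against the single
# batch line from the merchant file for the same card.
def many_to_one_match(pos_rows, merchant_rows):
    """Return card numbers whose POS total equals the merchant batch amount."""
    pos_totals = defaultdict(float)
    for card, amount in pos_rows:              # many POS transactions per card
        pos_totals[card] += amount
    batches = {card: amount for card, amount in merchant_rows}  # one batch per card
    return [card for card, total in pos_totals.items()
            if card in batches and round(total, 2) == round(batches[card], 2)]

pos = [("4111", 25.00), ("4111", 10.50), ("4222", 99.99)]
merchant = [("4111", 35.50), ("4222", 90.00)]
print(many_to_one_match(pos, merchant))  # ['4111']
```

In ARCS this grouping and comparison is declared as a match rule; the engine then evaluates it across the full 2 million records.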


Initial Challenges

The initial challenges with this requirement were:

1. Size of File.

    The files provided were huge: with 9 and 21 columns respectively and around 2 million records each, the file sizes exceeded 1 GB per file. A file this large is difficult to read and edit in any text editor.

2. Formatting

    Another, bigger challenge was formatting the given files as per ARCS Transaction Matching needs. The files provided were in text format, and reading and formatting them given their size was a tough nut to crack.


Infosys Solution

We took up this challenge and delivered as promised. The biggest challenge was to import the files containing about 2 million transactions each into ARCS Transaction Matching from both systems and match them automatically in quick time; other tools and custom solutions were taking hours for this process. Importing 2 million records in a csv file is a huge input for any system to ingest, and it would typically take anywhere between 15-30 minutes just to import one file. We had a further challenge in formatting the files, because the file we received was a .pdf converted into text format, and it needed to be converted into .csv to be accepted by ARCS Transaction Matching. We used Oracle ARCS TM, formatting tools, text editors, and the Oracle-provided EPM Automate utility to format the files, then automatically ingest and auto-match the files from the two transactional systems.


The EPM Automate utility enables Service Administrators to remotely perform tasks within Oracle Enterprise Performance Management Cloud instances and automate many repeatable tasks, like importing and exporting metadata, data, artifact and application snapshots, templates, and Data Management mappings.


Tips and Lessons Learnt

From the above implementation, we learned a few lessons; below are some tips for implementing a similar solution.

  • ARCS TM also accepts .zip format input files, hence compress the files into .zip format so that they are smaller in size plus quick and easy to upload on the ARCS cloud.
  • Use powerful text editors like Notepad++ or TextPad when formatting the files.
  • Create custom attributes which can be used in matching rules for faster auto-matching of transactions.
  • If possible, try to get the export from the transaction systems in .csv format to reduce conversion times.

Performance Metrics

Below are our performance metrics while implementing client's requirement of matching around 2 million records using Oracle ARCS Transaction Matching.


Import POS file (around 2 million records) - 27 seconds

Import merchant file (around 2 million records) - 61 seconds

Run Auto Match - 53 seconds


Complete Process - 2 minutes 21 seconds (Less than half of 5 minutes)




Happy client and Happy us.


We deliver! Please visit our company website to know more about our Account Reconciliation and Transaction Matching solutions.

September 28, 2017

OBIEE: An effective tool for quality control in Credit Bureau Reporting by Auto-Finance Companies

Auto-finance organizations in the US have to report the credit data of their customers every month to Credit Reporting Agencies (CRAs), i.e. Experian, Equifax, TransUnion, and Innovis, to comply with the FCRA (Fair Credit Reporting Act). For this they have automated software programs in place which extract account and consumer data from their source systems and transform/load the data as per defined business logic into data warehouse tables before it is finally sent to the CRAs in the Metro 2 file format. This process is called 'Credit Bureau Reporting'.

Continue reading " OBIEE: An effective tool for quality control in Credit Bureau Reporting by Auto-Finance Companies " »

September 25, 2017

Why Knowledge Management in Enterprise?

A KM solution in the enterprise ecosystem is a necessity rather than just an additional tool. While the world is talking about automation, digitization, and what not, a KM solution makes a huge contribution to enabling the enterprise to achieve its goals and meet its objectives. This quadrant explains where KM can really help the enterprise: Knowledge Sharing, Collaboration, Self Service, and Agent Productivity.


Knowledge Sharing: 

"Knowledge Sharing" is a huge problem across industries, teams, business units; you can keep adding areas to the list. Small to large companies spend millions of dollars training their employees every year, and the spend keeps increasing by almost 1% every year. Research also says that classroom training can develop only 10% of skills; the rest comes from on-the-job experience. Can we just deploy people on critical assignments post-classroom training? The answer is yes and no. "Yes", because that's how you make them learn on the job. "No", because you cannot leave them alone and/or exposed to the customer; you need to mentor, coach, and provide the right feedback at the right time. This is an issue in the service industry, corporates, product companies, and everywhere else. So the question is, can KM solve this issue? Probably yes. It gives a platform where people can learn and get help while on the job, and mentors and coaches can spend some of their time on other productive work.



Collaboration:

The idea of collaboration is not new; we see people talking about it all the time and in all places. But how does this help, why is it so important, and most importantly, how does KM help? Too many questions, and the answer is not very straightforward. Collaboration has a hidden return: it cannot be measured, but it can certainly be experienced. When we talk about a team, can each member work alone, with the individual pieces simply combined at the end to complete the assignment? In most cases, no. So collaboration is required: working together and making sure the team moves in the right direction from day one. Now what is KM? It is a knowledge repository, a brain dump of each and every one to help the many. So shouldn't the knowledge base be an invaluable asset? Enabling collaboration features in a KM solution helps contributors collaborate with peers, SMEs, and other colleagues to get their viewpoints, feedback, and input, ensuring that what goes into the repository is thought through, not something written by someone for no reason. Collaboration in the KM world brings KM maturity.

Self Service: 

Companies spend untold dollars to make sure the customer is "happy". This is not an easy task; many delivery systems have been researched and implemented over the years to make customers happy, and KM is one of them. KM has a huge role in customer service. Customers can call the helpdesk, and you can deploy very intelligent people to handle those calls. This is all good, but wouldn't it be better to avoid getting calls from customers in the first place? That is easy to say, but the question is how? Self service is one way to overcome this problem; it is also called "case deflection". Let customers find the right information themselves by delivering it through omni-channel; deflect the problem by exposing the information they need. Building a mature KB repository is difficult, but it brings huge value add in customer service.

Agent Productivity: 

The other side of customer satisfaction is how to increase the agent's productivity: how to make sure the agent spends less time resolving a customer issue while handling more and more calls. One way is to deflect the customer problem by enabling self service, but that doesn't work alone; the agent should be equally productive. To do so, KM should also be exposed to the agent. The agent actually needs a little more than just access to KM: it has to be contextual. We don't want the agent to scan through the entire repository to look for the right answer to the problem the customer is facing. Rather, the system should be intelligent enough to understand the context of the problem, like who the customer is, the area of concern, etc. Agents are also the best contributors to maturing the KM repository; they carry tons of information from working in diverse situations. So enable KM for agents to increase their productivity, get the best ROI, make customers happy, and progress towards a mature KM repository. Finally, all of this has been said in many places and many times, but we still have industries and companies lacking in this area. KM is an integral part of automation, digitization, and customer experience.

Who Can use It?

"Who needs Knowledge Management and why?" seems a difficult question, but the answer is simple: "Everyone needs it". Every enterprise needs Knowledge Management irrespective of the industry it belongs to. Below is why.

Thank You!

Shubhra Sinha

February 28, 2017

B2B CPQ eCommerce Solution for Professional Service Sector

A leading professional service provider is evaluating their business model to enable eCommerce for business sales. More importantly, it is being assessed during the transformation phase from on-premise to cloud. To me, it really looks like a game-changing move to evolve eCommerce DNA right at the inception of the platform change. It does not really matter whether the eCommerce solution is implemented immediately or at a later stage; the point is that the business has declared its intention, and the solution should be designed to give a configuration and fulfillment experience to the end customer even if it comes later. Oracle CPQ fits perfectly for their quote-to-order journey with its capability to accurately manage the innumerable factors that come with current B2B dealings. This blog aims to bring forward the reasons why CPQ factors into all this.


·         Uniformity across platforms eases usage: the end customer can start on a mobile device and finish on a tablet or a desktop. CPQ provides a consistent user experience and empowers easy usage across channels.

·         Attractive pricing is the key to great conversions. End customers can be grouped under various criteria, and offers can be given based on preemptive planning around pain areas; targeting the right customer at the right time is the end goal. CPQ offers part pricing, attribute-based pricing, and complex formula-based pricing. Additionally, the end customer has the capability to compare prices of selected products and services to make the appropriate decisions.

·         Guided Selling is a strong feature that navigates the end customer through complex product offerings in a simplified way, concurrently pitching up-sells and cross-sells. These are achieved through recommendation, hiding, and constraint rules based on organizational requirements and best practices learnt from experience. Intelligent recommendations become crucial to success.

·         A large chunk of transactions are repeat orders, where end customers can review their transaction history from the transaction page in CPQ.

·         A full-featured shopping cart eases usage with one-click checkout and easy modification of the quantity of selected offerings.

·         CPQ also provides easy access to various finer details essential for a typical eCommerce flow including shipping methods and payment terms

·         Personalized, branded proposals can be produced as per organizational needs.

·         CPQ Cloud supports integration with DocuSign, an industry-leading provider of electronic signature solutions, where documents can be electronically managed with real-time updates back to CPQ.

·         Payments are also supported, wherein CPQ eCommerce Transactions Cloud Service can integrate with various payment dispensation players.
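As a rough sketch of what the attribute-based and formula-based pricing mentioned above can look like: the parts, uplifts, and discount tiers below are invented for illustration and are not CPQ's actual rule syntax.

```python
# Hypothetical pricing sketch: a base part price adjusted by attribute rules
# and a volume-discount formula, the kind of logic a CPQ pricing engine applies.
BASE_PRICES = {"CONSULTING-DAY": 1200.0, "SUPPORT-PLAN": 300.0}

def attribute_uplift(attrs):
    """Simple attribute rules: region and service level adjust the price."""
    uplift = 1.0
    if attrs.get("region") == "EMEA":
        uplift *= 1.10          # 10% regional uplift
    if attrs.get("level") == "premium":
        uplift *= 1.25          # 25% premium service uplift
    return uplift

def quote_price(part, qty, attrs):
    unit = BASE_PRICES[part] * attribute_uplift(attrs)
    discount = 0.15 if qty >= 20 else 0.05 if qty >= 10 else 0.0
    return round(unit * qty * (1 - discount), 2)

print(quote_price("CONSULTING-DAY", 20, {"region": "EMEA", "level": "premium"}))
# 28050.0  (1200 * 1.10 * 1.25 per unit, 20 units, 15% volume discount)
```

In CPQ these rules would be configured declaratively against the product and transaction attributes rather than hard-coded.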


Culturing a do-it-yourself ethos leads to higher realized margins, and Oracle CPQ offers easy, high-quality buying. On top of that, Oracle CPQ provides eCommerce APIs to integrate with Commerce Cloud, in case you want to continue with it and avail the best of both worlds. In a nutshell, let customers buy the way they prefer - via sales agent, self-service, or partners - and in the channel they prefer! The rapidly growing Oracle CPQ surely enables an intuitive customer experience in all dimensions!

September 8, 2016

Internet of Things (IoT) in field service management (FSM)

In today's competitive world, real-time data and innovative service methods are vital for field service enterprises to ensure customer delight, increase revenues, and expand profit margins.

The IoT explained

The Internet of Things (IoT) allows machines to communicate with each other (M2M communication). It is built using a combination of networks that comprise data-gathering sensors, devices, big data, analytics, and cloud computing, communicating via secured and encrypted channels. Connected devices enable efficient predictive maintenance by constantly providing information on a machine's performance, environmental conditions, and the possibility of failures. IoT can connect machines on the field to record incidents in real time into a semi-intelligent 'Gen-X' FSM system.

Integrating IoT with FSM software applications

Field service organizations always strive to consistently provide the best service experience to their customers, by ensuring immediate repair and maintenance of their equipment and machinery. By collecting data about the machine's health and performance from IoT sensors, organizations can leverage predictive and preventive field service to minimize device downtime.

Three primary traditional FSM challenges

Here are three primary issues that challenge the current reactive scenarios:

    Field technicians execute the job and fix the equipment after the issue is reported. However, the delay can impact business continuity, which in turn affects the operating profit margins

    Adding more field technicians and service trucks to the field comes at a cost and sometimes the increased capacity remains under-used

    Assigning more work to existing field teams can have a negative impact on SLAs and first-time fix rates. Even worse, it can increase the cost of travel and overtime

Essentials of a new-age FSM solution

A field service management system that integrates device sensor data, technicians, customers, and technology is the key to address these issues. It should function in a predictive and preventive mode with the following features:

    The FSM process, which includes issue identification, communication, incident creation, scheduling, and assignment can be automated, thereby ensuring zero disruption in machinery operations and no or negligible downtime. This not only increases productivity, but also expands operating profit margins


    Most FSM products can also automate incident creation, scheduling, assignment, and invoicing processes. Using IoT, we can predict upcoming issues based on sensors data analysis and auto-creation of incidents based on preset threshold rules

The workflow of a FSM system with IoT integration

Here is an outline of the flow of incidents in a typical IoT-enabled FSM system:

1.   Data from the equipment's sensors is collected and transmitted, using secured and encrypted channels, to a big data storage

2.   Big data management and analytics are used to parse and analyze the raw sensor data into refined data

3.   The IoT command console is configured with predefined threshold rules to identify errors and monitor the device's health and performance

4.   Incidents are auto-created in the FSM system whenever errors are detected

5.   Auto-scheduling, routing, and dispatching of field service technicians against the incidents is done based on customer entitlements, location, product, skills required for the job, technician's availability, parts availability, etc. via the FSM system

6.   A field technician performs the job at the customer's site; records the effort, parts used, travel time, and any expenses incurred; and then bills the customer

Workflow of Field Service Management application using IoT.
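Steps 3 and 4 of the workflow above can be sketched as follows. The threshold rules, metric names, and incident fields are hypothetical, not any specific FSM product's API.

```python
# Hypothetical command-console threshold rules: an incident is auto-created
# whenever a sensor reading breaches its rule (workflow steps 3-4).
THRESHOLD_RULES = {
    "temperature_c": lambda v: v > 85,     # overheating
    "vibration_mm_s": lambda v: v > 7.1,   # excessive vibration
}

def check_readings(equipment_id, readings):
    """Return auto-created incidents for every breached threshold rule."""
    incidents = []
    for metric, value in readings.items():
        rule = THRESHOLD_RULES.get(metric)
        if rule and rule(value):
            incidents.append({"equipment": equipment_id,
                              "metric": metric,
                              "value": value,
                              "status": "OPEN"})
    return incidents

print(check_readings("TURBINE-07", {"temperature_c": 91.2, "vibration_mm_s": 5.0}))
```

The resulting incidents would then feed the auto-scheduling and dispatching step (step 5), carrying the equipment and metric context with them.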

Six Solution benefits

Wind turbines: A case in point of how IoT integrates with FS systems

Failures in wind turbines interrupt power generation leading to lower productivity and higher system downtime, which result in varying energy production and higher operating costs. To maintain profit margins, higher efficiency and uptime are required.

Near-real-time analytics provides data so that FS teams can react faster and address the issues before they become mission critical, thus reducing impact and avoiding downtime.

The wind turbine's sensors collect real-time data that is analyzed and through which, auto incidents are created, service scheduled, and an agent assigned to fix the issues. Wind turbine sensors are also used to continuously collect operating temperature, rotor acceleration, wind speed and direction, and blade vibrations - all of which can be used to optimize the turbine's performance, increase its productivity, and execute predictive maintenance to ensure reduced downtime.

Authors: Haresh Sreenivasa and R. N. Sarath Babu

Continue reading " Internet of Things (IoT) in field service management (FSM) " »

September 30, 2015

Enterprise DevOps and Oracle

Various organizations dread traditional development execution cycles because applications take longer to deploy, become more turbulent, and continually get worse, while IT operations are mired in firefighting mode all the time. DevOps is the perfect blend of software development and IT operations.

Organizational stakeholders have different needs on their minds: management demands faster time to market and quicker turnaround, the operations team wants ease of operations and flexible environments, and developers and testers are constantly striving for efficiency and visibility.

DevOps is the most appropriate answer to satisfy these stakeholders with a Continuous Delivery Approach.

Continue reading " Enterprise DevOps and Oracle " »

April 23, 2015

Accounting and Finance Shared Services- Oracle Fusion Capabilities for Record-to-Report and Analytics


Accounting and Finance Shared Services have typically been formed in large multinational organizations with the objectives of cost efficiency, standardization and operational effectiveness towards the customers within the organization. Some of the business processes handled by F&A SS are:

-          Procure to Pay

-          Order to Cash

-          Record to Report

-          Financial Planning, Consolidation, Budgeting and Forecasting

-          Treasury and Cash Management

-          Audit and Control

-          Analytics on the financial data

-          Financial closing and Financial statement preparation

Over the years, instead of merely being a cost center, shared services have been emerging as strategic business partners, offering innovation and value addition to various business functions. The extended objectives of F&A shared services include extensive analytics and reporting.

My previous blog covered the Oracle Fusion capability for the Procure-to-Pay cycle. It can be accessed here.

The General Accounting requirements - the Record to Report cycle, consolidation, reporting, and analytics - of a multinational, multi-GAAP organization using best-of-breed or legacy applications for sub-ledger transactions can be catered to by Oracle Fusion Accounting Hub (FAH) and Fusion General Ledger from the Fusion Financials suite. The Financial Reporting Center of FAH embeds multi-dimensional Essbase cubes and offers state-of-the-art reporting and analytics, enabling multi-dimensional analysis of transactional and accounting data.

FAH can integrate with and utilize the capabilities of a number of reporting tools like Oracle BI Publisher, Oracle Hyperion Financial Reporting Studio (Fusion Edition), and Oracle Transactional Business Intelligence. Drill-down to journal and transaction level is possible from any of the reports generated out of FAH, which enables a strong audit trail and control.

To record transactions from Oracle and non-Oracle applications and convert them into accounting entries, FAH offers a highly configurable rule transformation engine that generates the accounting entries, validates them, and transforms them into accounting journals. The rules can be configured based on the attributes of the transaction; for example, tax accounts could be determined based on the country of the transaction.

Different rules can be configured for different divisions complying with the local accounting regulations. FAH also captures additional attributes of the transaction for reference and reporting purposes.
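The rule idea can be sketched as follows. This is an illustration only, not FAH's actual rule engine or syntax; the account codes and tax rate are invented.

```python
# Hypothetical sketch of attribute-driven accounting derivation: a tax
# account is chosen from the transaction's country, and balanced journal
# lines are produced from the source transaction.
TAX_ACCOUNTS = {"US": "2210-US-TAX", "DE": "2210-DE-TAX"}  # invented accounts

def derive_entries(txn):
    """Turn a source transaction into balanced debit/credit journal lines."""
    tax = round(txn["amount"] * txn["tax_rate"], 2)
    return [
        {"account": "1200-RECEIVABLES", "debit": txn["amount"] + tax, "credit": 0.0},
        {"account": "4000-REVENUE", "debit": 0.0, "credit": txn["amount"]},
        {"account": TAX_ACCOUNTS[txn["country"]], "debit": 0.0, "credit": tax},
    ]

for e in derive_entries({"amount": 1000.0, "tax_rate": 0.19, "country": "DE"}):
    print(e)
```

In FAH the equivalent logic is configured as account derivation rules against transaction attributes, with the engine also validating that the resulting journals balance.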

FAH_Fin and Acc SS1.JPG

Some of the salient features of Fusion reporting are:

  • Fixed-format reports and local statutory reports can be created using Oracle BI Publisher and Financial Reporting Studio/Workspace
  • Interactive financial reporting and drill-down can be achieved with Oracle BI Publisher, Oracle Transactional Business Intelligence, Account Monitor, or Account Inspector
  • Seeded BI reports are delivered in OTBI
  • Live and interactive reporting, with multiple output options such as HTML, Excel, and PDF, is available in Smart View
  • Drill-down of report information up to the detailed journal and sub-ledger transaction level using Account Inspector
  • Bursting option available with BI Publisher to split a report and email or print it as per the selected output option
  • Functionality to embed charts and graphs in reports
  • Drag-and-drop report grids and dimensions

FAH also offers features for budgeting and budget reporting, consolidation, and inter-company processing. For an Accounting and Finance Shared Service, the advantage of FAH lies specifically in its fitness for integration with diverse sub-ledger applications, providing a robust accounting platform along with an impressive reporting and analytics solution.

Continue reading " Accounting and Finance Shared Services- Oracle Fusion Capabilities for Record-to-Report and Analytics " »

January 14, 2015

Hyperion Data Integration Tools - A Comparison

Continue reading " Hyperion Data Integration Tools - A Comparison " »

January 13, 2015

The new improved Multi-Load functionality in FDMEE

Continue reading " The new improved Multi-Load functionality in FDMEE " »

December 3, 2014

FDMEE - The future of Hyperion Integrations

Part 2

Mappings Redefined

Co-author: Payal Kapoor

In the previous part, we discussed FDMEE (Financial Data Quality Management, Enterprise Edition), the recently launched tool introduced in the EPM release.

In this part we are going to cover some great mapping features introduced in FDMEE that were not available in its predecessor FDM. 



Continue reading " FDMEE - The future of Hyperion Integrations " »

September 22, 2014

Capabilities of Siebel Mobile Disconnected Mode for Field Service

By Yusuf Kaydawala, CRM Consultant, Manufacturing Unit, Infosys


Out of the many exciting features in the latest Siebel Innovation Pack, the most interesting for me is the Siebel mobile disconnected mode. The Siebel mobile disconnected application allows users to access the CRM application in the browser on their tablets and smartphones even when they are not connected to the Siebel server. Once connectivity with the Siebel server is restored, the application synchronizes the changes back to the server. This application is ideal for a travelling salesperson or field service technician who may not have network access for much of the time.

Disconnected-mode functionality should work on all the latest HTML5-compliant mobile devices in the market. Users can engage disconnected mode on the Chrome or Safari browser with at least 50 MB of available storage on the mobile device.

I have mapped various tasks performed by a field technician with the capabilities of Siebel Mobile Disconnected mode:

Miscellaneous Tasks

  • Logging in to the mobile application - Supported
  • Logging off from the mobile application - Not supported
  • Viewing broadcast messages - Not supported
  • Creating bookmarks - Not supported
  • Invoking the menu - Not supported
  • Back button on the UI - Not supported
  • Using calendar features - Not supported
  • Exchanging data with external applications - Not supported
  • Creating orders - Supported

Preparing for a site visit

  • Creating Siebel messages - Supported
  • Viewing service requests - Supported for synced service requests
  • Viewing account details such as service history, assets, entitlements, and contacts - Supported for synced accounts
  • Viewing My Activities - Supported
  • Creating orders - Supported
  • Running queries - Supported
  • Viewing assets - Supported for synced assets
  • Viewing directions/maps - Not supported
  • Checking required parts - Supported
  • Checking steps and instructions for the job - Supported
  • Checking own inventory position - Supported
  • Running PDQs to find other activities - Not supported

Performing service

  • Changing activity statuses - Supported
  • Capturing time tracker - Supported
  • Capturing part tracker - Supported
  • Committing part tracker - Supported
  • Capturing expenses - Supported
  • Creating new service requests - Supported
  • Creating new activities - Supported
  • Adding attachments to a service request - Not supported
  • Viewing all parts in the application - Supported for synced parts

Closing a service call

  • Closing an activity - Supported
  • Invoking business rules/validation on activity closure - Partially supported; DVM/workflow/server-scripting validations will not work

One of the key considerations for the disconnected application is that only 50 MB of offline storage is available. To work within this limit, business administrators need to carefully analyze the data and set up data filters that determine which data is downloaded to the mobile device.

Siebel Mobile provides the ability to configure filters on various business entities so that only relevant data is downloaded for offline usage, for example:

  • Accounts from only certain countries
  • Activities in certain statuses
  • Products that can be used as spare parts
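The effect of such filters can be sketched as simple predicates applied to records before they are synced to the device. This is a hypothetical illustration only; in Siebel these filters are declarative administrative settings, and the field names below are invented.

```python
# Hypothetical sketch of offline sync filters; in Siebel these are
# configured declaratively by an admin, shown here as code for illustration.

ALLOWED_COUNTRIES = {"US"}                       # accounts of certain countries
ACTIVE_STATUSES = {"Scheduled", "In Progress"}   # activities in certain statuses

def account_filter(account):
    return account["country"] in ALLOWED_COUNTRIES

def activity_filter(activity):
    return activity["status"] in ACTIVE_STATUSES

def product_filter(product):
    # only products usable as spare parts
    return product["spare_part_flag"]

def select_for_download(records, predicate):
    """Keep only the records that pass the filter for offline download."""
    return [r for r in records if predicate(r)]

accounts = [{"id": "A1", "country": "US"}, {"id": "A2", "country": "DE"}]
# Only the US account is selected for the device under these filters.
assert [a["id"] for a in select_for_download(accounts, account_filter)] == ["A1"]
```

The design intent is the same as the admin setup described above: trim each entity down to the slice a given technician actually needs, so the total stays under the 50 MB storage cap.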

Field service is one of the toughest jobs in the industry with field engineers servicing complicated equipment, often under unrealistic deadlines and customer expectations. The Siebel Mobile Disconnected application empowers the field engineer by enabling them to perform their daily tasks when they are on the road even without network connectivity.

July 20, 2014

Mighty Oracle Fusion Transportation Intelligence

Retaining customer loyalty, meeting SLAs, and managing the increasing expectations of stakeholders within existing financial constraints are a few of the challenges businesses face. Logistics therefore plays an important role in meeting such needs and provides a competitive advantage. Transactional logistics data is no longer merely part of a repository; it has become an integral part of business strategy. If harnessed, it can provide the competitive edge businesses around the world are eyeing. With ever-growing data storage capacity per unit cost, businesses have huge amounts of such data. Utilized strategically, this data can lend intelligence to day-to-day business decisions and objective support to future strategies.

Oracle Fusion Transportation Intelligence (FTI) is one such innovative product that meets these business needs. It is a web-based solution that integrates Oracle products such as Oracle Database, Oracle Data Integrator, Oracle Business Intelligence Enterprise Edition, and Oracle Transportation Management (OTM). Transactional data residing in the OLTP system (OTM) is fetched into the FTI historical repository, which acts as a data warehouse and provides information that can support decision making at various levels. Organizations need to define and monitor various Key Performance Indicators (KPIs), or metrics, aligned with their strategy, to analyze their business health.
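As a toy illustration of such a KPI, an on-time delivery percentage could be computed over shipment records as below. The field names and data are invented for the sketch and are not OTM/FTI schema names; FTI delivers its KPIs prebuilt rather than as user code.

```python
# Illustrative on-time delivery KPI over shipment records.
# Field names ("promised", "delivered") are invented for this example.

def on_time_pct(shipments):
    """Percentage of shipments delivered on or before the promised date."""
    if not shipments:
        return 0.0
    on_time = sum(1 for s in shipments if s["delivered"] <= s["promised"])
    return 100.0 * on_time / len(shipments)

shipments = [
    {"id": 1, "promised": "2014-07-10", "delivered": "2014-07-09"},
    {"id": 2, "promised": "2014-07-12", "delivered": "2014-07-15"},
    {"id": 3, "promised": "2014-07-20", "delivered": "2014-07-20"},
]
# ISO-format date strings compare correctly as plain strings here.
kpi = on_time_pct(shipments)
```

A KPI like this, tracked against a target on a dashboard, is exactly the kind of metric the feature list below refers to.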


Fusion Transportation Intelligence Features:

  • Industry-specific dashboards
  • Over 60 prebuilt KPIs post OTM 6.0
  • Increased ad hoc querying capability
  • Interactive alerts
  • User-defined customizable dashboards
  • KPI targets
  • Performance indicators
  • Customized dashboards based on user role
  • Historical analysis available for decision points
  • Process management - ETL jobs from the OTM UI
  • Data management - data load status shown in the FTI dashboard
  • Analysis at week level - a lower level of granularity
  • Refnum modeling in FTI
  • Ranking of refnums - 3 refnums per object flow into FTI

User Friendliness:

  • Put reports in placeholders and, if needed, share them with other users by granting access
  • Create on-the-fly reports by simply dragging and dropping the required columns
  • Ad hoc querying - create business reports for ad-hoc requirements

Business Benefits:

  • Operations benchmarking and tracking
  • Defining business metrics
  • Business analysis spread across various periods
  • Various reports available on demand to assist in decision making
  • Revise and keep track of business goals
  • Get alerts based on business events
  • Target setting for the future
  • Tune business processes


Fusion Transportation Intelligence Architecture Details:

FTI works with historical data fetched from Oracle OLTP servers. In practice, with performance in view, the data source is generally kept as a ROD (Replication of Operational Database) or DR (Disaster Recovery) instance. Oracle Data Integrator fetches the data from these repository servers. The diagram below depicts the Oracle Data Integrator architecture:



In practice, we trigger the ETL (Extract, Transform & Load) process from the OTM UI. Once triggered, ODI fetches the data from the source repository, transforms it into denormalized form (an extended star schema), and finally pushes it into the historical database, commonly known as the FTI DB. OBIEE (Oracle Business Intelligence Enterprise Edition) uses the historical database for reporting. Finally, OBIEE is linked with the OTM/GTM products so that FTI can be accessed from a single UI window.
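The transform step can be pictured as splitting flat OLTP rows into dimension and fact tables. This is a toy sketch with invented columns; real ODI mappings into the FTI schema are far richer.

```python
# Toy sketch of an ETL transform into a star schema: flat OLTP shipment
# rows become one fact table plus small dimensions. Columns are invented.

def to_star_schema(oltp_rows):
    """Split flat rows into carrier/lane dimension lookups and a fact table."""
    carriers, lanes, facts = {}, {}, []
    for row in oltp_rows:
        # setdefault assigns the next sequential surrogate key on first sight
        c_key = carriers.setdefault(row["carrier"], len(carriers) + 1)
        l_key = lanes.setdefault((row["origin"], row["dest"]), len(lanes) + 1)
        facts.append({"carrier_key": c_key, "lane_key": l_key,
                      "cost": row["cost"]})
    return carriers, lanes, facts

rows = [
    {"carrier": "ACME", "origin": "NYC", "dest": "CHI", "cost": 900.0},
    {"carrier": "ACME", "origin": "NYC", "dest": "CHI", "cost": 950.0},
]
carriers, lanes, facts = to_star_schema(rows)
# Two facts share one carrier row and one lane row after the split.
assert len(carriers) == 1 and len(lanes) == 1 and len(facts) == 2
```

Aggregating the fact table by a dimension key is then what the OBIEE reporting layer does when it rolls costs up by carrier or lane.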


The diagram below summarizes the process flow:





Infosys Advantage:

Infosys brings its vast experience in consulting and system integration. Apart from the product features mentioned above and their implementation, Infosys brings tailored customizations that best suit business needs. Certain features our expertise brings that do not come out of the box:

  • What-if scenarios
  • Forecasting capabilities
  • Predictive transportation analytics and simulations based on historical data
  • Data modelling and wrangling
  • Vouchers and accrual process analytics



Fusion Transportation Intelligence brings together the best of the Oracle products, and its integrated architecture provides flexibility that can be moulded to suit existing business best practices. It is highly recommended for businesses that want to make the best of their existing logistics data.

December 27, 2012

Relevance of Data Governance in Big Data

Enterprises have always had more data than their infrastructure and processes could handle, let alone the capability to derive actionable intelligence from it for effective decision making. The same problem has now intensified several fold with larger volumes and a greater variety of data from social media and online channels; most critical is the "urgency" to handle such data.

Continue reading " Relevance of Data Governance in Big Data " »

September 25, 2012

Oracle E-Business suite on Exadata - A Reality

Guest post by
Vinayak Gangadhar Kurdekar, Technology Lead, Infosys


Exadata, when launched by Oracle a couple of years back, initially created a sensation in the market as a hardware platform engineered by Oracle in collaboration with HP for hosting large databases. Later, as time progressed and the technology matured, Exadata became synonymous with extreme performance and scalability for both OLTP and data warehouse applications. Both the Oracle developer and DBA communities had the impression that Exadata was developed mainly for hosting databases with large amounts of data, typically seen in data warehouse environments, and were not sure whether Oracle's E-Business Suite of applications could be hosted on Exadata. The answer is yes: currently, Oracle E-Business Suite release 11i with an 11gR2 database is certified on the Oracle Exadata V2 machine. Oracle is working on certifying E-Business Suite release 12 on Exadata, and that certification should be available in the near future.

Continue reading " Oracle E-Business suite on Exadata - A Reality " »

September 17, 2012

Oracle Fusion Procurement - Empowering Procurement Function (Part -1)

Guest post by
Sirish Newlay, Consultant, Infosys


The purchasing department in an enterprise has taken center stage in recent years. With more and more enterprises investing hugely to make sure they buy the best, at the best price, and from the best suppliers, it becomes immensely important for buyers to stay abreast of the latest trends in the procurement world.

Also, the race to remain on top has greatly impacted the way the procurement function aligns with other business functions in an enterprise, today and in the future. For example, looking ahead, with the expanding scope of procurement we see the importance of rethinking talent management; from a financial perspective, we see a high dependency on costing methodologies.
In this blog, I provide an overview of the "Oracle Fusion Procurement" application and briefly discuss recent procurement industry trends; in Part 2, I will provide our point of view on how Oracle Fusion Procurement and its capabilities cater to these trends.

Continue reading " Oracle Fusion Procurement - Empowering Procurement Function (Part -1) " »

July 3, 2012

CRM to CXM to C3E - Innovation or Evolution?

While CXM (Customer Experience Management) was floated as a concept almost a couple of years ago, it seems quite recent given that CRM (Customer Relationship Management) has been the mainstay of enterprises worldwide for almost two decades. Despite the fact that most organizations are yet to latch on to it, and that those who have are yet to mature the concept in on-the-ground implementations, the industry is already looking for something newer. The new acronym on the block is C3E (Cross-Channel Customer Experience), and at least for now it seems to satisfy everyone's appetite for innovation.

Continue reading " CRM to CXM to C3E - Innovation or Evolution? " »

November 28, 2011

Exalytics - BI and Big Data Tuned to Perfection

Analytics, as we know, is all about gaining insights from data to aid effective decision making. However, the vision of delivering fast, interactive, insightful analytics has eluded most large enterprises.

Continue reading " Exalytics - BI and Big Data Tuned to Perfection " »

November 18, 2011

Bridging Two Islands: A curious case of Oracle AIA

Guest post by
Vinod Kumar, Consultant, Infosys


The telecom sector has been chronically plagued by revenue-leakage problems and is often seen spending the maximum energy and budget on preventing them, in other words, on having an effective revenue assurance mechanism in place. The solutions around it make it one of the most published areas in the industry, yet clients, be they newcomers or old-timers, are often marred by this problem. It is not that there are no solutions; it is the understanding of the solution or product offering and its judicious implementation that makes a solution stand out.

Continue reading " Bridging Two Islands: A curious case of Oracle AIA " »

November 4, 2011

Why Oracle Fusion Procurement? How does it add value?

Guest post by
Suchitha Prabakaran, Consultant, Infosys


In the current global business environment, efficient procurement techniques, methodologies, and processes play a vital role in providing a competitive edge, both operationally and financially, for any organization irrespective of its size, nature of business, products, or area of operations. Corporates have now realized the importance of streamlining and strategizing procurement processes, as it makes a huge difference to the revenue and profitability of the organization.

Continue reading " Why Oracle Fusion Procurement? How does it add value? " »

October 4, 2011

Leveraging Fusion Business Process Model Methodology in Project Portfolio Management (PPM)

Guest post by
Sandeep Suresh Deshpande, Principal Consultant, Infosys


In today's world, business is exposed to a constantly changing environment in which organizations face challenging situations:

  • Information silos across the value chain
  • Parallel processing amongst distributed systems
  • Mismatched products to support complex end-to-end business processes
  • Complex and long-running projects
  • Real-time project consolidation and budgeting

Oracle has done extensive internal and external research, taking the best practices from Oracle E-Business Suite, Oracle PeopleSoft, Oracle JD Edwards, Oracle Siebel, and all its other acquisitions, to come up with a number of industry business process areas with multiple levels of decomposition. These world-class business processes can be leveraged to deliver in complex and challenging environments.

Continue reading " Leveraging Fusion Business Process Model Methodology in Project Portfolio Management (PPM) " »

September 27, 2011

Effective Sales Planning with Oracle Fusion Sales Planning Solution

Guest post by
Sushal Subashchandra Shetty, Consultant, Infosys


Are you not able to define effective territories for your sales organization?

Is your sales team swimming in the sea without targets?

Is your sales team cherry picking on prospects and you are not aware how to tackle it?

Are you losing track of your sales team's performance?

If the answer to any of the above questions is yes, then my two cents is that you ought to read further. The top problems one large manufacturing company was facing were loss of productivity due to redundant manual processes, lack of consistent and efficient reporting, and inaccurate commissioning.

If you are facing the same problems, then a sales planning solution based on Oracle Fusion Applications is the one-stop solution for you.

Continue reading " Effective Sales Planning with Oracle Fusion Sales Planning Solution " »

September 7, 2011

BACKUP and RECOVERY in EXADATA
Guest post by
Vinayak Gangadhar Kurdekar, Technology Lead, Infosys


Businesses today increasingly need a unified platform to host all their databases and applications in one place. To meet this challenge, Oracle has designed the Exadata server technology, which bundles database and applications together to provide high performance, high availability, scalability, and throughput. Mere implementation of the Exadata server and its components is not enough. Disasters can strike at any time. To ensure survivability and safeguard against major disasters, an effective backup and recovery solution is a must. The backup and recovery strategy needs to be not only precise but proven in all aspects. To achieve this, Oracle has come out with multiple backup and recovery solutions, designed in-house and by third parties:

  1. Disk based backup and recovery.
  2. Tape based backup and recovery.
  3. Backup and recovery using Oracle Data Guard.

Continue reading " BACKUP and RECOVERY in EXADATA " »

July 19, 2011

Centralized Procurement with Oracle Fusion


With businesses expanding across the globe and manufacturing organizations resorting to sub-contracting, there is a need for organizations to look at procurement beyond their current markets.  This introduces the need for a centralized procurement function for better purchasing efficiency, better control over organizational spend, and central, simpler management of supplier contracts.  Oracle Fusion Procurement introduces new and better features that help organizations better manage their procurement functions.  The following are the advantages of a centralized procurement function:

  • Better control over organization spend
  • Consolidated purchasing across business units
  • Leverage volume discounts by consolidated demand
  • Better supplier relationship management
  • Single point of contact in buying organization for supplier
  • Centralized contracts - easier implementation and better management
  • Consolidated measurement of supplier performance
  • Reduced overheads

Continue reading " Centralized Procurement with Oracle Fusion " »

May 11, 2011

Oracle Exadata - A platform for consolidation (Part-1)

Guest post by
Umesh Tanna, Senior Technology Architect, Oracle Practice, Enterprise Solutions, Infosys Technologies Ltd.


Consolidation is one of the many key reasons for deploying and using Exadata. Oracle Exadata is an appliance: hardware and software bundled and engineered to provide maximum performance. It is a purpose-built, integrated database machine. True, there are high-end systems best suited to be replaced by the Exadata platform considering their growth and performance needs. However, by design, Exadata offers a high-end configuration and specification that not every Oracle database application running in an IT department today needs. Hence, a solid business case can be built for those systems if some type of consolidation solution is considered, one that can simplify operations, increase availability and performance, and ultimately reduce total cost of ownership.

Continue reading " Oracle Exadata - A platform for consolidation (Part-1) " »

March 15, 2011

How to measure ROI for SOA projects? A Tollway Approach

Guest post by
Prasad Jayakumar, Technical Lead, Oracle Practice, Enterprise Solutions, Infosys Technologies Ltd.


Early in the morning I was hitting I-94W to meet my client in Milwaukee.  I wished I had my I-PASS; the toll varied from 25 cents to a little more than a dollar.  I was wondering, "Why doesn't the government build the infrastructure out of the tax collected and let us pass for free?"  Immediately I realized my ignorance.  My discussion topic with the client was "How to measure ROI for SOA projects?"  It was a meeting to see if we could present the benefits of SOA to the business in a tangible way.  I was wondering if the tollway could answer the question.

Continue reading " How to measure ROI for SOA projects? A Tollway Approach " »

March 10, 2011

Hybrid Cloud and Weblogic WAN Clustering

I was reading up on clouds and their drawbacks, and one issue that kept coming up was customers losing control of their applications and data to cloud service providers. One of the key themes of cloud computing is scaling the computing infrastructure on the go and paying only for what you use. So I was wondering if it is possible for customers to use the cloud only for scaling their existing computing infrastructure, without moving their entire application to the cloud. As I read more on this problem, I stumbled on a concept that was new to me: the "hybrid cloud".

Continue reading " Hybrid Cloud and Weblogic WAN Clustering " »

February 22, 2011

Oracle SOA 11g - One Less, One More

Guest post by
Prasad Jayakumar, Technical Lead, Oracle Practice, Enterprise Solutions, Infosys Technologies Ltd


One Less, One More™ -  "one less negative thought or action, and one more positive thought or action, and soon our lives will be less stressful, and if practiced as a community [...]  would move in the direction of our collective goals, dreams and desires" by Robbie Vorhaus,  Communications, crisis and reputation advisor.

Oracle acquired many best-of-breed products in the SOA landscape, maybe for architectural excellence, for a concept or idea, for market share, to fill a void, or based on some other strategy that only acquisition experts know.  What is interesting, though, is how Oracle managed to unify all the acquired products and paved a path to success, both as a corporation and for Oracle SOA practitioners.

Continue reading " Oracle SOA 11g - One Less, One More " »

February 18, 2011

Do we need SOA Governance ?

Guest post by
Prasad Jayakumar, Technical Lead, Oracle Practice, Enterprise Solutions, Infosys Technologies Ltd.


In a recent client discussion, the perennial question popped up: "Do we need SOA Governance?" A definite YES, I said without a second thought. If they had asked me "Do we need SOA?", the answer would have been based on whether the business is OK to WALK or wants to DRIVE.  Since the question was about governance, it is as simple as saying, "If you want to drive safely, you had better hold a valid driving license."

Continue reading " Do we need SOA Governance ? " »

January 31, 2011

Starting a SOA Project? Don't forget the Service Registry!!

In the SOA world, several pieces of the architecture are considered de facto for an SOA, e.g., Service Bus, BPEL Process Manager, Business Activity Monitoring, etc. But a key component that is added only as an afterthought in most implementations is the Service Registry.

Continue reading " Starting a SOA Project? Don't forget the Service Registry!! " »

September 21, 2010

Is Fusion CRM game for it?

The success rates of CRM implementations have traditionally been reported as low. Various studies attribute this to factors like user adoptability, system adaptability, high TCO, etc. Should I take the liberty of saying that marketing and sales users (next to warehouse users) would rank first in discarding a system that is not easy to use and does not fit their requirements? So, is Fusion ready for sales and marketing users? Let us have a look at the features marketing and sales users would go gaga over.

Continue reading " Is Fusion CRM game for it? " »

September 16, 2010

Oracle's promising BI Roadmap - Taking the joy ride together (Part 1) !!