Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.


January 31, 2018

Configuring Task Flow dependency and Teradata ODBC connection in Informatica cloud

In this blog, I am going to cover configuring a dependency between two task flows and configuring an ODBC connection for a Teradata relational database in Informatica Cloud.

This will help you run one task flow only after the task flow it depends on has completed successfully, and create an ODBC connection to a Teradata relational database that can be used in Mapping, Data Synchronization, and Data Replication jobs to load and retrieve data.

Configuring Task Flow dependency

In real-world scenarios, we often need task flows to run in a certain order; in other words, we should trigger a task flow (taskflow2) only after the completion of another task flow (taskflow1). This can be achieved with the runAJobCli package, which lets you trigger a task flow from the command prompt (an API call).

Steps to configure the package and trigger the task flow


Step 1: Ask Informatica cloud support to activate the Runajob package on the secure agent or ORG if it is not already available.

Step 2: Configure the file present in the below path:

Path: C:\Program Files\Informatica Cloud Secure Agent\downloads\package-runAJobCli.2\package\


Step 3: Give the user under which the Secure Agent is installed full-control access to the Informatica Cloud installation folder.

Step 4: Try to call a dummy task from the command prompt by navigating to the runajob package path.

Ex: C:\Program Files\Informatica Cloud Secure Agent\downloads\package-runAJobCli.2\package\

    cli.bat runAJobCli -n W_ETL_RUN_D_AMDOCS_End_MCT -t MTT

Step 5: If you get a "'java' is not recognized as an internal or external command" error, add the Java path to the PATH environment variable under 'My Computer' -> Advanced system settings.

Ex: Add C:\Program Files\Informatica Cloud Secure Agent\jre\bin (use the C: or D: drive as per your installation)

Step 6: Then try to call the same command from the post-processing command of the MCT. If you still get the Java error in the log, add the full Java path to the cli.bat file.

Example cli.bat (default):

@echo off
pushd %~dp0
"Informatica Cloud Secure Agent\jre\bin\java.exe" -cp .;runAJobCli.jar;lib\* com.informatica.saas.utilities.plugins.RunAJobPlugin %*
popd
@echo on

Example cli.bat with the full Java path added:

@echo off
pushd %~dp0
"D:\Apps\Informatica Cloud Secure Agent\jre\bin\java.exe" -cp .;runAJobCli.jar;lib\* com.informatica.saas.utilities.plugins.RunAJobPlugin %*
popd
@echo on


Configuring Teradata ODBC Connection

To establish a Teradata ODBC connection in Informatica Cloud, you would require the Teradata ODBC driver utilities, which can be downloaded from the Teradata website.

Once you have downloaded them, follow the below steps:

Step 1: Run the installer (.exe) from the Downloaded directory and Choose the Language and click on 'Next'.

Step 2: Installation Wizard will be opened. Click on 'Next'.


Step 3: Accept the License Agreement by clicking 'I accept the terms in the license agreement' and click on 'Next'.

Step 4: After selecting the destination folder, click on 'Next'.

Step 5: Check 'ODBC Driver for Teradata' and click on 'Install'.

Step 6: Click on 'Finish'.

Step 7: Run one more installer application file from the following path:


Step 8: Click on 'Next'.

Step 9: Accept the License Agreement by clicking 'I accept the terms in the license agreement' and click on 'Next'.

Step 10: Click on 'Finish'.

Step 11: Run one more installed application file from the following path

Ex: D:\TeradataODBC_windows_indep\TeraGSS-64bit


Click on 'Next'.

Step 12: Select 'Modify' and click on 'Next'.



Step 13: Select the required feature and click on 'Next'.

Step 14: Click on 'Install' to start installation.

Step 15: Under System DSN, set up the ODBC driver for the Teradata database by providing the user name and password, and click on 'OK'.

Step 16: Test the ODBC connection from the command prompt (a small verification script is sketched after these steps).

Step 17: Using the ODBC connection you've created, you can create a 'New connection' in Informatica Cloud.

Step 18: If the test connection fails, restart the Secure Agent and try again.
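A quick way to verify the new DSN outside Informatica is a small script using an ODBC client library. The sketch below uses pyodbc as an assumption (it is not part of the Teradata or Informatica installation); the DSN name, user, and password are placeholders.

import pyodbc

# Placeholder DSN and credentials - use the System DSN created in Step 15
conn = pyodbc.connect("DSN=TeradataDSN;UID=td_user;PWD=td_password")
cursor = conn.cursor()
cursor.execute("SELECT SESSION")   # simple Teradata query to confirm connectivity
print(cursor.fetchone())
conn.close()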

January 30, 2018

AWS Offerings for Dummies

What is Cloud Computing:

Cloud Computing is the on-demand delivery of IT resources and applications via the Internet with pay-as-you-go pricing.

Some of the popular cloud computing vendors are listed below:

Amazon: AWS, Microsoft: Azure, Oracle: Oracle Cloud, IBM: IBM Cloud


What is Amazon Web Services (AWS):

a) Provides on-demand cloud computing platforms.

b) Services are provided to big corporations, SMBs, public service agencies, etc.

Why AWS:

The below table quickly describes the need for moving toward the cloud.



Data Integration in Einstein Analytics

This blog explains various ways to integrate and load data into Einstein Analytics.

Data integration is one of the key aspects of any BI tool, and Einstein Analytics does exceedingly well in this department with seamless integration, as it doesn't require data to be in a particular format (like a star or snowflake schema).

Unlike other BI tools, Einstein Analytics stores the data itself in the cloud, not just the metadata. Hence, we need to refresh the data from time to time to ensure we are working with the latest data. Storing the data along with an inverted index (the way data is stored in Einstein Analytics) boosts the performance of the tool.

The options available to load data into Einstein Analytics are

  1. Salesforce: Salesforce objects can be directly loaded into Einstein Analytics using the dataset builder or a dataflow.
  2. CSV: CSV files can be directly uploaded into Einstein Analytics.
  3. Informatica Rev: Data from an external data source can be loaded into Einstein Analytics using Informatica Rev.
  4. ETL connector: ETL tools like Informatica Cloud, MuleSoft, Boomi, SnapLogic, etc. have connectors for Einstein Analytics using which you can load data into it.

Screenshot taken from Einstein Org

Data integration can be done at the following levels:

Dataflow: The dataflow view in Einstein Analytics has ETL transformations like Extract, Augment (Join), Append (Union), Slice, etc. One can leverage this option to integrate data coming from different data sources. The dataflow is also used for data refresh in Einstein Analytics.

Using ETL connectors: For example, with Informatica Cloud, the data can be integrated in Informatica Cloud and the integrated data can then be loaded into Einstein Analytics using the "Salesforce Analytics" connector.

Apart from this, you can also establish connections between datasets at the dashboard level using bindings, the Connect Data Sources option, and SAQL mode.
  • Binding: A change in one step/widget triggers a change in another step/widget in a dashboard; this is achieved using a binding when the steps/widgets are created from two different datasets.
  • Connect Data Sources: This option is available at the dashboard level and lets you connect two columns from two different datasets.
  • SAQL: You can write a SAQL query to fetch data from two different datasets at the dashboard level.

January 29, 2018

Excel to App - a journey beckoning finance world to the future of technology? A view from the Oracle Analytics Cloud Consultants' Lenses

Around last Thanksgiving there were two interesting articles in The Wall Street Journal, both concerning the finance world. One mentioned the diktat given out by CFOs across companies to get rid of spreadsheet-based finance analysis and reporting within their finance organizations; the other carried the finance analysts' and planners' rebuttal that Microsoft Excel continues to be their one powerful tool for analyzing and reporting financial numbers.

For an Oracle BI-EPM consultant like me, implementing Essbase, Planning, and other technology solutions predominantly for the finance organizations of my clients, both articles throw an interesting perspective on the transition that is imminent: from conservative, macro-based analysis to futuristic, app-based analysis. I wanted to write about how organizations can draw a balancing line with Oracle's latest offering in the cloud, so that none of the goodies of spreadsheets are lost while the organization still plans for a nimble and efficient solution that brings value to the CFO's table.

In the first article, "Stop Using Excel, Finance Chiefs Tell Staff", Adobe Inc.'s finance chief Mark Garrett says that it takes days for his team to put together the job fulfilment report across the organization and to get the numbers to analyze the impact of salary on budgeting. Garrett wants his finance staff to focus more on the analysis than on spending time getting that data from disparate systems into spreadsheets.

As a direct opposite to this view, the second article carried the view of finance professionals, "You'll Have to Pry Excel Out of Their Cold, Dead Hands". It shows how comfortable most finance professionals are maintaining and analyzing huge amounts of data, and analyzing data for insights, in cumbersome ways without utilizing the features that modern technology solutions provide. The article opines that finance analysts spend a good amount of time developing a Point of View (PoV) from the spreadsheet, carry a sense of pride around it, and take ownership of the analysis.

There is technology supporting Spreadsheet

In FP&A systems specifically, Oracle Essbase has gained a large user base due to its ability to combine the best of both worlds: executing complex calculations using a hierarchical data model, and showcasing specific Points of View (PoVs) through the Excel add-in, Smart View. This tool puts so much flexibility in the hands of end users that they need not worry about data or its integrity, and it gives them ample room to perform the business analyses their organizations need.

Excel is my favorite 

Despite the fact that a small formula error could lead to large differences in the values reported through spreadsheets, Excel remains the favorite choice of analysts. One reason behind this strong conviction, I think, is that all the business logic and calculations are visible to the analysts right in front of them, so there are no surprises where logic was applied but the value came out wrong. In a system, by contrast, there is a set standard for data entry, business logic configuration, and expected output: if junk is fed to the system, junk comes out. This encapsulation of backend data processing becomes a big hindrance to financial analysts embracing such a system.

Oracle Analytics Cloud (OAC)

Essbase in the cloud is perfectly suited to give the best of both worlds in terms of calculation performance and flexibility for end users, while letting the actual end users and consultants define business calculations and dimensionality. It puts power in the hands of the right people. And it does not stop there: OAC aligns perfectly with the technological future state that top-level executives across organizations are looking forward to, a tech-savvy state where finance data is available along with insights and commentary at the "Hi" of a voice command, seamlessly across different types of corporate and personal devices, for financial analysts of the millennial generation.

Transformation leads to Transition

Oracle Analytics Cloud (OAC), in my opinion, will emerge as a fusion arena that brings the solid features of a PaaS offering together with SaaS features in the FP&A domain. In relation to the existing cloud applications for financial planning and consolidation (EPBCS and FCCS), OAC offers a good packaging option along with the PaaS advantages of performance and scalability, serving not only the finance community but also corporate reporting needs across the organization through its DVCS, Data Lake, and BI packages combined with the power-punching, number-crunching Essbase.

What's ahead?

As the transformation progresses from on-premise to cloud, the demand for app-based, real-time insights with commentary can be met through Oracle mobile applications such as Synopsis, Oracle Mobile HD, and the uber-cool Oracle Day by Day, which are sure to transform the consumption of data from spreadsheets to insights on-the-go through futuristic finance apps.

January 23, 2018

Oracle Cloud Manufacturing integration with Project Costing.

Oracle Cloud is buzzing nowadays in the ERP market, and most organizations are looking at cloud transformation. However, as Oracle Cloud is still evolving, a few functions and integrations are missing and have not yet been released with the latest Oracle Cloud R13 release. One of the major Project Portfolio Management sub-modules missing in Oracle Cloud R13 is Project Manufacturing.

In Oracle EBS, Project Manufacturing plays a crucial role in integrating the project manufacturing process with Project Costing. Project Manufacturing helps account for and integrate project-related manufacturing costs into Project Costing, giving a complete picture of project cost across the sub-ledgers, which helps in better decision making.

In most project-based manufacturing organizations, Project Manufacturing is a key sub-module without which it is not possible to complete the project manufacturing business process cycle. As Project Manufacturing has not yet been released in Oracle Cloud, a gap exists between the manufacturing and project costing processes: there is no integration to import project-related manufacturing cost from SCM Cloud into cloud Project Costing. This has become an obstacle for project manufacturing organizations moving to Oracle Cloud.

However, there are some workarounds available to bridge the gap between the manufacturing and project costing processes, which we explore below.


1.    Project Transaction Type:

Oracle Cloud allows you to create 'Miscellaneous Issue to Project' and 'Miscellaneous Receipt from Project' transactions. These are standard processes provided in Oracle Cloud even though Project Manufacturing is not available.

If the cost of an inventory item purchase needs to be transferred to a project, the below process can be followed.

1. Create an inventory PO

2. Receive the PO

3. Create a miscellaneous issue to transfer the inventory PO cost to the project

4. Run cost accounting

5. Transfer costs to the project

Similarly, you can perform a miscellaneous issue to project to transfer costs to the project for other business processes.

This workaround suits scenarios with a simple project manufacturing business process and a small number of project manufacturing transactions.


2.   File Based Data Import (FBDI) :

In the case of a complex project manufacturing business process where a large number of project-related manufacturing transactions are generated, it is tedious to keep track of them and perform manual 'Misc Issue to Project' transactions to transfer cost to the project. In this case, FBDI is the alternate option.

The FBDI template feature helps import bulk data from third-party systems or other Oracle ERP applications, or create new transaction data in Oracle Cloud. The FBDI template 'Import Project Inventory Costs' imports inventory cost transactions from third-party applications or creates new project inventory cost transactions.

The following process can be used to upload the manufacturing data using the FBDI template:

1. Explore a DFF to collect project details on the project-related manufacturing transactions.

2. Create an OTBI report to extract all the eligible project-related costed manufacturing transactions that have not already been imported to the project. Here the 'Original Transaction Reference' field from the FBDI template can be utilized: it can be linked to the inventory transaction ID to avoid duplication of records.

3. Upload the project manufacturing transactions to the FBDI template 'Import Project Inventory Costs'.

4. Import the FBDI file using the following scheduled processes:

a. Load Interface File for Import

b. Import Costs



This solution can be automated using web services, as sketched below.
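As an illustration, the upload and import in step 4 could be scripted against Oracle's ERP integration web service. The sketch below is an assumption-heavy outline in Python: the REST resource path, operation name, payload field names, and the ESS job name are placeholders to be verified against the ERP Integration Service documentation for your release.

import base64
import requests

POD = "https://your-fusion-pod.oraclecloud.com"        # placeholder pod URL
AUTH = ("integration_user", "password")                 # placeholder credentials

# FBDI zip produced from the 'Import Project Inventory Costs' template
with open("ProjectInventoryCosts.zip", "rb") as f:
    content = base64.b64encode(f.read()).decode()

payload = {
    "OperationName": "importBulkData",                  # load the file and submit the import job
    "DocumentContent": content,
    "ContentType": "zip",
    "FileName": "ProjectInventoryCosts.zip",
    "JobName": "<<ESS job path for Import Costs>>",     # placeholder - look up the exact job name
    "ParameterList": ""
}

r = requests.post(POD + "/fscmRestApi/resources/latest/erpintegrations",
                  json=payload, auth=AUTH)
print(r.status_code, r.text)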


Oracle Cloud SCM to Project Costing integration provides more clarity on project manufacturing cost. It helps bring in the project manufacturing cost, giving a complete picture of project cost and enabling better cost tracking and management.


January 22, 2018

Vroom Vroom... with the Infosys Automotive Solution

The automotive industry has largely been ahead of the innovation curve, bringing more technology into the vehicle to meet the needs of the market. But all this while, automotive companies have been challenged by their own archaic systems. Good customer experience does not just mean good client-facing applications; the entire supply chain has to be customer oriented, and each of its elements needs to be integrated to get the part or vehicle to the right place at the right time.

At the risk of sounding cliché, an automotive supply chain has its own complexities, which are sometimes not intuitive to anyone who does not live and breathe this industry. This is where the Infosys Automotive Solution, crafted and perfected over the years, caters to such specific supply chain challenges.

1. Supersessions: This is where the rubber hits the road. Almost every leading ERP product in the market has functionality to define supersessions, but is it integrated into the entire process? The answer will usually be "No". The complexity does not end with ensuring we always sell the oldest part in the supersession chain; we must also buy the latest part in the chain, and ensure the end-of-life and forecasting processes for the product chain are tied together. Even from a pricing perspective, how is the solution going to align prices along the chain or create incentives for driving buying behavior from dealers?

2. Referrals: Referral is a concept beyond promising. How does one ensure we refer to the next nearest warehouse to meet the demand, so that customer experience does not take a hit? While doing this, how do we keep the logistics cost minimal? How do we ensure we follow the milk-run routes or do rate shopping in real time? How do we ensure routes are combined together? Traditionally these problems are solved through transport integrations, but many have solved them too far downstream.

3. Fair share: When we are in a back-order situation across the entire network and there are continuing supply constraints, how do we ensure that incoming supplies are transferred and fair-shared across all distribution centers? Should it be based on FIFO, customer priority, etc.? These are problems that applications have continued to ignore, dismissing them as execution problems.

4. Slotting: Warehouse space is real estate, so how do we ensure that the fastest-moving goods are always picked fastest? Also, will the fastest always remain fastest, or will there be seasonality and trends which we have to cater to? Slotting ensures this remains a continuous, proactive process.

5. Dealer incentives: This is an important part of the supply chain, and often ignored. Supply chains are like humans: unless we build in incentives, we won't be able to drive the required behavior from the supply chain constituents. The big question is what we should stock in dealer inventories, which are client facing, versus what we stock at mid-level warehouses and central warehouses. At the end of the day, the inventory budget and customer service levels will drive the decisions, but a dealer would only be concerned about their own profitability.

While we covered some of the nuances of the automotive spare parts supply chain, there are many more such niche challenges which are unique and have been built into the Infosys Automotive Supply Chain Solution. The solution not only covers the spare parts supply chain but also caters to the vehicle business, and we have a solution flavor catering to Tier 1 suppliers as well. To know more, reach out to us at the Oracle Modern Supply Chain event at San Jose (@OracleMSCE, @Infy) from 29-31 January 2018.

January 19, 2018

PCMCS: What should you look at when you decide to leave the ground and fly with cloud?


(This is part 1 of a 3-part blog series providing details about the PCMCS (Profitability and Cost Management Cloud Service) application, its available features, and its similarities to and differences from on-premise HPCM (Hyperion Profitability and Cost Management).)

In an era of increasing competition and shrinking margins, companies are looking to analyze their cost and profitability in detail to find the problem areas and resolve them. So this formerly super-niche area of profitability and cost management is seeing more client footfall as it moves to the cloud. A profitability application answers many seemingly simple but vital questions for an organization, which help it grow much faster than the competition, such as:

  1. What are the cost bottlenecks for your organization?
  2. Which operations / products / departments are more profitable or loss making, and why?
  3. What are the details of the organization's indirect costs and their impact? This in turn points to actions that could reduce cost and increase profit margins.
  4. What worked well for customers in prior periods, and can that be extended to other regions / products / departments, and so on?

Great, so you have decided to get answers to the above questions. Next comes another question: should you go ahead with on-premise Hyperion Profitability and Cost Management (HPCM) or the Profitability and Cost Management Cloud Service (PCMCS)?

Answering this requires you to understand the areas of difference, which are not limited to license cost but also span available modules, functionality, reporting, infrastructure, data integrations, automation, and many other areas.

In part 1 of this blog we look at two major areas, i.e. the HPCM and PCMCS architectures:

  1. What is included and what's not?
  2. Modules that are part of HPCM and what's available in PCMCS?

HPCM Architecture:

PCMCS Architecture

Components that are part of the HPCM license:

  • Essbase BSO (Standard Profitability)
  • Essbase ASO (Standard Profitability, Management Ledger)
  • Relational database (Detailed Profitability)
  • EAS Console
  • Performance Management Architect
  • Smart View
  • Hyperion Financial Reporting Studio

Components that are part of the PCMCS license:

  • Essbase ASO
  • Smart View
  • Hyperion Financial Reporting Studio
  • EPM Automate
  • Data Management


An HPCM customer has the option to choose from three modules: Standard Profitability, Detailed Profitability, and Management Ledger.

Standard Profitability

  • To be used in case customers have highly complex allocation logic with data moving across many dimensions before being finally allocated.

  • Indirect costs are a big part of the overall cost and need many drivers for the final allocations.

  • This module consists of a calculation database (BSO) and a reporting database (ASO).

Detailed Profitability

  • For cases where allocations are straightforward but data and metadata volumes are very high. It provides a single-step allocation for analyzing profitability.

  • Data is stored in RDBMS. 

Management ledger Profitability

  • The Management Ledger module combines the features available in Standard and Detailed Profitability, providing a model that allows for high metadata granularity and medium complexity.

  • Data storage is in Essbase ASO.

In PCMCS only one module is available, i.e. Management Ledger.

PCMCS Module

  • In the current version of PCMCS only the Management Ledger module has been provided. This allows for high data and metadata granularity as well as allocations via rules and rule sets.

  • Management Ledger applications are designed for business users with deep knowledge of the allocation process and limited scripting knowledge.

January 17, 2018

Untangle spaghetti Model via Order Management Cloud

There are many manufacturing facilities, multiple retail units, and different finance and procurement centres in different countries, each of these units using myriad custom applications for its supply chain, and each application talking to every other application. This is the (in)famous spaghetti model, where the logic on which applications must communicate is hard coded within each application and is not configurable. If this sounds familiar, then please read on.

At inception, organizations choose one or a few applications that suit most of their needs. But as the organization expands, and with mergers and acquisitions, each acquired organization brings its own home-grown applications. By the time the organization has matured into a conglomerate, the IT landscape is often a spaghetti of applications.

Picture 1 - Current IT landscape - Spaghetti model with point to point interfaces

The resolution to this situation comes in the form of Order Management Cloud (OMC). The functionality called 'Distributed Order Orchestration' in Order Management Cloud helps in end-to-end integration between order entry and fulfilment applications. Below are a few key features of OMC.

Interfacing the sales orders: Orders are captured via multiple retail channels: in-store, call centre, ecommerce web site, by an engineer during after-sales service, mobile application, internal ordering between different entities of the business, etc. These orders can all be routed to OMC and created as sales orders by invoking the seeded order creation web service. The incoming order payload can have different fields populated by the order entry system, but as long as the mandatory values are present, a sales order can be easily created.
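For illustration, the call from an order capture channel might look like the hedged Python sketch below, which posts a minimal payload to a sales order REST resource rather than the seeded SOAP service mentioned above; the resource path, field names, and values are placeholders to be checked against the Oracle Order Management documentation for your release.

import requests

POD = "https://your-fusion-pod.oraclecloud.com"        # placeholder pod URL
AUTH = ("integration_user", "password")                 # placeholder credentials

# Minimal, hypothetical order payload from an ecommerce channel
order = {
    "SourceTransactionNumber": "ECOM-1001",
    "SourceTransactionSystem": "ECOM_STORE",
    "BusinessUnitName": "Retail BU",
    "BuyingPartyName": "Sample Customer",
    "lines": [{
        "SourceTransactionLineNumber": "1",
        "ProductNumber": "FRAME-BLK-52",
        "OrderedQuantity": 1,
        "OrderedUOM": "Each"
    }]
}

r = requests.post(POD + "/fscmRestApi/resources/latest/salesOrdersForOrderHub",
                  json=order, auth=AUTH)
print(r.status_code)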

Enriching the sales orders: The SO so created may need different warehouses where it is fulfilled, a different booking business unit based on the geography of the customer, different products added to the SO based on incoming attributes, a different shipment method or priority, etc. Any transformation on the SO is possible via the pre-, product, and post-transformation rules. To the delight of the IT team, these rules can be built in a visual builder, making them easy to maintain.

Fulfilment activities made easy: These enriched sales orders are now ready for fulfilment via OMC itself, or can be interfaced to different legacy applications for different tasks. For example, a manufacturing activity can be fulfilled and interfaced to an MES application, while a pick-and-ship can be routed to a WMS application. The invoicing can happen in a completely different finance application. All this is possible by configuring the external routing rules and web service connectors for these applications. OMC will create a payload of the SO, publish it to these connectors, record the acknowledgement, and also record the fulfilment of the tasks in the legacy applications.

Provide complete visibility to the customer: As a customer may be curious to know the details of his or her order, OMC can be configured to send a status back at specific intervals, for example when the SO is created in OMC, manufacturing is complete, the SO is picked, the SO is shipped, etc. From an IT point of view, this is (again, as you guessed) configurable: a web service connector can be configured for each order entry application and OMC will fire the status message to these connectors.

The below diagram explains the order orchestration process flow.


Picture 2 - How Order Orchestration works in Order Management Cloud

Varied business processes: The business process may include progressing the sales order through a series of automated and manual steps. For example, the SO has to be automatically reserved, while the customer service team needs to check and update the SO with the customer before the item can be shipped out. Such different processes can be configured via order orchestration in OMC: the SO will be automatically reserved, and then wait for user input once the call to the customer is made outside the system.

Changing customer needs: In this competitive world, being flexible to changing customer needs is paramount, but at the same time the business must remain cost effective. Order Management Cloud provides functionality to control customer changes, cost each change, and react to each of these changes in a way suited to the business. The change order functionality can be easily leveraged.

Picture 3 - Order orchestration via Order management Cloud

Gone are the days when an IT application was just a transaction recording system. The IT application is one of the main enablers and enhancers for each business. Order Management, being the revenue-making and customer-facing module, has to be truly flexible to ensure that the sales team can be agile and proactive. So untangle the spaghetti model, route all orders to OMC, and drive the fulfilment via simple transformation rules.

Order Management Cloud has been implemented as the order routing application in an optical retail chain operating globally, offering optician services along with eyeglasses, contact lenses, and hearing aids. There are 8000+ stores ordering items via 15+ retail applications, and these orders are fulfilled via 10+ different specialised custom applications. With order line volumes crossing 1 million a month, there is no room for error. While the implementation is still underway, benefits are already being reaped by bringing all the routing logic centrally into Order Management Cloud.


Sathya Narayanan.S

Lead Consultant

Infosys Limited

Blockchain in Supply chain & Logistics

In the past few years, a lot of technologies such as analytics, IoT, digital, cloud, mobility, AI, and blockchain have made life very interesting; we hear about them, and about their potential, almost on a daily basis. People believe that the future belongs to these technologies and that they have the power to completely change business dynamics worldwide. It is because of this that organizations, small or large, have started adopting these technologies, and their investments have increased sharply in the past few years.

One technology which stands out is blockchain. Why? Because while the others are still trying to make an impact, blockchain has already taken the lead by showcasing its power through Bitcoin, blockchain's first successful product.

So what is blockchain? Well, it is an unalterable ledger, shared cryptographically, which is used to record transaction history. Its three pillars are trust, accountability, and transparency.

What is it not? It is not centralized; thus, no single body or party owns it.

Beyond Bitcoin to Business Blockchain:

We all know that today's supply chains are extremely complex in nature and involve multiple players at various stages. Present-day supply chains face various issues: lack of transparency, enormous amounts of paperwork, lack of interoperability, and very little knowledge about a product's lifecycle or its transportation history. All these loopholes severely impact the cost, speed, and quality of delivered products.

We can save billions of dollars annually by reducing delays and keeping a check on fraud; according to a WTO report, worldwide GDP can increase by almost 5% and total trade volume by 15% if we can reduce barriers within the international supply chain.

If we look at businesses today, almost every business, spanning producers, finance, warehouse management, transportation, regulatory functions, etc., uses various ERP systems to maintain records. People differ in their opinions about the current state, and this leads to disagreements. Not just that, this is an expensive and highly vulnerable way of doing business: anybody can alter any record. Here, blockchain comes to help.


Some use cases of blockchain in supply chain and logistics:


Dry-aged beef is a good example. Customers have started demanding organic and local products with a clear origin. They not only want to eat fresh but also want to make sure that the product has the origin claimed by the beef-selling company. How can retailers deal with this?

They can provide product information through apps. Customers can scan a QR code with their smartphones and themselves validate the authenticity of the beef in terms of origin, quality, etc. If any discrepancy is found, they can raise their voice. Historical data related to origin (such as feed or breeding), the location of the farm and of the beef across the entire supply chain, time taken in transport, expiry dates, etc. can be easily accessed through a dedicated blockchain database. This is from the viewpoint of customers.

Let's see how blockchain can help an organization. For example, the global retailer Walmart has started using blockchain to track sales of pork meat in China. The system helps it determine the origin of the meat, processing and storage related information, and the sell-by date. Also, in the case of product recalls, Walmart can understand which batch or batches were returned and who bought them. This way we end up with dynamic demand chains in place of rigid supply chains, resulting in more efficient resource use for all. Isn't that interesting?

Here's an interesting fact about logistics industry worldwide:

market scenario.PNG

The above data, on the one hand, clearly shows the vastness of the shipping and logistics industry worldwide, but at the same time it also shows the weaknesses of the industry and its inability to deal with such a massive rise. In the years to come, the logistics market is definitely going to grow, but so will the leakages in the system, which will ultimately lead to losses. Freight brokers have a lot of control over the industry at present: they decide the load, tag on a markup, and sell it to carriers. This leads to an increase in cost to carriers and ultimately a much higher cost to customers.

Blockchain can help by bringing transparency and visibility, allowing shippers to communicate important information such as geo-waypoints, loads, and basic compliance information with carriers. Once a shipment is recorded in the blockchain, it can no longer be changed, meaning nobody can alter the records or dispute the validity of the transaction.
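To make the immutability point concrete, here is a toy sketch (my own illustration, not from any logistics platform) of a hash-linked record chain: because each entry stores the hash of the previous one, silently editing an earlier shipment record breaks every link that follows. The shipment descriptions are made-up examples.

import hashlib
import json

def block(record, prev_hash):
    # Each block's hash covers both its record and the previous block's hash
    body = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return {"record": record, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

chain = [block("Shipment SH-001 loaded at origin warehouse", "0" * 64)]
chain.append(block("Shipment SH-001 departed port", chain[-1]["hash"]))
chain.append(block("Shipment SH-001 delivered", chain[-1]["hash"]))

# Tampering with the first record invalidates the link held by the next block
chain[0]["record"] = "Shipment SH-001 loaded at a different warehouse"
recomputed = block(chain[0]["record"], chain[0]["prev"])["hash"]
print("Chain still valid?", recomputed == chain[1]["prev"])   # -> False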

Maersk, the world's largest container-shipping company, is a good example of this. It is working in collaboration with IBM to build an extremely robust and efficient blockchain network so as to make its logistics arm much more efficient and powerful, with fewer leakages in between. Some time back, UPS announced that it was joining BiTA (Blockchain in Transport Alliance), which works on developing blockchain standards for the freight and logistics industry.

Blockchain can also be used to maintain the history records of trucks, such as performance records and maintenance records. We all know that once a truck is out in the market for sale, buyers wish to know the history of the vehicle. Blockchain, with its "unbroken chain of trust", can help.

Drivers can have an important role in blockchain too. They can help by adding their own data, mostly automatically: on- and off-duty status, road conditions, condition of the vehicle and load, etc. This can help them in dealing with disputes with either the shippers or the carriers over events such as when and where a particular accident happened or goods got damaged.

"Capacity monitoring" is yet another area where this can be useful. We all know that cargo volume is one of the main factors which determines the shipping freight charges. Here, blockchain, in association with IoT (Internet of Things), can help. IoT sensors can help in determining the amount of space used by a particular party and can forward this information to the blockchain network. This will help in enabling self-executing payments against the space used by the freight.

Factoring can also be dealt with using blockchain. With the help of smart contracts, blockchain can make factoring less necessary because it will make the complete system of payments towards transactions automatic.

Such cases clearly show that a blockchain can help all parties involved in a shipment to:

  • Reduce or eliminate frauds and errors
  • Improve inventory management
  • Minimize carrier costs
  • Reduce delays from paperwork
  • Reduce waste.
  • Identify issues faster

Keys to integrating successfully

We feel a three-step approach can be adopted to successfully incorporate blockchain into supply chains.

First, the process should begin with an internal blockchain setup, which gives the organisation sufficient time to get used to the technology while making sure that data is available and consistency is maintained throughout.

After this, the blockchain should be extended to other players such as 3PLs and direct suppliers, which helps in robust data management and data exchange.

At last, all the players along the supply chain, including end customers, should be integrated into the blockchain.

If utilized to its full potential, blockchain can improve the customer experience, drive value end-to-end, and eradicate inefficiencies so as to lower costs.

Blockchain Integration into the Supply Chain-A Three-Step Approach




OTM and IoT Fleet Monitoring


Oracle Transportation Management (OTM) is known for logistics planning and execution across the supply chain. One of the major challenges for shippers and logistics service providers is tracking shipments on the move and identifying any disruptions during the course of transportation.

Oracle IoT Fleet Monitoring provides real-time transportation visibility, which helps in the following ways:

  • Provides a single window to view real-time shipment status.

  • The map view displays the exact location of the vehicle, which helps in deriving the ETA of the shipment.

  • Define geo-fencing rules for route deviations and unauthorized stops.

  • Define rules for incidents and warnings.

  • Shipments planned in OTM can be automatically pushed to IoT Fleet Monitoring.

OTM -IoT Fleet Monitoring Integration

IoT Fleet Monitoring provides REST APIs that allow seamless integration with OTM. Based on events in OTM, shipments can be exported into IoT FM automatically, and shipment location and arrival/departure times can be published from IoT FM back to OTM.
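As a rough illustration of that integration pattern, the hedged Python sketch below pushes a planned shipment to an IoT Fleet Monitoring REST endpoint; the host, resource path, and payload fields are placeholders and should be taken from the IoT Fleet Monitoring REST API documentation.

import requests

IOT_HOST = "https://your-iot-instance.oraclecloud.com"   # placeholder instance URL
AUTH = ("iot_user", "password")                            # placeholder credentials

# Hypothetical shipment payload built from an OTM planned shipment
shipment = {
    "name": "SHP-100234",
    "source": "Origin DC",
    "destination": "Destination Store 12",
    "plannedDeparture": "2018-01-20T06:00:00Z",
    "plannedArrival": "2018-01-20T18:00:00Z"
}

# Placeholder path - check the Fleet Monitoring REST API reference for the actual resource
r = requests.post(IOT_HOST + "/fleetMonitoring/clientapi/v2/shipments",
                  json=shipment, auth=AUTH)
print(r.status_code)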



Benefits of IoT Fleet Monitoring for the different stakeholders:


  • Notify in advance for any potential delays.

  • Integrate with 3rd party TMS solutions.

  • Reduce the time spent on track and trace.

  • Get notified on arrival/departure of shipments.

  • Integrate with TMS to have a single view.


  • Notify in advance for any potential delays.

  • Shipments transported under specified conditions.

  • Reduce the time spent on track and trace.

  • Integrate with TMS to have a single view.

  • Get notified when shipment near consignee.


  • Notify when driver deviates from assigned route

  • Share real time locations with customers.

  • Identify Fuel Spending patterns and the reasons for extra fuel spent.

  • Integrate with TMS to have a single view.

  • Identify vehicle issues if any.

By Ravi Kiran Gurujala, OTM CoE.

Oracle Supply Chain Planning Cloud: Product Evaluation and Co-Existence Implementation Scenarios

Supply Chain Planning Cloud R13 promises to offer an end-to-end planning solution; however, customers are still struggling with co-existence options with their on-premise execution systems and with nuances of planning functionality that are critical to them. Through this blog, I will share my experience and provide key inputs on the criteria to consider to arrive at a decision between on-premise, cloud, or a co-existence model for demand planning. At the outset, there are two major product evaluation considerations:

  • Arrive at a list of key product evaluation criteria important to your customer, based on their weightage
  • Evaluate the key functionality of the cloud product in the order of importance to the customer and assist in making the product decision

Key Product Evaluation Criteria

While products can be compared on multiple factors, it is of paramount importance to first shortlist the key criteria in order of business importance. Following is some of the comparative analysis I have considered in cloud product evaluations with customers embarking on Supply Chain Planning Cloud:

    • Product fitment: Scoring-based evaluation of product features, with weightages based on the fitment criteria
    • Technology consideration: The underlying technology of the product and its fitment into your customer's application ecosystem
    • Scalability: The scalability of the product, considering the ability to accommodate your customer's unique processes, to extend functionality, and to integrate with other disparate systems. For example, high-tech customers may want to consider binning, operational yields, unique customer allocations, and custom ATP
    • Customer base: The existing customer base of the product and how well the vendor is catering to those customers
    • ERP ecosystem: The ecosystem of products surrounding the core ERP product, and how mature these surrounding systems are
    • Integration with 3rd-party systems: Ease of integration with 3rd-party systems for supply chain information such as forecasts, supply commits, inventory statements, and other collaborative information with partners
    • Customization capabilities: Ease of customization to embed additional business logic such as planning allocation, aggregation and disaggregation, ATP allocation, etc.
    • Implementation effort: One-time overall effort needed to implement the product, considering product stability, the vendor's responsiveness to fix issues, and the future roadmap
    • Upgrades: Future enhancements and the customer involvement in testing and deployment
    • Consulting community: Availability of skilled resources in the market, which affects your customer's cost for both implementation and support

Product evaluation through fitment comparison

Product features undoubtedly form the most critical basis for determining how well the product can support the business processes that matter most to the business. No product is a 100% fit for every business, so it is important to weigh features based on criticality to the business and reach an inflection point between introducing standardization and the risk of losing flexibility. The product fitment approach recommended is to list the important features in each functional area and rate them against pre-set weightage criteria of fitment importance (a simple weighted-scoring sketch follows the feature list below).

  • High - There should be a direct fitment of the functional requirement out of the box, without any customization
  • Medium - Not necessarily a direct fitment of the functional requirement, but achievable with minor tweaks
  • Low - There is no direct functionality for the requirement, and a major development effort will be needed

Following are some examples of features that can be considered when evaluating Demantra on-premise versus Demand Planning Cloud and Planning Central:

  • Data Collection From non-Cloud ERP
  • Import and export of custom forecast streams
  • Accuracy of Statistical Forecast
  • Flexibility of forecast tuning
  • Configuration of custom forecast Hierarchy
  • Forecast collaboration and Approval
  • Forecast export to supply plan
  • Forecast accuracy measures and reporting
  • Exception Management
  • Causal Factor maintenance
  • Custom logic for aggregation/disaggregation
  • Manage multiple demand signals
  • Simulation capabilities
  • New product introduction
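To make the scoring approach concrete, here is a small illustrative Python sketch of a weighted fitment score; the weights and ratings below are made-up examples, while the feature names are taken from the list above.

FIT_SCORE = {"High": 3, "Medium": 2, "Low": 1}

# (feature, business weight, fitment rating) - example values only
features = [
    ("Accuracy of Statistical Forecast", 0.30, "High"),
    ("Configuration of custom forecast Hierarchy", 0.25, "Medium"),
    ("Simulation capabilities", 0.20, "Low"),
    ("Forecast collaboration and Approval", 0.25, "High"),
]

score = sum(weight * FIT_SCORE[rating] for _, weight, rating in features)
max_score = sum(weight * FIT_SCORE["High"] for _, weight, _ in features)
print("Product fitment: %.0f%%" % (100.0 * score / max_score))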

In my next blog, I will provide a perspective on supply planning and scheduling capabilities.

January 16, 2018

Automating file based data loads for Hyperion Cloud applications

Automation is the need of the hour! The time has come when everyone wants minimal human intervention in any process or task, especially when it is a periodic activity (daily, weekly, fortnightly, monthly, etc.), and data analytics is no different. Clients look for maximum automation, especially in data preparation processes, so that they can spend more time on planning and strategizing.

Data Management is part of every Hyperion cloud application, viz. PBCS, EPBCS, FCCS, PCMCS, ARCS, and TRCS. In the simplest terms, the role of Data Management (referred to as DM henceforth) is to bring together data from various ERP source systems, transform it, and load it to the cloud applications. It is possible to extract data from any on-premise or cloud source system either through direct integration or through flat files. Oracle provides direct integration of Hyperion cloud applications with Oracle Financials Cloud, Budgetary Control, HCM Cloud, and NetSuite.

However, to extract data from other source systems or third-party applications, a flat file or comma-delimited file is needed, which is used as the source for DM. This makes file-based integrations a manual process: the source system has to extract and provide data in a file, and this file is then imported into DM. This manual step can, however, be eliminated by using some scripting techniques and APIs. You can automate any data load process as long as you are able to access and extract data from the source system using a scripting or programming language.

Below is an example of one such automation where the source system is third-party on-premise software with an Oracle database as the backend, and the target application is FCCS. The solution is implemented in three basic steps:

1. Call a Jython script to extract data from the third-party source system into a file.

2. Upload the file onto the DM inbox folder.

3. Execute the data load rule to load the data to FCCS.

All this is bundled into a batch script which is executed from the command prompt or can be scheduled to execute on the desired date and time.

1. Jython script to extract data from the source system:

I have implemented this piece of code using Jython; it can be implemented using any other scripting or programming language as well.

This is pseudo code and is to be used as a guideline only.

import java.sql as sql

import sys

# Period and Year passed as arguments to the script

VARPERIOD = sys.argv[1]

VARYEAR = sys.argv[2]

# Source query - placeholder; replace with the actual query against your source tables

selectStmt = "SELECT ACCOUNT, ENTITY, ICP, DEPT, AMOUNT FROM SRC_BALANCES WHERE PERIOD = '" + VARPERIOD + "' AND YEAR = '" + VARYEAR + "'"

# Connect to source system database

sourceConn = sql.DriverManager.getConnection("<<jdbc url, username, password>>")

stmt = sourceConn.createStatement()

stmtRS = stmt.executeQuery(selectStmt)

myCount = 0

outfilename = "OUTPUTFILE.TXT"

outfile = open(outfilename, "w")

outfile.write("ACCOUNT;ENTITY;ICP;DEPT;AMOUNT" + chr(10))

# Loop through the result set and write one delimited record per row

while stmtRS.next():

  myCount = myCount + 1

  ACCOUNT = stmtRS.getString("ACCOUNT")

  ENTITY = stmtRS.getString("ENTITY")

  ICP = stmtRS.getString("ICP")

  DEPT = stmtRS.getString("DEPT")

  AMOUNT = stmtRS.getBigDecimal("AMOUNT")

  mystr = str(ACCOUNT) + ";" + str(ENTITY) + ";" + str(ICP) + ";" + str(DEPT) + ";" + str(AMOUNT) + chr(10)

  outfile.write(mystr)

# Close the file and database resources

outfile.close()

stmtRS.close()

stmt.close()

sourceConn.close()

2. Upload the file OUTPUTFILE.TXT onto the DM inbox folder and execute the data load rule:

These steps are achieved by an EPM Automate batch script:

@echo off

SET /p period=Enter Period in MMM format:

If "%period%" == "" goto :ERROR_P

SET /p year=Enter Year in YYYY format:

If "%year%" == "" goto :ERROR_P

echo Executing script to extract data from source system

cd /d c:\jythonscripts

REM extract_data.py is a placeholder name for the Jython script from step 1

jython -J-cp ojdbc6.jar extract_data.py %period% %year%

if %ERRORLEVEL% NEQ 0 goto :ERROR

echo Source extract complete

echo Logon to EPM Cloud

call epmautomate login <<username>> <<password>> <<url>> <<domain>>

if %ERRORLEVEL% NEQ 0 goto :ERROR

echo Upload file to inbox folder

call epmautomate uploadfile c:\jythonscripts\OUTPUTFILE.TXT inbox/Source_Files

if %ERRORLEVEL% NEQ 0 goto :ERROR

echo Executing load rule

call epmautomate rundatarule DLR_LOAD_FCCS %period%-%year% %period%-%year% REPLACE MERGE Source_Files/OUTPUTFILE.TXT

if %ERRORLEVEL% NEQ 0 goto :ERROR

echo Scheduled Task Completed successfully

call epmautomate logout

exit /b 0

:ERROR

echo Failed with error %ERRORLEVEL%

call epmautomate logout

exit /b 1

:ERROR_P

echo Period and/or Year cannot be blank!

exit /b 1


Points to note:

1. This automation process is for EPM cloud applications where on-premise FDMEE is not available and Data Management has to be used.

2. This is not the only way to achieve automation; it is the way I implemented it according to my project requirements. There are other ways of automating as well, using a combination of the source system's APIs (if available), the EPM REST APIs, and batch scripts.

3. This approach requires EPM Automate, Jython, and the batch files to be hosted on a local workstation or server, which will be on-premise even though the target Hyperion applications are on the cloud.

4. Further to this, email notifications can also be configured to send out emails to stakeholders after a particular data load completes successfully or errors out. This can be achieved using email programs like BLAT, PowerShell, MAPISEND, MAILSEND, etc. These programs are available for free download and can be configured in a Windows batch file called after the data load step is complete (a minimal notification sketch follows these points).
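For completeness, here is a minimal notification sketch in Python using the standard smtplib module (an assumption on my part; it is not one of the tools named above). The SMTP host, addresses, and subject are placeholders.

import smtplib
from email.mime.text import MIMEText

def send_status_mail(status):
    # Placeholder addresses and SMTP host - replace with your own
    msg = MIMEText("FCCS data load finished with status: " + status)
    msg["Subject"] = "FCCS data load - " + status
    msg["From"] = "epm.automation@example.com"
    msg["To"] = "stakeholders@example.com"

    smtp = smtplib.SMTP("smtp.example.com", 25)
    smtp.sendmail(msg["From"], [msg["To"]], msg.as_string())
    smtp.quit()

send_status_mail("SUCCESS")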

Write-Back of Budget Data from PBCS to Fusion-GL through FDMEE


PBCS is the latest offering from Oracle and is a user-friendly tool that requires less effort from a developer's point of view. It has been made so developer-friendly that no installation or maintenance needs to be performed; only the application and the artifacts required from a functional perspective need to be developed.

For PBCS, there are a few ways to load data into the application:

1) Loading directly through the Planning data load utility, by creating a load file in the Planning format and loading it into the application.

2) Through the EPM Automate utility: the load file can be uploaded to the Planning inbox and, from there, the load into the Planning application can be automated.

3) Through Data Management (FDMEE): we can integrate the source application with the target application by creating mappings between the two systems.

After loading source data into the PBCS application and performing budgeting in the Hyperion Planning application, we can write the budget data back to the source system through Data Management (FDMEE).


Below is a high-level view of writing budget data back from PBCS to Fusion GL.

Steps to set up the write-back of budget data from PBCS to Fusion GL:

To start, extract all the information required from the Fusion environment to map with PBCS. We need the ledger IDs, accounting calendar, ledger currency, and chart of accounts.

Log in to the Fusion Cloud system, navigate to Navigator -> Tools -> Setup and Maintenance, and search for the task 'Manage Primary Ledgers'. Note down the ledger IDs created in Fusion GL, which are required in FDMEE to map to the target Hyperion Planning application.

Once all the information has been gathered, log in to the PBCS workspace.

Log in to PBCS and click on Navigate -> Administer -> Data Management as per below.

PBCS Cloud Workspace.png

From Data Management, setup the Source system as General-ledger.

Note: Both the Source and Target system naming convention should be the same.

Setup the Target Application as below General-Ledger, the application type while creating should be Custom Application, and map all the dimensions.

Note: Custom Application is the one which extracts the data from PBCS.

Map the Account dimension to Budget Name and Ledger as Entity, which are required in future while performing data load mappings.


Set up the Import Format.

Create the location for the respective import format by selecting General-Ledger as the target application from the list of applications.

Set up the period mappings for the General-Ledger application under Application Mappings, selecting General-Ledger as the target application from the list of applications.

Set up the category mapping to Budget.

Note: The category name should be the same as the Fusion GL category name.

Create a Data load rule as per below.

Data Load Rule.png

After creating the data load rule, create the data load mappings.

Note: Budget Name should be mapped to Accounts, and Ledgers (the ledgers are the entities in the Fusion GL application, which define the business areas) should be mapped to Entities in PBCS as per below.

Map all the remaining dimensions to the Fusion segments.


After all the setups are done, below is the process of writing back the budget data from PBCS to Fusion GL.


Log in to PBCS and click on Navigate -> Administer -> Data Management as per below.

From Data Management, go to the Workflow tab and click on Data Load Rule.

Select the required location, period, and category.

Click on Location to select it, select the category as Budget, and for the period select the period for which the budget data needs to be written back to GL.

Data can also be filtered by applying filter options to each dimension's members, so that data is written back only for particular combinations.

Click on the Execute option to load the data.

Another window appears as per the below screenshot. Select the required options: 'Import from Source' (which imports the data file from PBCS) and 'Export to Target' (which exports the data to the target system, Fusion GL). Select the period and import mode as required.


Click on Run; after it runs, FDMEE provides a process ID which can be viewed from the Process Details tab as per below.

Process Details.png

Click on the Refresh option to refresh the data load process.

The Data Load Workbench is shown below.

Data Load Workbench.png

In the Data Load Workbench, all the data load steps are in orange, which indicates that the data has been loaded successfully; we can also reconcile the data loaded in export mode as per the below screenshot.

Data Load Workbench1.png

Metadata members and data values are divided into separate columns.

Data Load Workbench2.png

Now, to view the same data in Fusion GL, connect to the Fusion GL workspace URL.

After connecting, navigate to Navigator -> Tools -> Scheduled Processes.

Under Scheduled Processes, check whether the 'Validate' and 'Upload Budgets' schedules have run and succeeded. These are the schedules which run after the data has been written back from FDMEE to Fusion GL.

Once these schedules have succeeded:

Go to Navigator -> General Accounting -> Financial Reporting Center.

From the Financial Reporting Center, go to Tasks -> Balance Inquiry -> Inquire on Detail Balances.

From the detail balances screen, first select the data access set for which the data needs to be viewed, and select the combinations of members from each segment.

For the scenario member, select the Budget member, as data written back from PBCS is stored in the Budget member of Fusion GL.

Once all the segment members have been selected, the data can be viewed.

The data can also be compared between the two systems, PBCS and Fusion GL, for reconciliation.

Note: The process of writing budget data back from PBCS to the Fusion application helps in maintaining budgetary control. Planning applications are mainly for budgeting the data, and after budgeting, that data can be written back to Fusion, as in the example below.

Example: If an end user raises a request to buy a computer and enters an amount of 40K, while the budgeting authority has fixed the computer budget at 35K (the figure written back from PBCS), the user gets an alert that the computer price exceeds the limit and can send the request onward for an exception approval process. In this way, write-back helps in configuring budgetary control.


Reference: Oracle Hyperion Planning and Budgeting Cloud Service admin guide.

January 14, 2018

Telecommunications as a Service

New Zealand is one of the most scenic countries in the world. It also has a government that is focused on improving the delivery of citizen centric services.

Under the Information and Communications Technology (ICT) act, the government has launched the Telecommunications as a Service (TaaS) initiative. Key features of TaaS are as follows:

• Provides a range of cross-government telecommunications and managed security services that allow various government departments and agencies to easily connect with each other and with their customers.
• Services are provided "as a service" and priced using flexible methods such as per user per month. Agencies can easily scale the services up and down based on their requirements.
• These services do not require significant upfront capital expenditure, so agencies do not need to go through long and expensive procurement processes.
• Ongoing maintenance of the infrastructure and solutions will be done by the Telcos, thereby making the latest technology available to the agencies.
• Agencies can select individual services from a variety of providers. The services are expected to work together seamlessly across providers and agencies, creating a sort of internet for the government.
• Agencies can leverage data and information collected by their peer agencies, as appropriate security policies will be put in place by the Telco providers.

The whole approach is very different from some of the other government programs that we have executed in other countries where the particular agency/department goes through a lengthy procurement cycle for infrastructure, hardware, software and professional services. Not to mention the significant cost and effort expended to maintain these solutions on an ongoing basis.

What does this mean for the Telcos?
The Telcos in New Zealand need to have their systems and processes in place in order to deliver to the requirements of TaaS. Conventional IT and network solutions will not be sufficient. The BSS/OSS systems of the Telcos would need to support the digital oriented experience that the government agencies expect.

Let me illustrate this with an example
Assume a scenario where a government agency already has a connectivity service and wishes to add other services to their existing sites.

Capability needed: The user from the agency should be able to view their current connectivity details and request the new services easily.
Solution component: Digital channels

Capability needed: The system should automatically do a pre-qualification (availability and serviceability checks) to provide information on whether the requested service can be provided at the given site. The system can possibly make recommendations to the user for the optimal configuration that he/she can choose.
Solution component: Digital channels with inputs from BSS/OSS systems

Capability needed: The user should be able to provide the necessary configuration inputs and place an order for the new connectivity.
Solution component: Digital channels integrated with downstream BSS/OSS systems

Capability needed: Once the order is received, it can go through two scenarios:
  • The service can be automatically fulfilled
  • Manual intervention is needed, e.g. a site visit to confirm the design or install equipment. The manual tasks should get scheduled with customer confirmation and the process should follow due course.
Solution component: BSS/OSS systems

Capability needed: Post fulfillment, the customer should be notified and should be able to view their new services. The customer billing should be updated based on their contract with the Telco.
Solution component: Digital channels integrated with downstream BSS/OSS systems

While the above seems logical, the existing landscape of the Telco can pose a number of challenges:
• Siloed IT solutions - Automated fulfillment of services may be a challenge in an environment where there are different IT stacks to fulfill different kinds of services.
• Manual processes - Manual hand-offs lead to delays, data integrity issues, and non-standardized fulfillment patterns.
• Outdated network and IT solutions - Unless the Telco has kept its own networks and infrastructure current, it will be difficult to deliver the TaaS capabilities to its government clients.

Compliance with TaaS requires the Telcos to ensure that their own systems and processes are enabled to deliver the flexibility and seamless experience that the government expects. This is driving solution upgrades, consolidation, and innovation within the Telcos' own landscape. The net result will be a better quality of services to citizens and possibly to the existing customers of the Telcos.

January 12, 2018

Configuring DAC

This blog covers how to configure DAC 10g/11g: setting up the server and client, importing the repository into DAC, configuring the Informatica service and database, scheduling a load, and running a sample load.

After going through this blog, the user will be able to configure DAC and run/schedule loads.

Setting up the server -

To set up the DAC server, please perform the below steps:

1. Launch the server setup by running 'standaloneServerSetupPrompt.bat' under <DACInstallHome>/Oracle Business Intelligence Data Warehouse Administration Console 11g\dac

2. Select 'Enter repository connection information' by entering 1

3. Choose your connection type - 'Oracle (Thin)' in this case

4. Enter the service name -

5. Give the Database Host details -

6. Give the port name -

7. You can give the DB Driver and DB URL if needed

8. You can change the Table Owner name details -

9. You can change the Key -

10. Save the changes -

11. Now test the repository connection

12. Connection was successfully established

13. Exit

14. Run 'startServer.bat' under <DACInstallHome>/Oracle Business Intelligence Data Warehouse Administration Console 11g\dac to start the server