Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.


March 31, 2018

Blockchain & Finance - An Introduction for the CFO

Have you heard of blockchain? Even if you have not, you have surely heard of Bitcoin. Bitcoin is not blockchain, but Bitcoin is built on blockchain technology.

Why should a CFO care about blockchain technology?

Blockchain technology is a big game changer. It can be used to solve many business problems. Some industries are hugely impacted, while others see only a minor impact. Because the technology is still evolving and maturing, new impacts are being discovered every day. Ignoring the technology could mean loss of competitive advantage and inefficient processes that erode shareholder value. As the guardian of shareholder value, the CFO must understand the technology in general, and its impact on the finance function in particular.

Before we discuss how blockchain impacts the finance function, let us understand what blockchain is, what its unique features are, and what its benefits are.

What's the name?

Blockchain technology is also sometimes referred to as DLT, i.e. Distributed Ledger Technology. While there are minor differences between the two, to keep things simple we can treat them as the same.

What is blockchain / DLT (Distributed Ledger Technology)?

As the name indicates, the technology uses blocks, chains, and ledgers (lists of data), and is distributed (i.e. decentralized). DLT uses blocks to store data, and each block is linked (chained) to the previous one using cryptography. Beyond data storage and linkage, in DLT the complete data set is replicated (distributed) across all participants. Data is added to the blockchain based on a 'consensus' rule, and a blockchain may also host smart contracts, which execute automatically when certain criteria are met.
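The chaining idea above can be sketched in a few lines of Python: each block stores a hash of its own data plus the previous block's hash, so tampering with any earlier block invalidates everything after it. This is a simplified illustration of the concept only, not how any production DLT is implemented:

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Create a block whose identity depends on its data and the previous block's hash."""
    block = {"data": data, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True).encode()
    ).hexdigest()
    return block

def chain_is_valid(chain):
    """Recompute each hash; any tampering breaks the chain."""
    for i, block in enumerate(chain):
        expected = hashlib.sha256(
            json.dumps({"data": block["data"], "prev_hash": block["prev_hash"]},
                       sort_keys=True).encode()
        ).hexdigest()
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block({"event": "genesis"}, prev_hash="0" * 64)
b1 = make_block({"event": "invoice received", "amount": 1000}, genesis["hash"])
chain = [genesis, b1]
```

Changing the amount in `b1` after the fact makes `chain_is_valid` return False, which is the essence of immutability in a distributed ledger.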

Blockchain / DLT (Distributed Ledger Technology) - How does it help?

Because of the above characteristics, a blockchain can help businesses

  • Speed up business processes - transactions taking days can be done in seconds.

  • Reduce costs - as it will enable direct peer-to-peer interaction without the need for intermediaries.

  • Reduce risks - transactions are immutable and cannot be changed once created.

  • Enforce and build trust - all data is transparent and additions happen through a consensus mechanism.

The above discussion may seem very technical, so let me describe a finance use case to better illustrate the technology and its benefits.

Trade Finance - Use Case - Using Oracle Cloud, Oracle Blockchain Cloud Service

Trade finance is one of the areas where blockchain technology is already in use. Let us imagine a typical bill discounting scenario. The scenario has the following participants: the buyer (say 'ABC Electronics'), the seller (say 'LG Electronics'), and the financing bank (say HSBC). Assume we are the buyer, using Oracle Cloud applications.

ABC Electronics buys goods from LG. On receipt of the goods and the invoice from LG, the details (physical copies of the invoice) are sent to HSBC. HSBC verifies the documents and then releases funds to LG based on the due date.

Note that the above process:

  • Might take 3-5 days, probably more

  • The participants in the process do not have visibility of the status: Have the goods been received by ABC Electronics? Has the invoice been received by ABC Electronics? Has HSBC got the documents? Has HSBC verified them?

  • The invoices might get damaged, lost, or tampered with as they move between the different parties.

How can Oracle Blockchain Cloud Service help here?

With blockchain, we can now build a solution whereby:

  • The business process steps - sending goods, receiving goods, sending and receiving invoices, verification of receipts and invoices by the buyer, sending the invoice to the bank - can be captured and shared on the blockchain.

  • Transactions are added to the blockchain on consensus and cannot be tampered with (immutable).

  • Additions to the blockchain can be made by automatic or manual processes. Oracle Blockchain Cloud Service offers REST APIs to integrate Oracle Cloud applications with it automatically.

  • New data can be added based on an agreed consensus mechanism, which can be built using Oracle Blockchain Cloud Service.

  • Oracle Blockchain Cloud Service also offers a front-end application, which helps the participants view the status of transactions (data transparency).

  • The physical invoices need not be sent to the bank; the bank can directly connect via the REST APIs offered by Oracle Blockchain Cloud Service to verify the invoices captured by the buyer (this eases and speeds up the process).

  • With Oracle Blockchain Cloud Service, smart contracts can be built to automatically transfer amounts to the seller on due verification of the invoices (process automation).
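As a rough illustration of the integration style, here is a hedged Python sketch that prepares an invoice record for posting to a blockchain REST endpoint. The channel name, chaincode id, method name, and endpoint path are all hypothetical (the real Oracle Blockchain Cloud Service REST API has its own resource paths and payload schema); the fingerprint shows how an invoice can be hashed so all parties can verify it was not altered in transit:

```python
import hashlib
import json

def invoice_payload(invoice_no, buyer, seller, amount, currency):
    """Build a chaincode invocation payload for a hypothetical 'trade-finance' channel."""
    invoice = {
        "invoice_no": invoice_no,
        "buyer": buyer,
        "seller": seller,
        "amount": amount,
        "currency": currency,
    }
    # A content hash lets the bank check that the invoice it sees matches
    # exactly what the buyer's ERP recorded.
    fingerprint = hashlib.sha256(
        json.dumps(invoice, sort_keys=True).encode()
    ).hexdigest()
    return {
        "channel": "trade-finance",   # hypothetical channel name
        "chaincode": "invoice-cc",    # hypothetical chaincode id
        "method": "recordInvoice",    # hypothetical chaincode method
        "args": [json.dumps(invoice), fingerprint],
    }

payload = invoice_payload("INV-1001", "ABC Electronics", "LG Electronics", 25000, "USD")
# In a real integration the ERP would POST this to the blockchain REST proxy, e.g.:
# requests.post("https://<obcs-host>/restproxy/...", json=payload, auth=(user, password))
```

The actual POST is left as a comment because host, path, and authentication depend on the provisioned service instance.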

Below is a pictorial representation of how data (a block) gets added to each node after each business event, based on consensus between all participants, so that the same view is available to everyone.

With the above solution

  • The data is visible to all participants and is consistent across all participants.

  • Physical invoices need not be sent to the bank.

  • The correct invoice details are confirmed by all parties and cannot be tampered with (immutable). This capability is only possible due to the use of blockchain technology.

  • Smart contracts execute automatically to initiate supplier payments.

  • Payment to the seller can be processed in a few minutes instead of days.

Are there other Use cases - Impacts on finance function?

While there is a big impact on the financial services industry and on crypto-currencies, the focus of this note is the impact on the finance function, at a more micro level.

There are many other use cases. As the technology matures, the way it is implemented is also evolving and new use cases are getting discovered.

Oracle, while releasing the Blockchain Cloud Service solution at Oracle OpenWorld 2017, listed a good set of questions to help determine possible use cases for blockchain. Businesses should check the points below to discover potential use cases:

  • Is my business process predominantly cross-departmental / cross-organizational? (think of intercompany and interparty reconciliations)

  • Is there a trust issue among transacting parties? ( think of trade finance scenarios)

  • Does it involve intermediaries, possibly corruptible?

  • Does it require periodic reconciliations? (think of intercompany and interparty reconciliations)

  • Is there a need to improve traceability or audit trails? (think of bank confirmation letters, third party balance confirmation letters needed by auditors)

  • Do we need real time visibility of the current state of transactions? (think of publishing reports to various stakeholders)

  • Can I improve the business process by automating certain steps in it? (think of automatic payment, based on inspections by a third party).

From the above, we can see numerous opportunities for improving the finance function. Let me list possible use cases by critical function of finance.

Financial Management
  • Strategic Planning
  • Annual Planning
  • Rolling Forecasting (Quarterly / Monthly)
  • Working Capital Management
  • Forex Management

Possible impacts:
  • An internal, permissioned blockchain can be built to get consensus on the plan, which is transparent to all participants and immutable.
  • A permissioned blockchain can be set up to speed up the funds disbursement process for trade finance.

Financial Reporting and Analysis
  • Statutory and External Reporting (GAAP / IFRS / VAT etc.)
  • Management Reporting (Scorecard, Dashboard)
  • Strategic Finance (Scenario Planning, M&A)
  • Customer and Product Profitability Analysis
  • Balance Sheet, P&L, Cash Flows

Possible impacts:
  • A permissioned blockchain can be set up for communication of reports which is secure, tamperproof, and quick to publish.

Governance, Risk and Compliance
  • Financial Policies & Procedures (Business Rules Management)
  • Tax Strategies and Compliance
  • Tax Accounting
  • Audit, Controls and SOX Compliance
  • Enterprise and Operational Risk Management
  • System Security and Controls

Possible impacts:
  • Secure communication of reports to government authorities.
  • A permissioned blockchain can be built to get consensus on account balances for audit purposes.

Finance Transactions and Operations
  • General Accounting
  • Managerial Accounting
  • Accounts Payable
  • Credit and Collections

Possible impacts:
  • A permissioned blockchain can be built which is transparent, immutable, and consensus-based to capture customer promises for cash collections.

Financial Consolidation
  • Period-end Book Closure (monthly, quarterly, yearly)
  • Currency Translation and Trial Balances
  • Intra- and Inter-company Transaction Accounting
  • System of Records Close (COA, GL, Sub-ledgers)

Possible impacts:
  • A permissioned blockchain can be built to share and agree on intercompany balances.

Any pitfalls? What should you check?

There are many potential uses of this technology. As the technology matures and more proof-of-concept projects get executed, new use cases are being discovered and old ones are being dropped. As per the Gartner Hype Cycle, blockchain technology has passed the 'Peak of Inflated Expectations' phase and is likely to enter the 'Trough of Disillusionment' phase as POCs start failing, before reaching the 'Slope of Enlightenment' phase.

Considering the hype, there is a risk of trying to force-fit blockchain into scenarios where simpler, cheaper, faster options might work better. While blockchains are immutable and highly secure, there are a few exceptions, and special attention is needed to ensure those exceptions are understood and managed. Government regulation governing blockchain contracts also needs to evolve. There are also concerns with data transparency, which might not always be a good thing.


Blockchain is a big game changer. Its impact on the finance function is inevitable. As the technology matures, it will help the CFO automate and speed up processes and build internal controls, even with third parties outside the organization. The CFO organization should start discussions on discovering use cases. It is likely that new ways of running processes will be developed, in ways never imagined before.

The intention of this article is to give an introduction to blockchain, its impact on the finance function, and how Oracle Blockchain Cloud Service can help build a blockchain quickly.

March 28, 2018

Resource demand analysis with High Level Resource Planning

The resource feature in Primavera is typically limited to projects that are in hand or currently running, with resource assignments done for those projects.

But EPC organizations with turnover of $500M-$1B or more always need to know, at a high level, the resource shortage or excess they will have on board for the next two years, or at least for the next 6 months to a year.

Organizations use different methods, varying from company to company, to build this high-level resource plan - mainly to know the manual worker count required (for example, welders, fitters, riggers, etc.) in the coming year, since requirements and usage peak every year and getting such resources on board at the right time is difficult.

This can be addressed using the 'High Level Resource Planning' feature in the Primavera Web module itself. Most EPC organizations use Primavera only for a narrow set of capabilities - project planning, scheduling, and resource assignments - but the potential to plan organization-level project staffing requirements is already inherent in the product, and EPC organizations in particular have not yet dived into that functionality.

At any given point in time, Primavera already provides a resource analysis graph for the cumulative assignments of currently running projects. With high-level resource planning added, we get a picture that depicts manpower planning for the future.

Resource planning flow of Organizations when High Level Resource Planning is in use:



March 26, 2018

Driving Business Intelligence with Automation

Business Intelligence applications have been around for a very long time and have been widely used to represent facts using reports, graphs, and visualizations. They have changed the way many companies use tools, in a broader spectrum and in a more collaborative way. However, users these days expect these tools to operate autonomously yet stay connected to a vast network that can be accessed anywhere. As BI applications analyze huge sets of data in raw format, it can be difficult to draw intuitions and insights from them. Automation can help extend Business Intelligence beyond its current capabilities, keeping a few points in mind.

Intuition Detection
Automate the intuition.

Business Intelligence should be able to pose the right questions in a split second and provide insights without any manual interference. Artificial Intelligence can do this better by using the potential of supercomputers to help Business Intelligence take a deep dive into the data, trying all available possibilities to find patterns - leading to the discovery of business trends that serve the needs of the user.

Intuition Blending
Automate what intuitions are prioritized.

Insight detection in Business Intelligence should be automated in a way that prioritizes the intuitions and ranks the worthy ones higher based on user needs. Artificial Intelligence can then compare all the possible insights and establish relationships between them, helping users work on multiple insights at once.

Common Dialect
Automate how intuitions are defined.

Business Intelligence tools existing so far have done a boundless job of analyzing huge amounts of data through reports, graphs, and other graphical representations. But users still have to figure out the best insights from the analytics themselves. This human factor leaves room for misconception, delusion, and misinterpretation. This is where Artificial Intelligence takes Business Intelligence to the next level and provides insights in an easily understandable language such as English. Graphs and reports can still be used to represent these accurate, widely comprehensible conclusions.

Common Accessibility
Create a just-one-button-away experience.

Finding an insight should be as simple as a click. This is possible with Artificial Intelligence, where BI is automated to let users get professional insights instantly by clicking a button. Users can easily access the data without prior knowledge of the tool or of data science. Tools like Einstein Analytics from Salesforce have already implemented this, attracting a huge set of users across the globe.

Cut-Off Sway
Trust the data to reduce human fault and anticipation.

Artificial Intelligence generally reduces manual intervention, thus avoiding human errors. Bias can arise from individual views, misinterpretation, influenced conclusions, deep-rooted principles, or faulty conventions. Automation by means of Artificial Intelligence removes these factors and reduces the risk of being misled by flawed information. It purely trusts the data.

Hoard Intuitions
Increase efficiency by integrating insights.

Integrating the insights into the application alongside the factual data using Artificial Intelligence may let users stop using the BI tool directly. This ultimately makes the application an engine for other software, increasing the tool's effectiveness. Users can then spend more time making wise, profitable choices rather than putting unwanted effort into operating the tool. This can change the minds of many business users who presently do not use any sort of Business Intelligence tool.

March 20, 2018

HR Leaders at 'cross roads' again, but third time's the charm!

Most HR leaders and practitioners have witnessed the proverbial 'change is the only constant' phenomenon in the HCM space over the last 50-60 years. These changes span functional areas like HR Administration, HR Service Delivery, Talent Management, and Workforce Management. While there has been a sea change in employee-centric operational and strategic HR processes on the business side over these years, technology has aided or enabled those changes.

A quick analysis of these changes will unveil interesting factors.

  1. The pace of such changes in the business and technology layers has been steady and controlled.

  2. Most of the time, they were decided based on the organization's operational or strategic requirements.

  3. Change initiatives are driven and executed by the Organization's own team or partners / vendors.

  4. A closer look reveals that most of these factors are 'internal', not guided by 'external' factors, barring statutory and regulatory compliance.

Today, we see that 'external' factors are more forceful than internal ones in guiding change decisions in an organization's HR practices. To understand the impact, it is important to note the chronological pattern of these change initiatives. Let's go through the 'decision points' that forced HR leaders to the 'cross roads' of critical decision making about their HR practices.

Manual Vs. Automation

In the period when mainframe applications were being built, and through the personal desktop revolution post-1985, HR practices in most organizations were evaluated on a single question - 'manual vs. automation' - to decide the scope of HR IT work and what fell under HR Operations.

Mundane, high-volume transactions like Payroll, Employee Administration, or Time & Attendance, centered on record keeping and rule-driven, workflow-driven activities, were the initial targets for deciding what got automated and what remained a manual process.

Most legacy HR applications were designed and built on this paradigm and still function the same way.

User driven Vs. Self-Service driven

As we progressed, all these HR applications - both in-house and commercially available COTS solutions - had to be redesigned around another key question: which processes are driven or enabled by the HR user, and which have to be enabled for employee or manager self-service. HR leaders were left at another crossroads, as control of the HR process moved from users to employees in certain areas.

HR leaders and practitioners had to redesign current HR processes to determine the operational processes retained by HR users and the processes delivered to employees or managers on a self-service basis. Most in-house HR applications and COTS applications have been redesigned to respond to this trend of 'empowering the employee'.

These HR processes across business functions centered on approvals (like leave or timesheet approvals), workflow mechanisms, and simple data-driven queries (like payslips or leave balance checks).

Human Interface Vs. Digital AND Standardized Vs. Personalized

While it is tedious to be at a 'cross roads' again, HR leaders are finding themselves there a third time - and this time is definitely looking very exciting compared to the last two tides of change. Aided by technology advancements that are changing the face of the world, and by the imperative to cater to diversified employee needs, the upcoming change in HR business processes and associated HR IT applications is phenomenal. Digital and Employee Experience are going to drive these changes - an exciting opportunity for all HR leaders.

This time the questions are juxtaposed asking for multiple decisions:

Decision #1: Which HR processes should be retained in the 'human / human interface' layer, and which should form the 'digital experience' layer?

Decision #2: Which HR processes should be delivered as industry 'standardized' processes, and which need to be 'personalized' to improve the employee experience?

Moreover, these decisions impact all HR functional areas, not only those with data-entry, operational, or approval needs.

So, the task for HR leaders now is to determine how to break up, or slice and dice, the HR processes and map them to the right areas in this two-dimensional model, along with the benefits or value-adds depicted above.

As part of our HCM Advisory / Strategy offerings, the 'HCM process maturity model' assessment framework has been re-engineered to suit the upcoming paradigm shifts in the HCM space, starting with assessment and evaluation of each L0-L3 process to be mapped along these two dimensions.

Stay tuned to our 'HCM World' showcase and sessions, to get updates about this renewed 'process maturity model' framework.


March 18, 2018

ODI 12c Miscellaneous Concepts

In this blog, I want to share concepts related to the topics below.


Export and import in ODI:

Export and import in ODI normally exports only a specific component and its child components - components under the selected mapping, like temporary interfaces, reusable mappings, scenarios, etc.

The exported components are saved in XML file format. This process is usually preferred when there is a very minor change in a particular object and all environments are in sync.


Smart export/import:

The usual best practice for export and import is the smart import/export process. Just drag the components to be exported into the wizard; it will automatically pick up all the designer, topology, and dependency information related to the corresponding component. Import works similarly, in reverse. A response file tracks the log of the import process.



Solutions in ODI:

Solutions are used to group the scenarios created for mappings, packages, procedures, etc., so that changes can be synchronized to the particular group and easily exported and imported to the respective environments. This is helpful when a project contains a large number of scenarios and it is difficult to find the changes and do the migration manually. Solutions are therefore useful for grouping the required objects in the migration process; they are mainly used in production support.



Deployment in ODI:

Under every mapping, in the Physical tab, we can create a new deployment specification for the same mapping - for example, changing options in the knowledge modules or changing the type of KMs. This is mostly useful for running a full load or an incremental load by creating a new deployment specification and choosing whichever is needed. Deployment specifications were newly introduced in 12c and were not present in 11g.


Commit and Isolation levels in Procedures:

Commit levels: A procedure is made up of a group of tasks which are executed step by step. Each task is an SQL statement which needs to be committed. To manage commit levels we have options like Autocommit and Transaction 0 to Transaction 9. Autocommit simply commits the particular task in the procedure. Transactions 0 to 9 are used to group tasks that need to be committed together, sequentially. This way we can manage commit levels in ODI.
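The transaction-channel idea can be sketched with plain Python and sqlite3 (an illustration of the grouping concept only, not of ODI internals; the task names are invented): tasks tagged with the same transaction number commit or roll back together, while autocommit tasks become durable immediately:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE audit (task TEXT, txn TEXT)")
conn.commit()

def run_tasks(tasks):
    """Run (name, txn_channel, fails) tuples. Autocommit tasks commit alone;
    tasks grouped under a Transaction channel commit together at the end,
    or roll back together if any of them fails."""
    try:
        for name, txn, fails in tasks:
            conn.execute("INSERT INTO audit VALUES (?, ?)", (name, txn))
            if fails:
                raise RuntimeError(f"task {name} failed")
            if txn == "Autocommit":
                conn.commit()          # this task is durable immediately
        conn.commit()                  # grouped tasks become durable here
    except RuntimeError:
        conn.rollback()                # undo grouped, uncommitted tasks

run_tasks([
    ("log start",      "Autocommit",    False),
    ("insert staging", "Transaction 0", False),
    ("update staging", "Transaction 0", True),   # fails -> both grouped tasks roll back
])

remaining = [r[0] for r in conn.execute("SELECT task FROM audit")]
```

After the failure, only the autocommitted "log start" survives; both Transaction 0 tasks are rolled back together, which is exactly the behavior the transaction grouping buys you.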

Isolation levels: These are classified into four: Read Committed, Read Uncommitted (Dirty), Repeatable Read, and Serializable.

Read Committed: tasks in the procedure read only committed data from the source.

Read Uncommitted (Dirty): tasks in the procedure can read data not yet committed at the source (Oracle does not support this level).

Repeatable Read: while reading a large amount of data, tasks in the procedure lock the rows being read so they cannot be updated in the meantime, avoiding inconsistency.

Serializable: while reading a large amount of data, tasks in the procedure lock the whole table so it cannot be updated in the meantime, avoiding inconsistency.


March 15, 2018

Job Scheduler

A job scheduler is a system that enables the execution of jobs in the background. Job scheduling is also referred to as batch scheduling or workload automation. Job schedulers offer a GUI-based platform for defining and monitoring background processes running on distributed sets of servers, applications, operating systems, and networks. The scheduler works on simple logic: it picks up the jobs that are to be run on a given day at a particular time and processes them subject to various constraints. It also offers a dependency model, so that jobs can follow each other. Thus, the day-to-day activities of Hyperion, SAP, PeopleSoft, or mainframe admins can easily be automated based on the underlying business needs and timelines.
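The dependency model described above boils down to running jobs in a topological order. A minimal sketch (the job names and the no-op run action are made up for illustration; real schedulers like Maestro layer calendars, priorities, and recovery on top of this core idea):

```python
from graphlib import TopologicalSorter

def run_schedule(dependencies, run):
    """dependencies maps each job to the set of jobs it must wait for."""
    order = []
    ts = TopologicalSorter(dependencies)
    for job in ts.static_order():   # yields each job only after its predecessors
        run(job)
        order.append(job)
    return order

# Hypothetical nightly batch: extract feeds load; load feeds report and reconcile.
deps = {
    "load":      {"extract"},
    "report":    {"load"},
    "reconcile": {"load"},
}
executed = run_schedule(deps, run=lambda job: None)
```

Whatever order `static_order` chooses for independent jobs, "extract" always runs before "load", and "load" before "report" and "reconcile" - the same guarantee a scheduler's dependency model provides.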

One of the popular job schedulers is Maestro, which offers all these features through a graphical user interface. A few of Maestro's features are mentioned below:

  • Central control: It offers central control for processes configured across various platforms and applications, saving the hassle of dealing with the different platforms and applications present on different servers. It consolidates job scheduling across systems like Linux, Unix, Windows, mainframes, etc. in one place.

  • Simple Interface: It offers a simple interface that enables users to easily track and manage jobs, schedules, tasks, job status, and job history, helping speedy diagnosis of underlying issues. It offers views showing jobs that are active, running, inactive, held, or requiring assistance.

  • System Recovery: This feature makes sure that when a job fails, it is addressed automatically - by putting dependent jobs on hold, running a recovery job (if defined), stopping further processing, re-running the original job, continuing to the next job, or alerting the operator via email.

  • Calendars and Schedules: Maestro offers scheduling and calendar features to manage batch job execution. Each schedule has a date - it runs on a particular date, week, or calendar - and contains a list of jobs defined to be executed on that date. Maestro carries forward uncompleted schedules from the previous day. A calendar is an array of dates, and any number of calendars can be created to meet scheduling needs. For example, a calendar named "month-end" can contain the dates of financial close, and "Holiday" can list the financial holidays on which jobs should not execute.

  • Queuing system: Maestro uses a deterministic approach to identify job priority. It considers parameters like elapsed execution time, estimated execution time, job dependencies, the count of simultaneous job runs allowed for a user, file dependencies, occurrence of prescribed events, resource availability, etc.

  • Security: The system performs IP validation to prevent access by external systems.

First Timer for OAC- Essbase on Cloud..Let's get Hand's On -Rocket Speed!


Well, a rocket launch needs to consistently hit 4.9 miles a second - about 20 times the speed of sound. That sustained speed is what defines, and is even the prime reason for, a successful space launch.

This is the start of a blog series from Infosys on the Essbase on Cloud component of the OAC (Oracle Analytics Cloud) service [should I say "oak" or "o.a.c"? I rather like the former, as it gained popularity that way over the last 3 years]. The series will comprise 6 blogs in staircase-step fashion, enabling on-premise Essbase consultants and developers, as well as newbies from the BI world, to gain control over OAC Essbase at rocket speed! We start with the landing page in blog 1 and will end with using Essbase as a sandbox for multiple data sources in blog 6 (coming soon...).

OAC was released in 2014 as the most:

a) comprehensive analytics offering in the cloud,

b) with business intelligence,

c) big data analytics, and

d) embedded SaaS analytics.

Lately, Essbase on Cloud was released in the second half of 2017.

OAC has now become even more packed with the introductions of Synopsis, Mobile HD, and the Day by Day app.


The first step... set up a user in OAC - Essbase to move on.

As a pre-requisite, you need admin credentials to log in and set up access for the other folks on your team!

OAC - Essbase login page: Once you click the link for Essbase Analytics Cloud, you will see the entry gate to enter the admin user name and password. Doing that and clicking the Sign In button will take you to the landing page.


This is the first screen, called the landing page. Since I have not created any application yet, you see an empty list in the left column. On the right is a set of neatly arranged cards providing great ease of access - all that you might need in one view.




To accomplish our task, focus on the security icon, which is obvious on the landing page. For those accustomed to the cards concept in EPM SaaS products, this will not be a surprise - just on a white background.




Once there, you will see three buttons on the right - "Import", "Export", "Create" - very much self-explanatory! :) By now you know to click the "Create" button to create new users:


Provide an ID, name, email, password, and role for the new user:



The users currently available on OAC - Essbase will be listed here...


Creating another user follows the same sequence of steps, or use the Import option for bulk creation of users.

Please be cognizant that the above method is different from the users and roles managed via the IDCS Console. We will drill into the BI Analytics service instance-specific application roles and authorized users in a sequel. The goal of the security section revolves around access and actions. A user can access only the data that is appropriate for him or her; this is achieved by applying access control in the form of catalog permissions on the Catalog and Home pages. Secondly, a user can perform only those actions that are appropriate for him or her; this is achieved by applying user rights in the form of privileges on the Administration page.

Now that my ID and access have been set up, let's look at step 2, application creation, in blog 2 (coming soon... soon as in tomorrow!).

Thank you,

Prithiba Dhinakaran



The core intention of this blog is to describe features of legacy mainframes and the emerging era of Big Data. It also demonstrates the process of offloading data to Hadoop.


Advantages of Mainframe:

Legacy mainframe systems used by many organizations are secure, scalable, and reliable machines capable of tackling huge workloads.

Mainframes handle even mission-critical applications, like processing banking transactions, with minimal resistance - workloads where security and reliability are equally important.


Drawbacks of Mainframe:

They require large hardware and software footprints.

Processing is costly.


Many organizations in the current era take the initiative to migrate a part of their business applications - and then continue across all aspects of the business - to newer, reliable platforms.

This process helps organizations reduce costs and meet the evolving needs of the business.



Advantages of Big Data technology over Mainframe:

Cost Effective.


Fault Tolerant.


The cost of maintaining and processing on the mainframe can be further reduced by assimilating a Hadoop layer, or by completely offloading the batch processing to Hadoop.


Similarities between mainframe and Hadoop:

Distributed frameworks

Handle massive volumes of data

Batch workload processing

Strong sorting capabilities


Business benefits of adopting Big Data technologies alongside the mainframe, or in place of it


Batch processing can easily be done using Hadoop ecosystem tools like Pig, Hive or MapReduce.


Jobs from the mainframe systems can be processed on Hadoop, with the output viewed back at the mainframe end, reducing million instructions per second (MIPS) costs.

(MIPS is a measure of mainframe computing capacity, and mainframe software and processing charges are commonly based on MIPS consumed; reducing MIPS consumption therefore directly reduces cost.)
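As a rough, back-of-the-envelope illustration of that economics, the net saving from offloading batch MIPS can be estimated as below. All rates and figures are hypothetical examples, not actual vendor pricing:

```python
# Rough, illustrative estimate of mainframe savings from offloading
# batch workloads to Hadoop. All rates below are hypothetical.

def annual_mips_savings(total_mips, batch_share, offload_fraction,
                        cost_per_mips_year, hadoop_cost_year):
    """Return the estimated net annual savings from offloading."""
    batch_mips = total_mips * batch_share          # MIPS consumed by batch
    offloaded = batch_mips * offload_fraction      # MIPS moved to Hadoop
    gross_saving = offloaded * cost_per_mips_year  # mainframe charges avoided
    return gross_saving - hadoop_cost_year         # net of Hadoop run cost

# Example: 1,000 MIPS installed, 40% consumed by batch, 70% of batch
# offloaded, $3,000 per MIPS-year, $250,000/year to run the Hadoop cluster.
print(annual_mips_savings(1000, 0.40, 0.70, 3000, 250000))  # 590000.0
```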


Organizations look at return on investment at every corner during an upgrade or migration. Migrating from mainframe to Hadoop meets this condition thanks to minimal infrastructure costs, lower batch processing costs and flexible upgrades of applications.


Process of Offloading Data to Hadoop


The offloading approach is recommended in the following simple steps:

First, create active archives: copies of selected mainframe datasets in the Hadoop Distributed File System.

Second, migrate the larger volumes of data from sources such as semi-structured data sources or relational databases.

Finally, move the expensive mainframe batch workloads to the much more sophisticated Hadoop.
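The first step, landing mainframe datasets in HDFS, usually involves converting fixed-width EBCDIC records into a delimited ASCII form on the way in. A minimal sketch of that conversion, assuming a hypothetical three-field copybook layout (account id, name, balance):

```python
# Minimal sketch of one common offloading step: converting a fixed-width
# EBCDIC mainframe record into a delimited ASCII row before loading it
# into HDFS. The layout below is a hypothetical copybook example.

LAYOUT = [("account_id", 0, 8), ("name", 8, 20), ("balance", 28, 9)]

def ebcdic_record_to_csv(record: bytes) -> str:
    """Decode one fixed-width EBCDIC record (codepage cp037) to a CSV line."""
    text = record.decode("cp037")  # EBCDIC (US/Canada) to Unicode
    fields = [text[start:start + width].strip() for _, start, width in LAYOUT]
    return ",".join(fields)

# Build a sample record by encoding to EBCDIC, then convert it back.
sample = "00001234JOHN SMITH          000012050".encode("cp037")
print(ebcdic_record_to_csv(sample))  # 00001234,JOHN SMITH,000012050
```

In practice the resulting delimited files would then be pushed into HDFS (for example with the `hdfs dfs -put` command) for processing by Pig, Hive or MapReduce.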




March 14, 2018

Hyperion EPMA Way

For developers in the Hyperion community, choosing between the Classic and EPMA approaches has been a vital decision. This blog gives a brief insight into the EPMA approach.

Features -

  1. For HFM applications, EPMA provides a better interface for managing metadata.
  2. Dimensions from the Shared Library can be used in individual applications as Shared and/or Local dimensions; the same Chart of Accounts can be used in both HFM and Hyperion Planning applications.

  3. Make bulk changes using the Grid Editor. Select the members for the rows.

    Select the properties for the columns.

    Make changes to member properties for the selected members; copy-paste also works for applying multiple changes.

  4. Track changes made to Metadata

  5. Dimensions can be maintained in a single repository (the Shared Library); from the Shared Library, dimensions can be shared with applications in the Dimension Library, or local dimensions can be created.
  6. Metadata synchronization from the Dimension Library to the Shared Library, and vice versa, can be done on demand for both Shared and Local dimensions.

    Benefits over Classic applications:

    1. For HFM, metadata can be managed in the Dimension Library, as compared to Classic where metadata has to be exported using the HFM client and then imported again after making changes.

    2. Managing metadata is easier: make changes in the Shared Library and they are reflected in all target applications.

    3. In EPMA the dependency on traditional ETL tools is greatly reduced. It provides a single interface where hierarchies can be built and loaded to various applications through flat files and interface tables.

    4. EPMA provides the facility to automate dimension updates and application maintenance: Import Profiles and Application Deployment can be invoked through the EPMA Batch Client for an automated dimension update process.

    5. EPMA integrates easily with databases and flat files, as well as DRM.

    Author -

    Jagjot Singh Chadha, ORCALL

    Part 3: PCMCS: What should you look at when you decide to leave the ground and fly with the cloud?


      This is the part 3 of 3 blog series explaining basic feature differences & similarities in PCMCS - Profitability and Cost Management Cloud service Application and HPCM - Hyperion Profitability and Cost Management.

      In the last blog I wrote about Data Management / Integrations, license cost calculations, the future roadmap and the available automation tools. The focus was on how PCMCS scores over HPCM in all these areas, whether it is the availability of limited FDMEE for integrations within the same license cost, or the EPM Automate utility for automating most PCMCS application jobs instead of custom script-based automations.

      In this last part of the blog series, we will look at the remaining areas in detail:

      1. Reporting and dashboards
      2. Security and SSO

      3. Maintenance & Backup

      4. Upgrade and Notifications

      Reporting and dashboards

      In both HPCM and PCMCS, Hyperion Financial Reporting web studio can be used for any standard reporting, and the Smart View Excel plugin can be used for ad-hoc reporting. In PCMCS, however, reporting dashboards and out-of-the-box profitability reports are also available apart from Smart View and HFR Studio, and basic financial reports can be run from queries displayed in the Intelligence area. Given below is a detailed analysis of the reporting features:


      On-Premise HPCM vs. PCMCS

      Oracle Hyperion Financial Reporting

      HFR Studio is part of both the HPCM and PCMCS licenses.

      Smart View

      Smart View is an Excel plugin for ad-hoc reporting needs, available with both.

      MS Office Integrations

      MS Office integrated reporting for Word, PowerPoint and Excel is provided as part of the license for both.

      Database

      HPCM: In the standard Profitability module, an Essbase ASO database created along with the calculation database acts as the reporting database. Once a calculation has been run, data needs to be transferred from the calculation database to the reporting database; this ASO database must be updated before any reporting can be done on top of it.

      PCMCS: The backend is an Essbase ASO database from which both calculations and reporting are done. No transfer of data is required, hence no time lag for reporting.

      System Reports

      HPCM:

      • Stage balancing reports

      • Unassigned Drivers, Rules etc.

      • Trace Allocations

      PCMCS:

      • Basic financial reports can be run from queries displayed in the Intelligence area; these reports can be further refined using Oracle Hyperion Financial Reporting.

      • Trace Allocations

      Genealogy Report

      HPCM: Genealogy reports can be built to trace expenses and revenues flowing from source to target.

      PCMCS: No genealogy report needs to be built, as source and target destinations are defined as part of each rule.

      Dashboards

      HPCM: No dashboards are included in the standard HPCM license.

      PCMCS: Embedded reporting and analytics are included in Dashboards. These are sets of small-scale charts that display values and trends:

      • Analysis Views: e.g. most profitable regions/departments

      • Scatter analysis graphs: analyze data and differences at a particular period

      • Profit curves: graphical curves and pie charts showcasing YTD profit, MTD profit, etc.

      • Key Performance Indicators (KPIs)

      Sample Dashboard


      Security and SSO configurations:

      We create, maintain and provision users and groups for HPCM through EPM Workspace. The security roles listed below are defined in the Shared Services Console:

      • Administrator (admin is the default security role when you log on to Shared Services)
      • Power User

      • Interactive User

      • View user

      Basic access to Oracle PCMCS instances is granted by assigning users to predefined functional roles:

      • Identity Domain Administrator  

      • Service Administrator

      • Power User

      • User

      • Viewer

        In PCMCS we also provision data access via Data Grants, by defining data slices that can be assigned to users or groups to enable them to access the data in the defined slice. There are predefined access groups:

      • Users and Viewers

      • Native groups, created by Identity Domain Administrators



        Configuring SSO for the Cloud includes the below steps

        Maintenance and Backup:

        In HPCM, any system maintenance is an admin/IT job that needs to be performed at regular intervals, and the backup process has to be built as a custom procedure; in PCMCS, a standard daily backup is provided as part of the application landscape. To automate the backup process in HPCM, custom batch scripts need to be created. In PCMCS, if an additional backup needs to be placed on a local server, the EPM Automate utility can be used to automate downloading the application snapshot, date-and-time stamping it, and placing it on the designated server for the required number of days (daily backup download > EPM Automate places it in the server location > date and time stamp it > keep the file for the designated days > delete older files).
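The rotation flow described above (download > stamp > retain > delete) can be sketched in a small script. This is a sketch under assumptions: the EPM Automate credentials, URL and paths below are illustrative placeholders to be adapted to the actual instance:

```python
# Sketch of the PCMCS backup rotation: download the daily "Artifact
# Snapshot" via EPM Automate, store it with a date-and-time stamp, and
# delete copies older than the retention window. Credentials, URL and
# paths are hypothetical placeholders.

import datetime
import os
import subprocess

RETENTION_DAYS = 7
BACKUP_DIR = "/backups/pcmcs"  # hypothetical designated server location

def download_snapshot():
    """Log in, download the daily snapshot, and log out via EPM Automate."""
    subprocess.run(["epmautomate", "login", "admin", "<password>", "<url>"], check=True)
    subprocess.run(["epmautomate", "downloadfile", "Artifact Snapshot"], check=True)
    subprocess.run(["epmautomate", "logout"], check=True)

def stamped_name(now=None):
    """Build the date-and-time-stamped file name for a snapshot."""
    now = now or datetime.datetime.now()
    return "Artifact_Snapshot_%s.zip" % now.strftime("%Y%m%d_%H%M")

def prune_old_backups(directory=BACKUP_DIR, retention_days=RETENTION_DAYS):
    """Delete snapshot files older than the retention window."""
    cutoff = datetime.datetime.now() - datetime.timedelta(days=retention_days)
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if datetime.datetime.fromtimestamp(os.path.getmtime(path)) < cutoff:
            os.remove(path)
```

A Windows or Linux task scheduler entry can then call such a script daily, mirroring the flow described above.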

        HPCM migration is via Lifecycle Management and needs to be manually triggered. PCMCS has built-in migration functionality to perform export/import of artifacts manually; this can be automated using EPM Automate scripts.

        There are no scheduling features in HPCM, so all application backups and other job executions are handled by custom batch scripts. In PCMCS, batch execution and scheduling are available in Data Management; jobs can also be scheduled using the Windows or Linux task scheduler, which can call an EPM Automate script in turn.

        Upgrade and Notifications:

        Regular patch updates, and notifications prior to an upgrade, are part of the Oracle support platform. For all cloud EPM modules, update schedules are designed well in advance and notifications are available on the service instance page. Oracle applies the monthly service update to the test instance on the first Friday of every month and to the production instance on the third Friday of every month.

        In the case of on-premise HPCM, however, patch updates are manual and customer-driven. The required patch upgrade information needs to be discussed with Oracle, and an upgrade readiness assessment and plan need to be prepared for each environment. For the HPCM application, a disaster recovery plan needs to be created and backups for disaster recovery need to be maintained as required. For PCMCS, disaster recovery is handled by the Oracle team.

        Please find below the links to Part 1 and Part 2 of this blog series.

      Patching and Upgrade Planning for Oracle Financials Cloud

      Instance Patching and Upgrade is a way of life in Cloud world!

      Customers using Oracle Financials cloud are well accustomed to the frequent patching and instance upgrades.

      While patches and upgrades improve the features, performance and usability of the application with enhanced functionality, they can also create panic among Financials users if not planned, tested and communicated on time.

      Here are some tasks to be performed before and after patching /upgrade for a seamless transition to new release.

      Pre-Activities: Before the patch is applied to the instance or it is upgraded, the following pre-activities should be completed:

      • Review the patch/upgrade Oracle notes and mark the ones impacting your current processes: an awaited bug resolution or new functionality.
      • Connect with Oracle by raising an SR in case further explanation is required for any of the above notes. This will ensure no surprises when the upgraded instance is handed over to users.
      • Compare the new FBDI templates with the previous version and make a note of any changes. These changes need to be incorporated into the FBDI-based inbound interfaces.
      • Prepare and add test cases for the above points.
      • Get the overall test cases reviewed by the functional team.
      • Inform users of the outage timings. The outage should include the time taken to complete the post-activities as well.
      • Stop all the inbound and outbound interfaces to/from Oracle.
      • As a best practice, the patch/upgrade should be applied to the non-production environment first and tested with the pre- and post-activities. Any issue faced during testing of non-production instances should be resolved before the patch is applied to PROD. Some examples of such issues are:

      o    Access issues in case of custom security

      o    Formatting changes (e.g. page break ) to seeded reports

      o    Any other known Oracle bug

      These issues can be resolved with the help of Oracle team by raising Service Requests. The resolutions should be noted to be applied to Production as a post-task.

      • In case of any changes to FBDI templates, the inbound interfaces should be updated and tested in Non-Prod instance.

      Post-Activities: After receiving the patching/upgrade completion notification from Oracle, and before the instance is handed over to the testing team or users, the below listed tasks should be completed by the Oracle support team:

      • Run following processes:

      o    Import User and Role Application Security Data 

      o    LDAP Requests

      o    Update Person Search Keywords

      o    Synchronize Person Records

      o    Refresh Manager Hierarchy

      • Verify that email notification is working (notification on the Bell Icon) by either running a BIP report or changing a test person record. If it is not working, an SR needs to be raised with Oracle to set the Notification Mode to ALL on the SOA server and bounce the server afterwards.
      • Release the inbound and outbound interfaces to:

      1.       Catch up on the data since the interfaces were stopped

      2.       Process as per the regular schedule

      • Publish the active hierarchies to Essbase
      • Recreate the Essbase Cubes for active chart of accounts
      • Verify that Essbase is returning all Rules sets and Allocations in EPM
      • Activate the journal entry rule set assignments
      • Depending on the requirement, update the formatting of seeded report/s. E.g. Default Format, Page Break
      • Check for and fix any subject-area-related errors for OTBI reports.
      • Apply the fix for issues encountered during Non-Prod testing.

      While patches are applied to the instance either monthly or quarterly (as opted by the customer), upgrades are less frequent. The scale and impact of an instance upgrade are higher, since the functionality changes are major. Hence upgrade planning should start well in advance; the impact analysis of an upgrade should be performed meticulously and elaborate test cases should be built.

      The Infosys Oracle Cloud team works with customers as a trusted support partner and helps plan the patching and upgrade transition, along with regular production support activities like Period Close. The team tracks new features, presents them to the business and helps identify the lag between new features and their adoption.

      The objective during patching or upgrade is to minimize the spike of support tickets from Financials users. The above mentioned pre and post activities help ensure the same and maintain a stable Production System. 

      March 12, 2018

      Data Science Fundamentals


      'Data' is the most commonly used word in every field these days. When you hear the term "Data Science" (also called Data-Driven Science) for the first time, you might think it is as complex as rocket science.

      What is data science?

      Data Science is simply the study of information: where it comes from, what it says and how it can be turned into a useful form. Information can come from anywhere, such as Excel, a database, real-time feeds, or any other source. It can be structured or unstructured, but every piece of information arriving at the central location conveys something; that is where one of the crucial parts lies: "what it says". After analyzing the data, we must be able to visualize it using tools like Excel, Power BI or Tableau. Finally, from the visualizations we build, we can extract the essential statistics and anticipate the trends.

      Working with the Data:

      Let's assume a vendor has sales details for tea and coffee at two different public locations. Now, the stall owner wants to extract some essential statistics based on those sales. Below is a sample screenshot of the data.


      Using the above data, we can derive a few more interesting things. Let's clean it up and get a clear picture of what exactly the data is saying. Perform the steps below and then look at the data again.


      1. Add a "Total No of Units" column (=C+D) that sums the coffee and tea units sold on that day.

      2. Then add one more column, "Revenue", which multiplies "Total No of Units" by "Price".

      3. Then add a Day column (=TEXT(WEEKDAY(A),"dddd")) to the right of the Date.
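The same three prep steps can be sketched with pandas instead of Excel formulas; the column names and sample values below are assumptions based on the described sheet:

```python
# The three prep steps above, done with pandas. The column names and
# sample values are assumed, based on the described tea/coffee sheet.

import pandas as pd

sales = pd.DataFrame({
    "Date": pd.to_datetime(["2018-03-05", "2018-03-05", "2018-03-10"]),
    "Coffee Units": [20, 15, 30],
    "Tea Units": [10, 25, 5],
    "Price": [2.0, 2.0, 2.0],
})

# Step 1: total units sold per day (the =C+D column).
sales["Total No of Units"] = sales["Coffee Units"] + sales["Tea Units"]

# Step 2: revenue = total units * price.
sales["Revenue"] = sales["Total No of Units"] * sales["Price"]

# Step 3: day name from the date (the =TEXT(WEEKDAY(A),"dddd") column).
sales["Day"] = sales["Date"].dt.day_name()

print(sales[["Total No of Units", "Revenue", "Day"]])
```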


      After adding these columns, the data will start speaking more. Please find below a screenshot of the same.



      Now, we can draw different types of conclusions from the prepped data.

      Descriptive: e.g., total units sold, total revenue generated, etc.

      Associative: e.g., how units sold relate to temperature; what the unit count is when the temperature is high versus when it is low.

      Comparative: e.g., coffee units versus tea units sold.

      Predictive: e.g., how many units can be sold on a weekday versus a weekend (using the Day column).
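The predictive angle can be sketched by grouping on the Day column; the small data frame below is an assumed sample of the prepped data, not the actual sheet:

```python
# Weekday vs. weekend comparison using the prepped Day column. The
# values below are an assumed sample of the prepped data.

import pandas as pd

prepped = pd.DataFrame({
    "Day": ["Monday", "Tuesday", "Saturday", "Sunday"],
    "Total No of Units": [30, 40, 55, 60],
})

# Label each row as Weekday or Weekend, then average the units sold.
prepped["Day Type"] = prepped["Day"].map(
    lambda d: "Weekend" if d in ("Saturday", "Sunday") else "Weekday")
avg_units = prepped.groupby("Day Type")["Total No of Units"].mean()
print(avg_units)  # Weekday 35.0, Weekend 57.5
```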

      These are a few conclusions, but we can still explore the same data by adding a few more things, like:

      1. Total units sold for all the days

      2. Total revenue generated

      3. Average units of Coffee and Tea sold per day