Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.


July 17, 2017

Empower with Hyperion on Mobile

Mobiles and tablets have transformed today's world, enabling access to information and fast decision making with a mere touch. Oracle Enterprise Performance Management entered the mobile space with a release that provides access to Hyperion applications through handheld devices (mobiles and tablets). Access to Hyperion on mobile devices can be broadly categorized in two ways -

  1. A dedicated app on Android and iOS for on-the-go reviews, approvals, and workflow.
  2. Browser access through a URL for Hyperion Planning artifacts (after configuring tablet access in Workspace) and Hyperion Financial Reporting reports.

Let us first talk about the EPM mobile app -

EPM Mobile App

Installing the app on your mobile or tablet is pretty straightforward: visit the App Store or Play Store and download the app. (Speaking of a fast-changing world, the app is not yet compatible with the latest iOS 10 and Android 7, but that is a story for another day.)

The primary usage of this app is to enable on-the-fly reviews and approvals for -

·   Oracle Hyperion Planning
·   Oracle Planning and Budgeting Cloud Service
·   Close Manager
·   Oracle Data Relationship Governance
·   Oracle Hyperion Financial Management
·   Oracle Hyperion Tax Provision

After installation, open the app; the following options are available -

  • Configure Connection - enter the connection URL and credentials.
  • Product Tour - a demo to familiarize yourself with the interface. Clicking Product Tour shows the available applications for the different Hyperion products - Hyperion Planning, HFM, Tax Provision, Data Relationship Governance, and Financial Close Management.

Hyperion Planning

Select the Scenario and Version for which the workflow has to be reviewed. There is also an option to filter by status (Not Signed Off, Under Review, etc.).


Select the Scenario, Version, and Entity combination to view the history and promotional-path details for a planning unit.


Click the Actions button to take the appropriate action and move the planning unit along the promotional path.


Hyperion Financial Management

Select the HFM application from the list of available applications. Promotion units are displayed for a Scenario, Year, and Period combination and are grouped by Phase and Review Level. Filters are available to select the appropriate promotion units by Scenario, Year, Period, Phase, and Review Level. Pass/Fail status and history can also be viewed.

Financial Close Management

Close Manager displays all the available close tasks. After opening a due task, take action to approve or reject it.

Data Relationship Governance

Data Relationship Governance was not part of the first release of the EPM Mobile application; it was added in a later release. All available requests in the DRG application can be seen; after selecting an item, click the Action button to take a decision on the request.



Browser Access

Before accessing the application on a tablet, tablet access needs to be enabled in Workspace by logging in through a desktop/laptop: go to Administration > Manage > Tablet Access.

Select the forms, task lists, and rules that should be enabled for tablet access. (Keep in mind that the display of forms may vary with the resolution and screen size of the tablet.)

From the tablet's browser, access the Planning URL http(s)://server:19000/HyperionPlanning and enter your credentials. You can then see the forms, task lists, and rules.





Task Lists


Business Rules



When accessing reports from the available icon, only report snapshots can be accessed. To access Financial Reporting reports, visit the URL http://server:19000/hr/mobile/HRMobileLogon.jsp and enter your credentials; after a successful login, the folder structure for reports can be browsed.

A sample report viewed on Tablet


  1. Refer to the Mobile Certification tab in the Oracle EPM compatibility matrix for compatibility details.
  2. Depending on the deployment, a VPN might be required to connect to the application.

Simplified Interface - Life made simple for Hyperion Planning

·   Can we add Company Logo?

·   Can we change color theme?

·   Our management wants to access EPM application on Tablets, how can we do it?

·   How can we make scheduling jobs easier?

As a Hyperion Planning consultant, these used to be common queries from clients, and for all of the above there was a workaround (not always a simple one). But with a later EPM release, an alternative to the vanilla user interface was made available - the Simplified Interface for Hyperion Planning. Its focus was to enable application access on tablets.

It provided an appealing user interface for administrators to create and manage an application, and for planners to budget, analyze, and review. The icon-based UI makes working with Hyperion Planning much easier. It was also a glimpse of the changing user interface of the Oracle EPM Hyperion suite as it moves from on-premise to cloud.


Simplified interface provides -

·   A better look & feel

·   Options to customize the interface (logo, theme, watermark) and the formatting of data forms.

·   Enhanced performance when navigating forms and entering data, as the Simplified Interface now uses client-side scripting technologies.

·   Flexibility to add Company logo on Home page & option to add watermark background.

·   Better Job scheduling options

·   Dashboards that can include forms, charts, external links, and commentary.

·   Tablet friendly.   


Enable Simplified Interface

The Simplified Interface has to be initialized from Planning administration.

For a new application, three options are available for creating the application -

1. Sample - the Vision application is created with artifacts and data.
2. Simple - a custom application with one plan type that allows MDX member formulas only. There is no provision for business rules, map reporting, copy data, copy version, exchange rates, or currency conversion. This is useful for creating a simple application, which can always be converted to an advanced application while scaling up.
3. Advanced - member formulas and business rules can be created; an advanced application cannot be converted back into a simple one.

After the application is created, a new layout is visible on logging in to the Hyperion Planning application.



The home page provides a bird's-eye view of the application. Click the 'Overview' tab, and you can see the total number of tasks, plans, rules, and approval hierarchies.


Customize Appearance

In earlier versions, changing colors or adding logos required modifying EAR and WAR files and then redeploying Planning. With the Simplified Interface it is much easier: to add a logo, enter an image URL in the Logo URL field; to add a watermark, enter an image URL in the Theme URL field. A few of the available theme options are as follows -


Data Form Formatting

A simple data form gives the flexibility to perform data, ad hoc, and formatting tasks. These three groups are visible on the data form as tabs.

·  Data tasks - cut, copy, paste, clear data, adjust, spread, adding supporting details, and filtering data.

·  Ad hoc tasks - for ad hoc analysis on a form: Zoom In/Out, Keep/Remove, Move, and Pivot.

·  Format tab - font size and color can be customized for the data form.



Customized dashboards can be created to review and analyze data. Forms, charts, URLs, and commentary can be added while creating a dashboard. For charts, properties such as chart type (line, bar, pie, area, scatter, bubble, etc.), legend/label position, and axis properties can be customized. A web URL can be added to the dashboard, and there is a section for adding comments as well. A sample dashboard:


Job Scheduling

Scheduling jobs becomes easier, with the flexibility to schedule the following tasks -

  • Import Metadata
  • Import Data
  • Export Metadata
  • Export Data
  • Refresh Database
  • Plan Type Map

Exporting data gives options to select the target location, file type, dimensions in rows/columns, and the POV. For exporting metadata, a single dimension or all dimensions can be exported to a location of choice, and the job can be saved for scheduling as well. While scheduling jobs, the start date, time zone, recurrence pattern, and end date can be selected.
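The recurrence options above amount to a simple schedule expander. As a rough sketch (the function and parameter names are illustrative, not Planning's own scheduler API):

```python
from datetime import datetime, timedelta

def occurrences(start, end, every_days):
    """Yield run datetimes from start to end (inclusive) on a fixed daily recurrence."""
    run = start
    while run <= end:
        yield run
        run += timedelta(days=every_days)

# A nightly 2 AM job scheduled for one week produces eight runs.
runs = list(occurrences(datetime(2017, 7, 1, 2, 0), datetime(2017, 7, 8, 2, 0), 1))
print(len(runs))  # -> 8
```

Planning additionally interprets the start time in the selected time zone, which a real implementation would handle with zone-aware datetimes.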


Things to keep in mind while using the Simplified Interface -

  • Only classic applications can be created using the Simplified Interface.
  • A few admin features are available only in desktop mode.
  • On tablets, only landscape mode works.
  • The user experience on iPads is better compared to Android tablets.
  • Oracle recommends Safari on iPad and Chrome on Android.
  • The Simplified Interface can be accessed through the following URLs: http(s)://<planning server>:<port>/HyperionPlanning and http(s)://<webserver>:<port>/HyperionPlanning

Continue reading " Simplified Interface - Life made simple for Hyperion Planning " »

July 12, 2017

FDM Rebranding: How to Cope

As we are all aware, ERPi has been rebranded as FDMEE.

The first question prior to migrating from FDM to FDMEE is whether you want to keep Classic FDM in place or move entirely to FDMEE for data loading. FDM (Classic) and FDMEE can be made to run in parallel if the customer's license allows for it, though only in specific releases.

In later releases there is no Classic FDM at all. However, Oracle provides a utility for the conversion.

The supported source releases for performing migration through the utility are 11.1.1.x and 11.1.2.x. Versions that are not supported include ERPI-FDM.

The supported target is a later FDMEE release.

Prior to kicking off the migration, we must confirm that FDM (the source) is a perfectly running environment, ODI Studio is properly configured, and the FDMEE target exists as well:

·        Working source FDM Application Database

·        Working FDMEE target environment

·        ODI studio Installed & Configured


Migrated Artefacts:

1.      Target Application Details

2.      Source System

3.      Import Formats

4.      Locations

5.      Period Mappings, Category Mappings & Logic Groups

6.      Data Load Mappings

7.      Data Load Rule & Workbench Data



Artefacts which are not migrated are as below: 

·        Security definitions at task level and object level

·        Log files

·        Archive files & information

·       Import Files (defined in source)

·        Scripts:

VB scripts are still supported for event and custom scripts if FDMEE is installed on a windows server.  There are a few differences in the VB APIs, but most VB APIs and methods are still available.  If you move your scripts from VB to Jython, you will need to rewrite them in Jython.
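To give a feel for such a rewrite, here is a hedged sketch of a simple VB mapping-style function translated to Jython. The function name, field layout, and mapping rules are invented for illustration; they are not part of the FDMEE API.

```python
# Hypothetical VB original (sketch):
#   Function GetAccount(strField)
#       If Left(strField, 2) = "10" Then
#           GetAccount = "Cash"
#       Else
#           GetAccount = "Other"
#       End If
#   End Function
#
# The same logic rewritten in Jython (runs under Jython 2.x and CPython alike):

def get_account(str_field):
    """Map a raw source account code to a target account (illustrative only)."""
    if str_field.startswith("10"):   # VB: Left(strField, 2) = "10"
        return "Cash"
    return "Other"

print(get_account("101000"))  # -> Cash
print(get_account("550000"))  # -> Other
```

The mechanical part of the translation (string functions, conditionals, return values) is usually straightforward; the real effort lies in replacing VB API calls with their Jython-side equivalents.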

Continue reading " FDM Rebranding : How to cope up " »

June 30, 2017

Oracle BI Cloud Service (BICS) - Future of Oracle Analytical Reporting?

        "How can you get the benefits of an enterprise-class Business Intelligence without the enterprise class costs and infrastructure?" - Oracle's BI Cloud Service is the answer to that question.

          With BICS, Oracle has been continuously striving to bring OBIEE to the cloud along with significant cost savings. On top of that, BICS ensures faster time to market, immediate availability of the latest updates and patches at no extra cost, and mobile readiness without additional work. This comprehensive proposition is what makes BICS a strong candidate for the future of Oracle analytical reporting.

Let's look at some of the aspects of BICS to understand it better. 

1.      Data:

Data, if present on premise, needs to be ported to the Oracle Database Cloud Service for BICS reporting. Oracle provides numerous options to achieve this, such as:

                    i. The Data Sync tool can be used to load data from files, JDBC data sources, OTBI, Oracle Service Cloud, etc. It can be installed locally on Windows or UNIX machines. Data Sync is especially useful for combining data from multiple data sources and scheduling incremental data loads. Supported databases are Oracle, DB2, Microsoft SQL Server, MySQL, Teradata, TimesTen, Greenplum, Hive, Impala, Informix, MongoDB, PostgreSQL, Redshift, Salesforce, and Sybase.

                  ii.  SQL developer can also be used to load data from relational databases.

                iii.   PL/SQL scripts, BICS REST API and Oracle Database Cloud Service API can be used for automating the data load process.
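As a sketch of what such automation can look like, the snippet below assembles a REST call for a table load. The endpoint path and payload fields here are illustrative assumptions, not the documented BICS REST API; only the general pattern (basic auth over HTTPS, JSON body) is the point.

```python
import base64
import json

def build_load_request(host, table, rows, user, password):
    """Assemble URL, headers, and JSON body for a hypothetical table-load call."""
    url = "https://%s/bicsapi/v1/dataload/tables/%s" % (host, table)
    token = base64.b64encode(("%s:%s" % (user, password)).encode()).decode()
    headers = {
        "Authorization": "Basic " + token,   # basic auth over HTTPS
        "Content-Type": "application/json",
    }
    body = json.dumps({"rows": rows})
    return url, headers, body

# Hypothetical service host and table name, for illustration only.
url, headers, body = build_load_request(
    "myservice.analytics.example.com",
    "SALES_FACT",
    [{"ORDER_ID": 1, "AMOUNT": 250.0}],
    "bi_admin", "secret",
)
print(url)
```

In a real load this request would be sent with an HTTP client and scheduled (e.g. via cron), with the row batches produced by the extraction step.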

2.      Data Modelling:

a.      Modeler on Cloud:

Inside BICS, the Modeler tab lets you create the traditional star- and snowflake-schema fact and dimension models and later publish them as a subject area for analyses and dashboards. This keeps BICS completely on the cloud: no installation on the client machine is required for repository data modelling, and everything required for analytical reporting can be achieved using a browser alone.


b.      'Lift and Shift' On Premise Repository(RPD):

On the other hand, Lift and Shift is a feature that enables users to upload an RPD built on premise to BICS, provided there is a DBaaS (Database as a Service) instance available for the RPD to connect to. (Currently, the database included with BICS cannot be used for this.) Also, a sufficiently recent release of the BI Administration Tool should be used, and the RPD must pass a consistency check before being uploaded.

The RPD can then be uploaded to BICS using the 'Replace Data Model' option. During the upload, you can choose whether the RPD should use the database included with BICS or the DBaaS instance used initially for configuring the RPD.

Lift & Shift is especially useful in case of migrating from on premise Oracle BI Apps to BICS.


3.      Dashboards and Reports:

BICS dashboards and analyses are very similar, if not identical, to those of OBIEE. This ensures familiarity and easy adaptability to the self-service cloud for most BI users. In line with Oracle's cloud-first policy, newer visualizations, features, bug fixes, etc., are made available in BICS on a continuous basis without additional cost or effort.


·         Lift & Shift of WebCatalog

Recently, Oracle has enabled Lift and Shift of the OBIEE web catalog too. This is a simple process provided a supported OBIEE release is available. Using the on-premise Catalog Manager, any '/shared' folder is renamed to '/company_shared' and then archived. Inside BICS, the archived file is imported into a catalog folder and then unarchived.

Application roles can also be lifted and shifted from on-premise OBIEE 12c. Additionally, Oracle BI Apps requires connectivity to OLTP systems (e.g. Siebel, EBS) for variables and initialization blocks; this is achieved using the Remote Data Connector (RDC).


Fig 1. Oracle BI on the Cloud

4.      Security and Administration

BICS security is completely managed within Oracle without any third-party interfaces and ensures a high standard of fine-grained data security. Dashboards and analyses can be easily shared by users, yet recipients will see only what is accessible based on their application roles and access levels.

Since the physical infrastructure-related admin activities are taken care of by Oracle itself in the cloud, BICS administration is a lot simpler compared to on premise, making it maintenance friendly.


5.      Snapshot Archival:

BICS can be backed up by simply creating a single snapshot file combining the latest repository, catalog content, and application roles. Snapshots can be migrated across different BICS environments (Oracle provides one pre-production environment for development/testing along with the production environment). Caution is required, since all existing content is overwritten while restoring from a snapshot: once restored, previously used content cannot be recovered unless it was backed up in a snapshot. Also, all active user sessions are terminated during a restore, so it should preferably be performed during a scheduled maintenance window.

Snapshots are also useful in case of a temporary termination of BICS: the snapshot file provided by Oracle can be used to restore BICS when renewing the subscription later on.



6.      Why BICS?

Some of the Key reasons to consider moving to Cloud BI are,

a.      Lowest Total Cost of Ownership due to,

-          Predictable, Cost Effective Licensing.

-          No additional upgrade/patching costs.

-          Lesser Support and Maintenance required.

b.      If planned well, implementation cost incurred in migrating from on premise to Cloud BI can be recovered in the first year itself.

c.       Faster time to market.

d.      Mobile ready without additional configuration effort.

e.      No additional training required for users familiar with OBIEE Reporting.

f.        Quick Roll out of new charts and features, made available at no additional cost. 

g.      Flexibility to archive and renew subscription later as per requirement.

h.      Lift and Shift can be used to migrate Oracle BI Apps to BICS. Also, in specific scenarios, Lift and Shift can be used to ensure both On-Premise and Cloud BI are in Sync.

i.        Users require only a browser (web client) to access Self Service BI Reporting on the Cloud.

 -  Subramanian S.

UI less BI


Google Assistant, Amazon Alexa, and Apple Siri have all become household names, with built-in artificial intelligence and huge potential in the home automation segment. These seemingly elegant devices were wild ideas appearing only in sci-fi movies a few years back. Remember the charming C-3PO and his wise companion R2-D2 from Star Wars, trotting around the universe, interpreting and even mimicking human conversation to provide assistance - are we there yet? I'd like to think we almost are, with new features and capabilities being added every day to automate through simulated conversation.

Imagine the same concept being extended to enterprises through GENIE, an intelligent BI assistant (just like the genie of Aladdin, who works wonders on the command of his master). Wouldn't it be wonderful if you, the Director of Sales (let's dream big :)), had no need to log in to the BI system and run your reports (on laptop, smartphone, or tablet)? Instead, while you are pouring that hot cup of coffee, BI GENIE, coupled with AI and voice systems, takes care of all of this on your behalf.

Hello GENIE! What was yesterday's total sales?

            GENIE responds by providing yesterday's total sales figure

OK GENIE! Give me the variance between the Forecast and actuals Sales realized?

            GENIE responds with the details and provides the details of regions/categories where the variance is beyond the threshold

OK GENIE! Send alert with detailed report to the Zonal Managers where the variance is >25%.

            GENIE sends the alert and provides confirmation
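The conversation above implies an intent-routing layer between the voice front end and the BI system. A minimal sketch (the keyword matching, handler names, and sales figures are all invented for illustration; a real GENIE would use NLP-based intent detection and live BI queries):

```python
import re

# Toy data standing in for live BI query results.
SALES = {"yesterday_total": 1.2e6, "forecast": 1.5e6, "actual": 1.2e6}

def handle(query):
    """Route a natural-language query to a toy BI handler by keyword matching."""
    q = query.lower()
    if "variance" in q:
        variance = SALES["forecast"] - SALES["actual"]
        return "Variance between forecast and actuals: %.0f" % variance
    if re.search(r"yesterday.*sales", q):
        return "Yesterday's total sales: %.0f" % SALES["yesterday_total"]
    if "alert" in q:
        return "Alert with detailed report sent to the zonal managers."
    return "Sorry, I did not understand that."

print(handle("Hello GENIE! What was yesterday's total sales?"))
```

Swapping the keyword checks for a trained intent classifier, and the toy dictionary for real BI queries, is exactly where the NLP and integration challenges discussed below come in.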

An intelligent Assistant like GENIE will provide simplicity in accessing BI reports, especially in a fast-paced world today where we are always looking for answers as quickly as possible. Gone are the days of having to wait for the 8 AM scheduled reports. With data being available at real-time/near real-time wouldn't it be fascinating to have the means to interpret this data as soon as it is available and conveniently as well through GENIE?

These are some tidbits on how BI GENIE could be embedded in our daily lives to keep us at ease and act as a huge effort and time saver for seemingly mundane tasks. The time and effort saved could be channeled into far more productive activities.

Of course, building these systems is not without challenges, especially when it comes to security, integration with voice recognition systems, and making GENIE more context-aware using Natural Language Processing. Nevertheless, this is an effort worth making, as the world is gradually leaning towards "UI-less", "touch-less", or "conversational" systems.

Welcome to the future of BI! Stay tuned for more updates and progress on GENIE!

Let me know what your thoughts are and provide your valuable feedback/suggestions/comments

Continue reading " UI less BI " »

Gain Insight into your Spend through packaged Machine Learning


Organisations today face tremendous pressure to manage their direct and indirect spend efficiently to improve their competitive advantage. Spending without adhering to corporate-wide negotiated contracts, overspending in some categories, and lack of quality and delays in lead time affect overall profitability along with the quality of products and services.

The main challenge in designing a spend strategy is having adequate visibility across an organisation's various spends. Many global organisations use multiple packaged ERPs and legacy systems to record purchasing data in different subsidiaries and geographies. This often leads to spend categorization in multiple standards, and sometimes no standard at all, ultimately making global reporting and analysis difficult.

Conventionally, organisations either ship their spend data to a service provider or maintain an in-house analyst team to classify and report as per the agreed global taxonomy. In most cases these teams carry out manual classification using cleansing tools; this is an expensive and time-consuming process that hampers timely decision making.

Oracle Spend Classification is a bolt-on module which works seamlessly with OBIA (Oracle Business Intelligence Applications) Procurement & Spend Analytics, and also with Oracle EBS iProcurement, enabling efficient and dynamic spend classification by leveraging a pre-configured machine learning algorithm.

In the sections below, I explain the product, its implementation approach in a business context, and how it adds value to an organisation. I will not dwell on the technical architecture, the UI of the product, or the construction of the algorithms.

How does machine learning based decision making work?

I will explain the concept briefly in lay terms, as it is needed to follow the rest of this article. Machine learning, from a data analytics perspective, can be defined as an advanced branch of statistics and computer science for building algorithms that predict outcomes or make decisions based on knowledge built from the hidden patterns of historic or sample data. A key strength of machine learning is that it does not require explicit programming to interpret every aspect of the input data for decision making.

There are many proven machine learning algorithms relevant to data science; the key ones are Support Vector Machines, Naive Bayes, and Generalized Linear Models.

Spend category classification using machine learning

Spend categories recorded in various source systems are often not in a standard format; they might be designed just to meet local ledger requirements, or many products and expenses may be clubbed under a miscellaneous bucket. So it becomes difficult for a global reporting team to interpret and derive the correct spend category for thousands of transactions based on the product name, product description, or transaction description without consulting the geography or subsidiary procurement team. No amount of structured programming works, since product names and descriptions might be in multiple languages or code formats, and the nature of the input data steadily changes. This makes detailed global spend reporting as per a standard taxonomy difficult.

The Spend Classification module helps populate the standard global spend category on purchase order, purchase requisition, and invoice transaction lines extracted from the various source systems. The product creates a single text pattern by concatenating various dimensional attributes of a transaction, such as Transaction Description, Item Code, Item Description, Supplier Name, Supplier Site, Operating Unit, UOM, Currency, and Cost Center. Based on correctly classified sample/training-set patterns representing all kinds of transactions, the algorithm builds a knowledge base.

The knowledge acquired through the training set is then used by the algorithm to populate the standard spend category field in the regular inflow of new purchase order and invoice transactions. If classification accuracy drops over time due to changes in the nature of the input data, this can be handled by improving the training set and enhancing the knowledge base.
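The mechanics can be illustrated with a tiny, self-contained Naive Bayes text classifier over concatenated attribute patterns. The training rows and categories here are invented, and the real product uses a hierarchical Support Vector Machine over far richer patterns; this sketch only shows the "learn from labelled patterns, then classify new ones" idea.

```python
import math
from collections import Counter, defaultdict

def tokenize(pattern):
    return pattern.lower().split()

class NaiveBayes(object):
    """Multinomial Naive Bayes with add-one smoothing over text patterns."""

    def fit(self, samples):  # samples: list of (text_pattern, category)
        self.cat_counts = Counter(cat for _, cat in samples)
        self.word_counts = defaultdict(Counter)
        self.vocab = set()
        for text, cat in samples:
            for w in tokenize(text):
                self.word_counts[cat][w] += 1
                self.vocab.add(w)
        self.total = len(samples)
        return self

    def predict(self, text):
        best, best_lp = None, float("-inf")
        for cat, n in self.cat_counts.items():
            lp = math.log(n / float(self.total))          # class prior
            denom = sum(self.word_counts[cat].values()) + len(self.vocab)
            for w in tokenize(text):                       # smoothed likelihoods
                lp += math.log((self.word_counts[cat][w] + 1.0) / denom)
            if lp > best_lp:
                best, best_lp = cat, lp
        return best

# Training set: concatenated transaction attributes -> global category (invented)
train = [
    ("laptop dell 14in it-dept usd", "IT Hardware"),
    ("toner cartridge printer office usd", "Office Supplies"),
    ("dell monitor 24in it-dept usd", "IT Hardware"),
    ("a4 paper ream office usd", "Office Supplies"),
]
model = NaiveBayes().fit(train)
print(model.predict("dell laptop finance-dept usd"))  # -> IT Hardware
```

Note how the new pattern is classified correctly even though "finance-dept" never appeared in training: the known tokens carry the decision, which is the property that lets such models cope with steadily changing input data.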

Implementation approach for Spend Classification

Spend Classification_new.jpg


  • Define the standard Spend Category Taxonomy to be used for global reporting for the products/services /expenses that's being purchased/expensed across the regions/Operating Units/Locations.
  • Create a Standard Template for extracting the Business Critical Spend and Procurement data for Non Oracle and Legacy Source systems. This template is used for extracting data into data warehouse using OBIA universal adapter.


  • Build the Spend data warehouse by customizing Out of the Box Procurement and Spend analytics data model.
  • Develop required ETL mappings by customizing OOB OBIA Procurement & Spend Analytics (ODI/Informatica) adapters for Oracle EBS, JD Edwards, SAP etc. and leverage Universal Adaptors for legacy/other source systems.
  • Configure the required standard taxonomy in the Spend Classification Module.


  • Manually cleanse and classify the consolidated purchase order and payment invoice data to create a sample training set covering all possible data patterns from the various source systems, geographies, subsidiaries, categories, etc.
  • Schedule the regular ETL and build the knowledge base from the training set, using the standard hierarchical Support Vector Machine algorithm or a customized advanced knowledge base built with other algorithms.


  • Monitor the quality of classification output with new/changing data and accordingly add new training set to enhance the Knowledge Base.
  • Build Reports/Dashboards using Standard Taxonomy for global reporting to provide Spend Insights.

Product at a glance 

To explain the product in simple terms, it has got below components,

1. User Interface: Spend Classification's UI appears as a new module in OBIA Procurement & Spend Analytics, with the configuration tabs/screens below:

  • Data Set: users can group AP invoice, purchase order, and purchase requisition data sets imported from source systems, based on various filtering criteria, for carrying out classification actions. One can create or enhance a knowledge base, or classify a data set using an already created knowledge base. Users can also reset a classified, unapproved data set to clear the auto-populated categories. Another major function of this screen is that users can download a data set (or a template of one), create a training data set with correct manual classifications, and upload it back for processing.
  • Knowledge Base: here the user can see the list of knowledge bases, search for a knowledge base, delete one, and monitor the status of knowledge base creation. One can see whether a knowledge base creation activity is successful, still running, or has failed.
  • Classification Batches: in this tab the user can view the list of classification batches, search for a batch, and monitor the status of classification activity. Once the batches are classified, they can be approved so the results populate OBIA Procurement & Spend Analytics. Batches can be reset and reclassified.
  • Classification Summary: It shows the status and details of classification process with KPIs in pivot and graphical format helping in making the approval decision. 
  • Classification Detail: the user can view, search, and export the list of classified records and compare the generated vs. source-system categories. The product also enables manual correction of the auto-classified codes.

2. OOB API and Oracle Data Mining Engine: Spend Classification leverages Oracle Data Mining for creating the knowledge base models. It comes with an out-of-the-box API/program based on a hierarchical Support Vector Machine algorithm, which builds the standard knowledge base. If there is a need to implement other algorithms, such as linear Support Vector Machine or Naive Bayes, it must be done through Oracle Data Miner.

The product offers two standard taxonomies (EBS and UNSPSC category codes) and three custom taxonomies that can be configured for global reporting. This means organisations can consolidate and report spend data using up to five standards to meet their reporting and analytics requirements.

3. Integration with OBIA Procurement & Spend Analytics and iProcurement: the product comes with OOB integrations with OBIA Procurement & Spend Analytics for populating AP invoices, purchase orders, and purchase requisitions. Once the fact lines are classified and the batches approved, the generated category code is automatically pushed to the OBIA fact tables. The product also comes with an OOB integration for Oracle EBS iProcurement, which populates standardized product categories while purchase requisitions and purchase orders are being raised.


We know that machine learning based predictive analytics helps in faster and more efficient decision making, but it comes with the effort of developing and implementing algorithms and integrating them with source and target data systems. Packaged machine learning modules such as Spend Classification bring tremendous benefit in terms of expedited implementation, with out-of-the-box integration to ETL and Business Intelligence, and even inline validation in transactional systems (iProcurement).

Insights from Spend Classification benefit organizations through supplier rationalization, contract leakage reduction, cost savings, and improved lead times, ultimately driving profitability, quality, and overall competitive advantage.

- Mahesh Kulkarni, BI Consultant 

June 23, 2017

Exploiting Hyperion for Scheduling Jobs !!

Problem Statement:

Have you ever needed to call batch files to update an EPM system, but had no access to the server or environment?

The Windows Task Scheduler is disabled in the Hyperion environment. Application backups are being manually triggered through batch scripts, and there is a requirement to have them automatically scheduled for a particular time during the night, when no business users are accessing the application for development. The solution is applicable to the affected Hyperion version and later releases.


Solution Provided:

We conducted a successful POC of multiple application backups through batch scripts scheduled in both Hyperion Workspace and ODI.


For Workspace:

1.      Create a generic job application in Workspace: enter the required parameters and click OK.

2.      Make sure the MIME type for batch files (.bat) is defined in Hyperion Workspace. If it is missing, add it manually to the existing list of MIME types.

3.      Now import the batch script through the "Explore" menu of Workspace as a generic job, and select the created generic application in the dropdown.

Scheduling Process in Workspace:

1.      We can now test the batch file execution by right-clicking the job name and selecting "Run as Job". Provide a path for the output and click "Run".

2.      If successful, we can add a schedule to it, again by right-clicking the job name. Define the execution parameters and click "Finish".


For ODI:

1.      Create an ODI Package using Package Editor. Add an OS Command tool from the list of integrator tools available.

2.      Specify the location of batch script in the OS Command defined for the package.

Scheduling Process in ODI:

1.      Add a scenario to the package created for the backup.

2.      Under the scenario, right click on "Scheduling" and add a schedule.

3.      Define all the scheduling parameters, the logical agent and log level.

4.      To apply scheduling, we need to update schedules for the logical agent defined for the package.

Continue reading " Exploiting Hyperion for Scheduling Jobs !! " »

February 23, 2017

Leap in faith - EPM cloud is here to stay !

I recall my first brush with cloud technologies in the EPM space sometime back in 2008. At that time, any latent exposure of sensitive financial and strategic information on the cloud would nip even the thought of moving away from on-premise services. Besides, analysts remained in unison and maintained the view that it was really difficult to do real EPM in the cloud, because one would probably be taking feeds from lots of different databases and aggregating data to turn it into one version of the truth, and a SaaS system is not a candidate for that.

Fast forward to the present, and the scales have tilted. Concerns about data security are waning, and organizations are appreciating the benefits of adopting cloud technologies. This has opened the floodgates for platform consolidation, and that factor alone is turning out to be a significant driver for graduating to the cloud.

However, as the noose tightens on security concerns, technology and process challenges are coming to the forefront. The more diverse the existing EPM and ERP landscape, the greater the challenge of transitioning to a complete cloud model. Amidst this visible tsunami, one aspect is evident: as the cloud ecosystem penetrates, co-existence with on-premise systems is here to prevail. No wonder the initial thrust in EPM cloud, dominated by SaaS models, is now leading into IaaS.

Furthermore, cloud adopters remain a mixed group, ranging from those making a big-bang transition to those experimenting with the cloud in distinct pockets. While the common factors driving the decision are the realization of quick wins, flexibility of usage, and economic advantage, the solution approaches differ widely.

Key ingredients that players need to stay on top of in their EPM cloud journey are:

1. Process re-engineering - moving FP&A or any other EPM process to the cloud invariably changes the way things are done. Quite often there is an urge to intertwine process improvements with such initiatives, and this ambitious desire frequently turns counter-productive. The mantra for success is to keep it simple and avoid mixing objectives.

2. Lift and shift will prevail, especially during the transitory phase. It is therefore imperative to decide judiciously which key processes to carve out, lest the move inject process inefficiencies. Data migration is an equally important factor to consider.

3. Master data harmonization - metadata holds the key to defining any EPM application, and this is even more relevant in a hybrid ecosystem. Gaining a good hold on this element sets the path for a smooth migration to integrated information delivery. Other considerations include the fulfilment of cross-functional requirements.

4. Data integration strategy - in a hybrid ecosystem, transaction systems often remain on different platforms, and the transition to the cloud makes EPM integration a moving target. Sound data source definitions and a robust integration strategy carry the biggest weight.

5. Reporting - the past few years have witnessed a giant leap of faith; to complement this brewing revolution, last-mile reporting capabilities need further augmentation, bringing synergies across internal and external reporting.

There is no silver bullet or panacea for success in the cloud world. All in all, there is ground to be covered, and learnings to be institutionalized across the board, before we witness a mature and seamless cloud ecosystem.


January 16, 2017

Why Hyperion Financial Consolidation and Close Cloud Service (FCCS) may have a tough road to success

Anyone who has attended one of Oracle's sales sessions on Planning and Budgeting Cloud Service (PBCS) in the last two years would have heard the sales representative say, "PBCS is selling like hot cakes." Well... yes, it was, and it has continued its successful journey. Its high adoption sowed the seeds for new Hyperion cloud offerings in 2016, one of which is Financial Consolidation and Close Cloud Service (FCCS).


FCCS is not HFM on cloud

Initially, many people thought the offering was Hyperion Financial Management (HFM) on the cloud. Let's clear the air: FCCS is not HFM on the cloud. It is an Oracle cloud offering that supports the financial consolidation and close process with a completely different database strategy from HFM's. Yes, you guessed it right: FCCS does not sit on a relational database but on the same database platform as PBCS. It has 11 preconfigured dimensions and two custom dimensions. It provides a cloud-based offering for consolidation and close reporting, with highly standardized out-of-the-box features built on the years of experience and best practices Oracle has accumulated with its premier tool, HFM. Above all, it requires minimal administration and is very cost-effective compared to on-premise HFM. Still, it may face a bumpier ride and slower user adoption than PBCS did. Let's see why!


Constraints to adopt FCCS: Look and feel is different

In interactions with a few of our clients who already use HFM and are planning to upgrade, or who use other Hyperion modules but are exploring a specialized consolidation tool, we observed hesitancy to adopt FCCS. One key reason was that the CFO's office was still reluctant to move actuals to the cloud. Some clients who were inclined toward FCCS moved away after seeking opinions from Hyperion managers and directors at other organizations. The different look and feel compared to HFM, and the fact that the database is Essbase rather than relational, could be other factors, though to be fair to Oracle, it is deliberately pursuing the same database strategy for both FCCS and PBCS. But we all know that the close and consolidation group and the financial planning and analysis (FP&A) group usually hold different points of view.


Less flexibility with the chart of accounts may also be a deterrent. Users need to be aware that standard hierarchies and high-level account members in key dimensions such as Entity and Account allow for standardized reporting, calculations, and inbuilt financial intelligence, which ultimately helps deliver faster implementations. Oracle understands that PBCS was a different ball game from FCCS, as it was primarily for plan data: clients could try PBCS for different lines of business, or design it so that it held no actuals data, and hence they were ready to play the waiting game.


Leverage Infosys Hyperion competency

Infosys fully supports Oracle's cloud strategy and Oracle Hyperion's cloud and on-premise offerings. So, if your firm has any questions related to the Oracle Hyperion suite or enterprise performance management (EPM) strategy, please contact us at

December 2, 2016

Trim your Financial Close Cycle with Oracle EPM Solutions