Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.


July 17, 2015

Critical Success Factors for a Hyperion EPM implementation program



Clients embarking on a Hyperion EPM implementation program commonly, and rightly, ask which factors will have the greatest impact on the outcome. In other words, which areas should be addressed diligently throughout the program to ensure success? While no two programs are the same, and the influencing factors can vary widely with a client's needs and current state, some common areas tend to play a significant role in the success of a Hyperion EPM program. Let us take a closer look at these areas.

Executive Commitment: Executive commitment is crucial for the successful execution and delivery of a Hyperion EPM program. There should be strong commitment from Finance as well as IT leadership, who should ensure the availability of the right resources from the business and IT teams for the project. Business SMEs committed to the project must have a very good understanding of the current processes and future business requirements. Executives should stay closely involved with the progress of the project and should extend full support in case of internal or external roadblocks. A PMO and a steering committee with representation from the various teams are highly recommended to ensure regular tracking of progress and timely intervention when issues persist.

Organization Change Management, User Training: Hyperion implementation projects can range from a major business transformation to a purely technical change (e.g. a version upgrade) or anything in between. Depending on the type and magnitude of the change, a well-thought-out user engagement and training strategy is required to bring users on board. Users' readiness to adopt the change plays a major role in how the system is received when it is rolled out.

The training plan should span functional and technical areas and should provide for on-demand, continuous training, e.g. online material, recorded sessions, etc. In case of major changes, it is recommended that the execution approach allow for continuous user involvement throughout the project cycle via planned conference room pilots, demonstrations, etc.

An OCM team should be set up to drive overall user communication and training activities as the implementation progresses.

Hyperion Infrastructure Stability: Hyperion EPM infrastructure tends to be one of the most complex pieces of the overall solution because of the variety of technical components involved across the product suite. It requires skilled resources to install, configure, and support the EPM infrastructure in an optimized manner. It is very common to encounter issues related to product features or environment configuration during the implementation. It is recommended to maintain proper staffing levels for infrastructure consultants throughout the project, and also to have a well-defined support structure with Oracle for timely resolution of issues.

Other deployment options are now available for Oracle EPM, such as "Oracle EPM in Cloud" and "Hyperion on demand", which alleviate many of these issues with on-premise EPM infrastructure. They should be given serious thought and evaluated in light of the business requirements, as they can turn out to be a cost-effective alternative for the given business needs.

Out of the box vs. Customization: The Hyperion EPM product suite offers rich out-of-the-box features to configure specific financial functionality, e.g. eliminations, currency translations, process management, top-down planning, drill-through, etc. There are enough out-of-the-box features within each product to configure a plain-vanilla solution that addresses the core business requirements without heavy customization.

However, the suite also provides various options to extend functionality or build new functionality through custom programs using APIs, scripts, etc. To build a low-maintenance, scalable, and simple solution, the focus should be on maximum use of out-of-the-box features and minimal customization. Every customization should be carefully evaluated for its potential business benefit versus its long-term impact on maintenance and upgrades. This is also important because each new EPM version introduces changes that deprecate and de-support some old features, and any customization based on those features may lead to significant rework in the future. In general, an overly customized solution tends to grow in complexity over time with regular changes and becomes very difficult and expensive to maintain.

Master Data Management Strategy: This used to be an afterthought until a few years ago; however, given the business value it has delivered over the last few years, master data management is now generally at the center of an implementation program, and rightly so, as master data is the lifeline of an EPM solution. EPM solutions drive their business functionality from the application master data, i.e. hierarchies, properties, etc., and master data management plays a crucial role in determining how the various EPM applications will function together and how business changes will be applied uniformly across applications. The scope of master data management driven solutions is expanding even beyond EPM as organizations realize the broader business benefits. The what and how of master data management for an EPM system would require another article; suffice it to say that each and every program, whatever its scale, should have a strategy for how application master data will be designed, integrated, maintained, and managed in order to build a scalable and agile solution.

Data Integration Strategy: Data integration is a very important component of the overall Hyperion EPM solution, as it drives the data quality and data timeliness of the solution. A complex and heavily customized integration layer may result in a maintenance-intensive solution. It is important to choose the right set of products and the right architecture for the integration layer of an EPM solution. Considering that multiple options are available for Hyperion integration, e.g. FDMEE, ODI, native methods, etc., a careful evaluation should be done based on the requirements and the client's current context.

As stated earlier, each EPM program is different, and these factors will have varying influence on its outcome. However, if these aspects are carefully evaluated and managed right from the beginning, the chances of a successful outcome increase manifold.

July 8, 2015

Data flows between Hyperion applications using FDMEE


In a scenario where a client has implemented the full Hyperion suite of products, viz. Financial Management, Planning, and Reporting, data movement between these applications becomes a necessity. Budget and forecast data is often required to flow from one application to another. Earlier this was achieved using the Data Synchronization tool that comes along with the EPM suite. However, this tool has very limited features, and you do not have the luxury of defining mappings between the applications beyond straightforward one-to-one explicit mappings. Though the latest version of FDMEE offers this data synchronization capability out of the box, here's how I implemented it in an FDMEE version which had no direct feature to support this requirement.

The following types of data flows can be implemented by the methods described for each of them:

1. Planning to Planning: This is the simplest data flow and does not require any FDMEE intervention if there are no specific mappings (which is usually the case). Here data is extracted from the source Planning application using a calc script and loaded into the target application using a load rule - both Essbase features.


2. Planning to Financial Management: In this case, follow the steps below:

a. Create a calc script to extract data from the source Planning application.

b. Create a Windows batch file that calls this script. The period and scenario can be passed as parameters to this batch from FDMEE so that data is extracted for the POV selected by the user in FDMEE.

c. In FDMEE, create an import format matching the format of the file generated by the calc script.

d. Create the location and mappings per the requirement.

e. Create a data load rule with two fixed parameters - the source file name and the path. The file name will be the same as the name of the file generated by the calc script. The path will be a shared location on the FDMEE server where the calc script generates this file.

f. In the BefImport event of FDMEE, write a script to call the batch file created in step 'b'.

g. When the load rule is run, the BefImport event is triggered, which calls the batch file, which in turn runs the calc script to generate the source file with the fixed name at the fixed path.

h. Once the file is generated, the import step imports it, then validates and exports the data to the target application.

To the end user executing the interface, it appears as a regular data load with the three steps of Import, Validate, and Export; the BefImport event takes care of making the source file available for the interface in the background.
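The BefImport wiring in step 'f' can be sketched as below. This is a minimal, hypothetical outline, not the actual script from the project: the batch file name and shared path are placeholders, and in a real FDMEE Jython event script the period and scenario would come from the FDMEE context rather than being hard-coded.

```python
import subprocess

# Assumed location of the batch file from step 'b' (placeholder path)
EXTRACT_BATCH = r"\\fdmee-server\scripts\extract_plan_data.bat"

def build_command(batch, period, scenario):
    """Assemble the command line so the calc script extracts the POV
    selected by the user in FDMEE."""
    return [batch, period, scenario]

# In the real BefImport event script, the POV would be read from the
# FDMEE context and the batch invoked, e.g.:
#   cmd = build_command(EXTRACT_BATCH, fdmContext["PERIODNAME"], fdmContext["CATNAME"])
#   subprocess.call(cmd)
cmd = build_command(EXTRACT_BATCH, "Jan-15", "Forecast")
print(cmd)
```

The key design point is that the load rule's fixed file name and path must match what the calc script produces, so the Import step that runs immediately after the event finds the file in place.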

3. Financial Management to Planning: This is a two-step process for the end user, since there is no direct way of extracting data from HFM through FDMEE.

a. In HFM, create a task flow to extract data from the HFM application into a flat file.

b. Provide a fixed name and path for the file extracted through the task flow. The path has to be a shared folder on the FDMEE server.

c. Similar to the above design, create an import format in FDMEE matching the file format generated by the HFM task flow.

d. Create the location and mappings.

e. Create a data load rule with two fixed parameters - the source file name and the path - the same as those provided in the task flow.

f. When the task flow is executed, it extracts data into the source file at the saved path.

g. When this is done and the FDMEE Import step is executed, FDMEE imports data from the HFM task flow file, since the details are already saved in the data load rule.

h. Thereafter, the regular FDMEE steps of Validate and Export load the data into the target application.

One catch here is that the file generated by the HFM task flow is encoded as UTF-16 instead of the standard UTF-8 that FDMEE supports. Hence, before executing this interface, the user has to change the file character set under "User Settings" to UTF-16 and then run the Import.
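An alternative to switching the "User Settings" character set each time is to re-encode the extract before import. Below is a minimal sketch of such a conversion; the file names are fabricated for the demo, and in practice the source path would be the fixed task flow extract location on the FDMEE server.

```python
import codecs
import os
import tempfile

def convert_utf16_to_utf8(src_path, dst_path):
    # Read the UTF-16 extract (codecs handles the byte-order mark)
    with codecs.open(src_path, "r", encoding="utf-16") as src:
        text = src.read()
    # Write it back out as UTF-8 for FDMEE to import
    with codecs.open(dst_path, "w", encoding="utf-8") as dst:
        dst.write(text)

# Demo round trip with a fabricated two-line extract
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "hfm_extract.txt")
dst = os.path.join(tmp, "hfm_extract_utf8.txt")
with codecs.open(src, "w", encoding="utf-16") as f:
    f.write(u"Entity;Account;Amount\nE100;Sales;1000\n")
convert_utf16_to_utf8(src, dst)
with codecs.open(dst, "r", encoding="utf-8") as f:
    print(f.read().splitlines()[0])
```

Such a step could be hooked into the BefImport event so the end user never has to touch the encoding setting, though that automation was not part of the original design described here.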

Using a combination of these three methods, data can be loaded between the Hyperion applications as required.
