Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.

« June 2017 | Main | August 2017 »

July 31, 2017

Strategic HR Organisation: The Way Forward

Further to what we saw in the previous posts (post1 and post2), the HR organization should be involved from the strategy planning phase onwards. It should:

  • identify the KPIs that add value to the business, agreed upon by key stakeholders, to be measured;
  • have the freedom to modify HR policies in line with market needs and growing business needs;
  • work on digital dexterity innovations and track the returns from such initiatives;
  • define the governance model and identify the HR scorecard the organization wants to focus on;
  • strategize and improve the analytics and the quality of the data fed in;
  • work towards reducing operational effort and identify truly value-adding measures as the focus for the HR organization;
  • periodically review the HR strategy and align it with market and business needs.

Doing all this will surely lead the HR organization to be recognized as a key contributor and a profit-adding unit critical to the enterprise.

 

 


 

 

HCM ERP product giants like Oracle and SAP are working towards integrating the analytics and digital capabilities of their products in line with market trends and growing business demands. It is a must for the product developers to integrate options to plan HR strategy within the Human Capital Management product, measure its effectiveness with new KPI calculations, help generate useful balanced scorecards and HR scorecards, align workforce goals to profitability and help measure them effectively, proactively identify competency and skill gaps as the markets evolve and help HR organizations adjust to dynamic needs, and help automate HR operations proactively to bridge the gap with end-customer needs.

               

Many technology-oriented enterprises like Google, Microsoft, and Apple are focusing on the front lines of HR innovation, largely because of the growing need for specialized talent in technology innovation. They have rightly identified human capital as their key asset: the right talent is in short supply, while competitors are eager to lure talented high performers away. Even financial services and corporate audit firms are innovating and banking on HR functions' expertise to predict and prevent unethical behavior so that customer interests are safeguarded. Innovative strategies to improve human capital assets are the way forward for any enterprise embarking on a growth journey.


 

July 27, 2017

EPBCS Family Welcomes Strategic Modelling and many more!

Pink... Pink... & Pink... It is a girl!! Welcome the new baby into the EPBCS family - Strategic Modelling!

Date of birth - June 16, 2017

June was a month of festivity, happiness and fun for the EPBCS family! Indeed! From the longest list of issue and bug fixes up to the brand-new Strategic Modelling framework, EPBCS is growing, grooming and gaining confidence. What first caught my eye: it was always "Lite", "Standard" and "Enterprise", and you could choose your application type as one of the three and upgrade later. Now I see "Reporting", "Standard" and "Enterprise". I am yet to try whether Reporting is a direct replacement for Lite and whether it can later be turned into Enterprise (I am assuming it can), but I will try it out and confirm, to let it talk more... :)

Let's look at the variety of items that came in with this June patch, under different categories:

1. New features - Strategic Modelling framework, the cloud version of Hyperion Strategic Finance.

2. Enhanced features -

a. Business Rules get Groovy - a win-win for developers. Business rules used to have limitations around the types of operation they could house; long gone is the custom that business rules are meant only for standard aggregation or custom calculation. The style Groovy brings is to give the user more leeway and a stretched comfort zone. Lovers of Groovy, here is your most exotic dish: business rules can now do things like letting the user submit data only within a range, or whatever sanity check you want to perform, and it all depends on how good a Groovy programmer you are!
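
In spirit, such a rule is just a guard that rejects out-of-range input before it reaches the cube. A minimal sketch of that check, written here in Python for illustration (a real EPBCS rule would express the same logic in Groovy against the form's grid; the bounds below are hypothetical):

```python
def validate_submission(value, lower, upper):
    """Reject a submitted cell value outside the allowed range,
    mimicking the kind of sanity check a Groovy business rule can enforce."""
    if not (lower <= value <= upper):
        raise ValueError(f"Value {value} outside allowed range [{lower}, {upper}]")
    return value

# Accept in-range data; reject out-of-range data before it reaches the cube.
validate_submission(500, 0, 1000)       # passes silently
try:
    validate_submission(-10, 0, 1000)   # rejected with an error message
except ValueError as e:
    print(e)
```

The same pattern generalizes to any per-cell sanity check: the rule inspects the submitted value and either lets the save proceed or surfaces a message to the planner.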

b. Charts got upskilled with the new "Combination Charts". Standard bar charts are old, not gold! Now you can turn row "a" into bars and row "b" into lines, so you can compare two data series over a period of time - like the average sales of Store A and Store B over two years in the same chart! :)

3. Recuperated features -

a. "Enable Features" for all 5 frameworks will not hurt you anymore while you are doing admin work. If you are enabling new features, editing features, or even letting go of already-enabled features, EPBCS will now check that the last admin activity on the application has indeed completed and the database has been refreshed! Can't thank them enough: there are never going to be any bummer surprises, and no more missing-feature cases filed!!

b. Another fix was to the "Data Push" usage. When we intended to push data to the reporting cube, it was like reaching around your head to touch your nose: you go to the data clear option, instruct a clear, then get back to data movement and instruct a push. With the June patch, it's no longer a 2-step process. You click the push data command and it is sensible enough to first clear and then move the data from the source cube to the reporting cube! And yes, of course, a small pop-up appears to check that you are doing what you really intend to do!
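
The new one-step behaviour can be pictured as clear-then-copy. Here is a toy sketch with cubes modelled as plain dictionaries (the intersections and values are hypothetical, not the actual EPBCS internals):

```python
def push_data(source_cube, reporting_cube, region):
    """One-step data push: clear the target region first, then copy
    matching intersections from the source cube (the June-patch behaviour)."""
    # Step 1: clear any stale data in the target region of the reporting cube
    for key in [k for k in reporting_cube if k in region]:
        del reporting_cube[key]
    # Step 2: move data for the region from the source cube to the reporting cube
    for key in region:
        if key in source_cube:
            reporting_cube[key] = source_cube[key]
    return reporting_cube

source = {("FY17", "Jan"): 100, ("FY17", "Feb"): 120}
reporting = {("FY17", "Jan"): 999}   # stale value that used to need a manual clear
push_data(source, reporting, {("FY17", "Jan"), ("FY17", "Feb")})
# reporting now holds {('FY17', 'Jan'): 100, ('FY17', 'Feb'): 120}
```

Before the patch, step 1 and step 2 were two separate manual instructions; now one command performs both.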

4. Impressive features -

Opening a Google Chrome window from Smart View. Yes, you read that correctly, so feel free to believe it. These two have become best buds now. Just as you used to launch Smart View from the web service, you can now launch Google Chrome from Smart View and enter the web service. This deserves a WOW!.. Hold on, not until you see what's coming next. How about executing all administrator activities from Smart View? By editing worksheets within the application template workbook, you can define application properties, dimension and member properties, load data, and manipulate security, access permissions and substitution variables. Well, well... well... You can even create, edit and delete applications, of course with Service Administrator privileges.

5. Integrations -

Oracle NetSuite, and in the Workforce module, FDMEE integration directly with HCM! It just struck me that there is more in this area than this blog can hold, so there will be lots more in my next ones to follow!

6. Whiners -

a. Application design guidelines that help you determine whether it suits your requirements... honey, you are not there yet!

b. Cannot upload *.bat to EPBCS anymore :(. It has to be standard file extensions: no more *.exe, *.bat or *.sh to the service. Spamming, hacking or a virus - someone has threatened our guy, but not anymore. Additional security! Now I have to transform files to standard extensions before use. I can live with it, and it is surely worth it, but I will whine!

c. The Planner role is not able to access Financial Reporting folders and reports. A security upgrade for FR has led to this.

7. Miscellaneous fixes, and thank you for those! -

a. Exporting Financial Reports to Excel works!

b. Application settings can be saved from the Navigation Menu!

c. Grid POV now does show up in Financial Reports!

d. Move a form around in a dashboard and the rows appear correctly!

e. Create a new BSO cube and all dimensions get enabled without errors!

f. SmartPush = Data Maps.Clear&Push!

g. Once you click Save in a form, the cursor still remains in the same cell!

h. Dynamic reports can be exported to Excel... finally!!


Now, getting back to where we started - celebrating the new member of the EPBCS family, Strategic Modelling - I will save that for a while as you keep waiting for my next write-up! :)

I won't take long... I promise! Happy reading and happy waiting, as EPBCS fever is everywhere and giving tough competition to Nolanism! :)

 

July 26, 2017

Strategic Modeling in EPM Cloud

Oracle's Cloud offerings in the EPM space are expanding, and the EPBCS (Enterprise Planning and Budgeting Cloud Service) offering has been updated with a module for Strategic Modeling. Strategic Modeling is based on the Hyperion Strategic Finance offering and has similar features in the cloud. A few key features of Strategic Modeling are:

  • Financial modeling and forecasting for the Balance Sheet and Cash Flow

  • Manage sophisticated debt and capital structures; it has a funding routine and a Debt Scheduler utility

  • Create targets

  • Perform financial impact analysis and create what-if scenarios at run time

  • Present focused financial information

  • Make informed decisions

        

Business organizations have different needs for operational planning versus strategic planning & modeling. The EPBCS Strategic Modeling module provides seamless integration between operational planning and strategic modeling.

                               

There are pre-defined templates which can be used for modeling, or new models can be created. The templates themselves cannot be modified, but a copy of a template can be saved and adapted to business needs.

For each template, a set of account groups and scenarios is defined. Account groups vary depending upon the nature of the business.

http://theepmlab.com/wp-content/uploads/2017/06/35.png


http://theepmlab.com/wp-content/uploads/2017/06/36.png


http://theepmlab.com/wp-content/uploads/2017/06/37.png

http://theepmlab.com/wp-content/uploads/2017/06/38.png

Working with Models

You can enter data for historical values, assumptions for forecasts, and value estimations.

Forecasting

Another feature of the Strategic Modeling module is detailed forecasting. Values can be forecast for project accounts across periods by inputting values, adjusting by a value or percentage, taking the output value of a specific account for a specific period, or using freeform formulas; alternatively, the available predefined forecast methods can be leveraged:

  • Predefined Forecast Methods
    • As Actual Value
    • Growth Rate
    • Growth Rate (Year over Year)
    • % of Another Model
    • Days
    • Turns - how frequently an account's balance turns over; used in inventory forecasting
    • Absolute Multiple of another account
    • Default Multiple of another account
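
As an illustration, the predefined "Growth Rate" method amounts to compounding the last actual value by a fixed percentage each forecast period. A small sketch with hypothetical figures (not Strategic Modeling's own engine):

```python
def forecast_growth_rate(last_actual, rate, periods):
    """Predefined 'Growth Rate' style method: grow the last actual value
    by a fixed percentage each forecast period."""
    values, current = [], last_actual
    for _ in range(periods):
        current = current * (1 + rate)   # compound one period forward
        values.append(round(current, 2))
    return values

# e.g. revenue of 1,000 growing 10% per period for 3 periods
print(forecast_growth_rate(1000, 0.10, 3))  # [1100.0, 1210.0, 1331.0]
```

The other methods differ only in where the driver comes from (another account, another model, a per-year rate, and so on), but the mechanics are similar.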

Goal Seek

Fix a target value, then calculate backwards the associated drivers needed to reach that target value.
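
Under the hood, a goal seek is a root-finding exercise: vary the driver until the modelled output hits the target. A minimal sketch using bisection (the profit model below is hypothetical, not Strategic Modeling's own solver):

```python
def goal_seek(model, target, lo, hi, tol=1e-6):
    """Find a driver value x such that model(x) reaches the target,
    via bisection (model must be monotonically increasing on [lo, hi])."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if model(mid) < target:
            lo = mid      # output too low: raise the driver
        else:
            hi = mid      # output high enough: lower the driver
    return (lo + hi) / 2

# Hypothetical model: profit = units * 40 - 5000. What volume yields 15,000 profit?
units = goal_seek(lambda u: u * 40 - 5000, 15000, 0, 10000)
print(round(units))  # 500
```

A real goal seek solves the same equation against the account relationships defined in the model.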

 

What If Analysis

Performing what-if analysis helps in understanding how dynamic changes impact the bottom line. Using what-if analysis, organizations can simulate all plausible business scenarios and strategize to minimize risk and reduce uncertainty.

 

Audit Trail

To determine how values are calculated, there is an Audit Trail feature, which works like a drill-down to trace back the steps of a calculation.

  

Consolidate & Report

Consolidations can be created to aggregate values from child entities to a parent entity; this helps in making informed decisions at the top level. There are pre-built reports available for analysis, or customized reports can be created.
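
Conceptually, a consolidation is a rollup over the entity hierarchy. A simplified sketch (the entity tree is hypothetical, and real consolidations also apply eliminations and currency translations):

```python
def consolidate(entity, children, values):
    """Aggregate values from child entities up to the parent entity.
    `children` maps an entity to its child entities; leaves carry input values."""
    kids = children.get(entity, [])
    if not kids:
        return values.get(entity, 0)
    total = sum(consolidate(c, children, values) for c in kids)
    values[entity] = total      # store the consolidated figure at the parent
    return total

# Hypothetical entity tree: Total <- (EMEA, APAC), EMEA <- (UK, DE)
children = {"Total": ["EMEA", "APAC"], "EMEA": ["UK", "DE"]}
values = {"UK": 120, "DE": 80, "APAC": 300}
print(consolidate("Total", children, values))  # 500
```

The top-level figure is what the pre-built reports present for decision-making.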



How's your day, Workday!

Since "Oracle made an offer to PeopleSoft that it couldn't refuse", we were all busy thinking that the HRMS market would be captured by Oracle (which it was, at the very start). But as time passed, we came to understand why David Duffield (founder and former CEO of PeopleSoft) would have left it to Oracle.

Times are changing. In this era, organizations want to minimize their spending on the administration cost of an application, but at the same time want minimum downtime for scheduled maintenance (infrastructure is not an asset anymore; with higher maintenance costs it turns into a liability), and right there Workday was introduced for such customers. With each year its market capture has gone up (especially when it comes to HRMS).

We can conclude that Workday had more futuristic thinking than Oracle from the fact that the cloud version of OBIEE was introduced only three years ago, whereas Workday (a cloud-based tool) was introduced back in 2005.

Leaving LMS and CRM apart, Workday is giving tough competition to PeopleSoft.

And customers who have switched from PeopleSoft HRMS to Workday are not comfortable switching from OBIEE to Workday reporting for their reporting needs. Right there, IT services organizations get a key skillset and offering to work on: integration of OBIEE with Workday, replacing the previously implemented integration of OBIEE with PeopleSoft.

Challenges

Unlike PeopleSoft, Workday is a file-based system. While everything in PeopleSoft works on the concept of the effective date of a transaction, in Workday a new driver comes in: the completion date of an event. As there is no out-of-the-box file-system adaptor in the OBIEE-BIApps HR solution that can connect directly to Workday, the only option we are left with is to use the Universal Adaptors (file adaptors) provided by Oracle from DAC 10+ editions onwards.

However, to match the expected file format of the file adaptors, strong expertise in Workday Integration and Java, along with basic XML knowledge, is needed.

Using Oracle File Adaptors for Workday

That might sound unbelievable, but it works once you overcome the above challenge, and it works far more smoothly than the PeopleSoft data source. However, incremental load is not an effective option, as Workday doesn't store history outside the application (in files) beyond a point (usually no more than 3 months). So the only option the BI team is left with is to replicate the delta in a set of tables (we can call them history tables) and compare against the Workday data set on every following day. Data archiving is not just a best practice but a necessity while using Workday as a data source. Data is delivered via SFTP/FTP to the ETL tool's source-file location in various delimited files, and we can limit the data based on our BI reporting requirements.
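
The history-table comparison amounts to diffing today's full extract against the archived state from previous runs. A simplified sketch, with rows keyed by a hypothetical employee ID:

```python
def detect_delta(todays_snapshot, history):
    """Compare today's full Workday extract against the history tables to
    derive new/changed records, since the files carry no reliable delta."""
    delta = {}
    for key, row in todays_snapshot.items():
        if history.get(key) != row:      # new or changed record
            delta[key] = row
    history.update(todays_snapshot)      # archive today's state for tomorrow's run
    return delta

history = {"E001": {"dept": "HR"}, "E002": {"dept": "IT"}}
today   = {"E001": {"dept": "HR"}, "E002": {"dept": "Finance"}, "E003": {"dept": "IT"}}
print(detect_delta(today, history))
# {'E002': {'dept': 'Finance'}, 'E003': {'dept': 'IT'}}
```

Only the delta then flows into the warehouse load, while the history tables keep the full archived state the daily batch needs for the next comparison.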

PeopleSoft data migration to Workday

For HRMS, Workday takes care of the migration, but for the BI system all PeopleSoft tables must be migrated to the data warehouse, where they are merged with the Workday staging tables once we load the data from the Workday files (every time we execute our daily batch).

The big fight!

Leaving the competitors aside, customers expect PeopleSoft to be as simple as Workday. On the other side, customers who had OBIEE as their key reporting tool expect Workday reporting to be as interactive as OBIEE-PeopleSoft reporting used to be.

Who can gain out of this?

In this fight of simple versus complex, someone has a huge gap to fill. Who?? Well, "us": we, the IT service industry, can fill that gap while the fight goes on.

July 25, 2017

Alternatives to Structured Analytics - Big Data and Hadoop:

What it is:

In traditional times, tools like SQL databases and files were used to handle data and its issues. With the evolution of social media, unstructured content has led to an exponential increase in the volume of data. With this huge volume, traditional tools faced a lot of issues storing, retrieving and manipulating data, and hence Big Data evolved. 'Big Data' comprises various techniques and technologies for storing, retrieving, distributing, managing and analyzing extremely large datasets with high velocity and varied structures. It can manage data in any form, viz. structured, unstructured or semi-structured, which traditional data management methods are not capable of handling. Data in today's world is generated from versatile sources and must be integrated with different systems at various rates. To process large volumes of data inexpensively and efficiently, parallelism is used as an integral component of Big Data. Precisely speaking, Big Data is a data set whose scale, diversity, complexity and integrity are managed by a new and robust architecture, whose techniques, algorithms and analytics are used to manage the data and extract hidden knowledge from it.

For structuring Big Data, Hadoop is used as a core platform, and it also solves the problem of making Big Data useful for analytical purposes. Hadoop is open-source software that enables distributed computing, making it RELIABLE and SCALABLE for huge data sets across clusters of different server nodes. It is designed to scale from a single-node server to thousands of machines, or multiple nodes, with a very high degree of fault tolerance. Hadoop is designed to support applications written in high-level languages like Java, C++ and Python.

How it Works:

In Hadoop, data is distributed to different nodes, and initial processing of the data happens locally on each node, requiring minimal communication between the nodes. The whole structure can be replicated multiple times, which is why it provides a better solution to Big Data problems. Hadoop is a software programming framework that supports the processing of large data sets in a highly distributed computational environment. A Hadoop programming framework in a data warehouse will consolidate data from different sources in one place as a Big Data platform, which can then be accessed by OLAP servers. The data warehouse will serve the analytics needs of the users, on which all consolidated reporting will happen.

What runs behind the scene:

Hadoop is designed and developed on the MapReduce programming model, which is based on the principle of parallel programming across different nodes, combined into what are referred to as clusters. An individual machine is referred to as a node, and hundreds of machines together are referred to as a Hadoop cluster. In a Hadoop cluster, data is distributed across all nodes in the form of small parts known as blocks. The size of each block defaults to 64 MB in an Apache Hadoop cluster. A single file is stored as small blocks on one server, and its copies are saved on multiple servers in the cluster. This allows the map and reduce functions to process small subsets of data which are, in turn, part of larger datasets. Map and reduce are the key components of Hadoop, providing the huge scalability across multiple servers in a Hadoop cluster that Big Data processing needs.
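
The classic illustration of this model is a word count: each mapper processes its local block and emits (word, 1) pairs, and the reducer sums those pairs per key. A minimal single-process sketch of that flow (real MapReduce runs the map tasks in parallel on the nodes holding the blocks):

```python
from collections import defaultdict

def map_phase(block):
    """Map task: runs locally on one node's block, emitting (word, 1) pairs."""
    return [(word, 1) for word in block.split()]

def reduce_phase(pairs):
    """Reduce task: sums the counts for each key across all mapper outputs."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Two "blocks" of one file, processed independently and then combined
blocks = ["big data big", "data big"]
mapped = [pair for block in blocks for pair in map_phase(block)]
print(reduce_phase(mapped))  # {'big': 3, 'data': 2}
```

The key point is that each map task needs only its own block, which is exactly why shipping computation to the data scales so well.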

Hadoop also includes a fault-tolerant storage mechanism named the Hadoop Distributed File System, or HDFS. HDFS is designed so that it can store a high volume of data and information, scale up incrementally, and survive infrastructure failures without any impact on the data. Put simply, Hadoop clusters can be built with inexpensive computer systems so that fault tolerance is achieved: if one node fails, the overall system keeps operating in cluster mode without losing data or interrupting work, by shifting work to the remaining working nodes in the cluster.
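
Block placement and replication can be sketched as follows: split the file into 64 MB blocks and place each block's replicas on distinct nodes, so the loss of any one node leaves copies elsewhere. This is a toy round-robin placement, not HDFS's actual rack-aware policy:

```python
def place_blocks(file_size_mb, nodes, block_mb=64, replicas=3):
    """Split a file into fixed-size blocks and place each block's replicas
    on distinct nodes, round-robin, HDFS-style."""
    n_blocks = -(-file_size_mb // block_mb)   # ceiling division
    placement = {}
    for b in range(n_blocks):
        placement[b] = [nodes[(b + r) % len(nodes)] for r in range(replicas)]
    return placement

placement = place_blocks(200, ["n1", "n2", "n3", "n4"])  # 200 MB -> 4 blocks
# With 3 replicas, every block survives the loss of any single node:
assert all(len(set(p) - {"n1"}) >= 2 for p in placement.values())
```

Losing node n1 still leaves at least two live replicas of every block, which is the property that lets the cluster keep working through a node failure.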

My experience with Form Development in HFM


 

I have been working on Oracle Hyperion products for many years. Since my role has always required me to focus more on solution design and project/program management, development work was restricted mostly to metadata builds and business rule development, with hardly any opportunity to work extensively on forms. Fortunately, I got the opportunity this time, and it was a good learning experience. So I thought, why not share it with all?

I will not go into detail about how forms are constructed, but for those new to Hyperion Financial Management (HFM), forms are the UI for entering and viewing data at a specific intersection. Forms are extensively used for entering additional data which is outside the trial balance and/or cannot be uploaded directly from the transaction system.

You may have noticed that I highlighted the words specific intersection. This is the most important part of form construction, as every cell in a form is defined by a combination of all the dimensions defined in the application. You can define the dimensions applicable to each cell in the form at:

  1. Page level: applicable throughout the form, e.g. Value: <Entity Curr Total>, View: YTD.

  2. Column level: specific to each column, e.g. Year: 1st column 2017, 2nd column 2016.

  3. Row level: specific to each row, e.g. Account.

  4. Specific cell level: done by overriding the dimensions applicable as defined by the page, column and/or row, e.g. override the cell at 1st row & 1st column with View as HYTD.

Though there may be some cells which are based on 'in-form' calculations or are blank, every other cell represents a specific data intersection in the application. Thus, utmost care should be taken while constructing forms, else data will go to the wrong intersection in the application and impact the final reported figures.
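
The intersection logic above can be sketched as layered member maps: page members apply everywhere, column and row members narrow them, and a cell-level override wins last. The dimension names and members below are illustrative, and the resolution order mirrors the precedence described in this post (row over column, cell override over both):

```python
def cell_pov(page, column, row, cell_override=None):
    """Resolve the data intersection (POV) of one form cell: start from the
    page members, layer on column then row members, then apply any
    cell-level override last."""
    pov = dict(page)       # applies throughout the form
    pov.update(column)     # specific to this column
    pov.update(row)        # specific to this row (wins over column)
    if cell_override:
        pov.update(cell_override)   # cell override wins over everything
    return pov

page   = {"Value": "<Entity Curr Total>", "View": "YTD"}
column = {"Year": "2017"}
row    = {"Account": "Sales"}
print(cell_pov(page, column, row, cell_override={"View": "HYTD"}))
# {'Value': '<Entity Curr Total>', 'View': 'HYTD', 'Year': '2017', 'Account': 'Sales'}
```

Every cell in the grid resolves to one such complete member combination, which is exactly why a wrong override sends data to a wrong intersection.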

Now that we know we can override the definition at cell level, let us dwell on it further. An override is nothing but replacing the dimension or formatting defined at page, column or row level. So an override can be defined in a column definition or a row definition.

Points to remember about override:

  1. Row override definitions take precedence over column-level definitions.

  2. Functions like @Cur can be used in an override for the Year and Period dimensions.

  3. In-cell data calculations can be done to display data differently.

    E.g. if certain account values are to be displayed with a different sign, you can use a cell override as shown below:

    Cell multiplied by -1: R5=A#Misc_Expenses, Override(2,2,SCalc((A#Misc_Expenses)*-1)).

  4. Overrides can be used to change formatting like background colour, font, font weight (bold/normal), lines, etc., e.g. override the background colour to grey in a total column.

    (Be careful while overriding background colour, as it will mask the invalid and data-entry intersections, making it difficult to identify a wrong POV or data-entry point.)

  5. Override also enables a very interesting functionality: inserting text in a particular cell and then retrieving it in a different form.

    E.g. if we want to enter a date in the 1st column for the data in subsequent columns, using Override(1,1,CellText:[Default]) in a form will allow entering text instead of values at the POV applicable to the cell. The same text can be made visible in another form using Override(1,1,CellText:[Default],Readonly).

To view the data in another form, ensure that the POV of the display cell is the same as the POV where the text was entered. Also ensure this cell is marked 'ReadOnly' in the display form to avoid any accidental change to the text from the display form.

Some interesting findings:

  • Override does not allow @Cur(-1) and so on for the Year or Period dimension. So make the dimension a row or column member so that you don't have to override it.

  • Scale does not work on accounts of type "Balance". Convert the account to an Asset/Liability account.

  • A custom header description cannot contain ",", else it will be treated as a code/specification and anything after "," will not appear in the header.

  • In case multiple dimensions are defined at column/row level, the order of the 1st column/row applies to the following columns/rows.

  • If you put ";" after style: in a row definition, then the background colour defined in the column definition is used; if not, the background colour definition of the row is used.

  • Some RGB values for colour definition:

    • RGB(180,205,180) - green

    • RGB(255,255,255) - White

    • RGB(251,251,251) - Grey

Food for thought

When some text is entered at a child level and we want to display it in a consolidated report, how can we do that? What will be the challenges, and how do we address them?

In these days when blogs are being written about very complex subjects, I hope I was able to create some excitement in your mind about something as mundane as HFM forms. I look forward to bringing you some more such blogs. Until then, keep consolidating!!! :)

July 21, 2017

Run Oracle Hyperion Cloud Jobs on Autopilot!!! Using EPMAutomate

 

What is EPMAutomate?

EPMAutomate is a utility that helps automate administrators' activities by allowing remote access to, and task execution within, Oracle Enterprise Performance Management Cloud instances. EPM Cloud includes EPBCS (Oracle Enterprise Planning and Budgeting Cloud), PBCS (Oracle Planning and Budgeting Cloud), PCMCS (Oracle Profitability and Cost Management Cloud), FCCS (Oracle Financial Consolidation and Close Cloud), ARC (Oracle Account Reconciliation Cloud), TRCS (Oracle Tax Reporting Cloud), and EPRCS (Oracle Enterprise Performance Reporting Cloud).

Why is it required?

In the on-premise Hyperion world we have, for example, the MaxL and ESSCMD editors for carrying out administrator activities on the Hyperion Essbase module using MaxL/ESSCMD scripts. In the Hyperion cloud world these editors are not available, so Oracle provides an alternative way of administering cloud modules: the EPMAutomate utility. In a similar fashion, many cloud modules can adopt the EPMAutomate utility for carrying out administrator processes.

What can the EPMAutomate utility do?

The EPM Automate utility allows Service Administrators to automate most repetitive tasks, including the activities mentioned below:


How does it work?

  • In Windows

A batch script on the project server needs to be triggered; it contains the EPMAutomate commands that carry out the activities in the cloud instance. After the cloud activities complete, the EPMAutomate utility disconnects from the cloud and control passes back to the batch script.

  • In Linux

A shell script on the project server needs to be triggered; it contains the EPMAutomate commands that carry out the activities in the cloud instance. After the cloud activities complete, the EPMAutomate utility disconnects from the cloud and control passes back to the shell script.
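
On either platform, the flow is a fixed sequence of EPM Automate commands wrapped in a script. Here is a sketch in Python that assembles a typical daily-backup sequence (the command names follow Oracle's documented utility, but the credentials and URL are placeholders, and a real wrapper would execute each line, e.g. via subprocess.run, stopping on the first non-zero exit code):

```python
def backup_sequence(user, password, url):
    """Assemble a typical EPM Automate daily-backup command sequence.
    Arguments are illustrative placeholders, not real credentials."""
    return [
        f"epmautomate login {user} {password} {url}",
        'epmautomate exportsnapshot "Artifact Snapshot"',
        'epmautomate downloadfile "Artifact Snapshot"',
        "epmautomate logout",
    ]

# A wrapper script would run each command in order and check exit codes
# before control returns to the calling batch/shell script.
for cmd in backup_sequence("admin", "***", "https://example.oraclecloud.com"):
    print(cmd)
```

Swapping the middle commands (data loads, business rules, file uploads) while keeping the login/logout bracket is what makes these scripts so reusable across modules.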

    

EPMAutomate scripts as Reusable Artifacts

Scripts created for automation can be reused across different Hyperion cloud products. For example, backup automation for a PBCS application, on Windows as well as Linux, can be utilized for other modules. Data load scripts, along with Data Management automation, can also be reused across modules on different projects.

Operating System Supports and Limitations:

This utility is compatible with Windows as well as Linux/UNIX operating systems, and the setup can be downloaded from the Simplified Interface.

  • EPMAutomate utility works on 64-bit Operating Systems

  • Corporate SSO credentials cannot be used with the EPMAutomate utility; an end-user profile for EPMAutomate needs to be created in the cloud.

Has HFM Not Really Moved to Cloud??




As the name indicates, FCCS is Financial Consolidation and Close Cloud Service; still, after reading through multiple blogs, my interpretation is that FCCS is not HFM in the cloud, nor a replacement for HFM. However, the list of features included in FCCS does support the view that it is a distant, if not close, relative of HFM, though not a complete package altogether.

Let's first have a detailed look at all the features I found to be similar to HFM. Then I will go on to cover the variances as well.

Feature                                  | HFM | FCCS
Standard Dimensions                      |  x  |  x
Standard Consolidations & Eliminations   |  x  |  x
ICP Matching Reports                     |  x  |  x
Currency Translations & CTA Calculations |  x  |  x
Cash Flow                                |  x  |  x
Multi GAAP Support                       |  x  |  x
Drill Through                            |  x  |  x
Journal Workflow                         |  x  |  x
Data Audit                               |  x  |  x
Data Load from ERP                       |  x  |  x




FCCS is primarily built from the customer's point of view. The intent of the application is to reduce manual effort and limit the need for customization: there is no concept of a rule file here, and all the functionalities mentioned above are provided out of the box.

Hence, FCCS would be of greatest benefit to those customers who do not require complex calculations and for whom a plain-vanilla implementation/consolidation is the requirement.

The foremost advantage of FCCS is the database size for each application, which is 150 GB; that means the application is capable of holding around 1K entities and 5K accounts. This DB size is extendable, though at a higher price.

Fewer user roles exist in Shared Services, which indirectly means easier maintenance from a security perspective.

Now let us discuss the differences between the two solutions, or rather, I must say, the shortfalls in FCCS which support the fact that FCCS is surely not a replacement for HFM, nor HFM in the cloud.

Feature                                  | HFM                                     | FCCS
Standard Dimensions                      | 8                                       | 11
Custom Dimensions                        | Up to 20                                | 2
System-Defined Dimension Members         | Only in Value                           | Several
Parent Entity Data Input                 | Available through every data input mode | Only Journals
Ownership Management                     | Yes                                     | No
Currency Translations & CTA Calculations | Multiply / Divide available             | Only Multiply

 


However, considering Oracle's future roadmap for FCCS, the day is not far off when HFM & FCCS might become closer buddies as well.

 

July 18, 2017

Big Data & ETL's Evolution



The need for Extract Transform Load (ETL) tools is ever-present as long as data is being consumed. ETL tools have been used for batch processing and for transforming data into the format required by the data warehouse. Transformations have grown more complex due to the enormous growth in the amount of unstructured data.

At a high level, the Big Data Hadoop ecosystem consists of:

  • Structured data: a high level of organization; data is typically stored in organized table structures.
  • Unstructured data: data is not stored in any organized form, e.g. data from social media, smartphones, sensors, images, emails, etc.
  • Hadoop (with the Hadoop Distributed File System): a framework for processing and storing extremely vast data; it breaks the data into chunks and stores them on the participating node servers.
  • MapReduce: a software framework for processing vast data on multiple clusters (nodes) in parallel, in master (Map task) and slave (Reduce task) mode.
  • Spark: a data analytics tool that operates on distributed data sources like Hadoop.
  • Pig & Hive: both ease the complexity of writing complex MapReduce programs (similar to scripting/SQL, but not exactly).
  • Sqoop: migrates data in and out of Hadoop and relational databases.

(Note: some of the above components are optional)


Pic.png

Fig 1. Hadoop Eco System 

Given the growth and significance of unstructured data, there has been an increasing need for the major ETL players to provide solution options for transforming unstructured data for use in analytics. Most ETL tools in the market are successfully marching down that path. Here are some ETL tool offerings with respect to Big Data:

Oracle - ODI:                                                                        

Oracle's Big Data approach is to enable a client's current data architecture to incorporate Big Data, helping bring more value to the business and prospective analytical reporting, and to support other big data needs. ODI is a key tool for Oracle in this pursuit. The advanced new Big Data wizard in ODI 12.2.1.1.0 supports many new Hadoop technologies.


Fig 2. Oracle Data Integrator


ODI's ELT approach does not require a middle-tier engine to support big data components, whereas typical ETL tools require intermediate servers that convert mappings into programming languages such as C++ for execution. ODI instead leverages the efficiency of the underlying database for processing, and its ability to generate native code results in tremendous processing efficiency.
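The contrast between the two styles can be sketched as follows (a simplified, hypothetical illustration with made-up table and column names; ODI's generated native code is of course far richer):

```python
def elt_pushdown(source, target, expr):
    """ELT/pushdown style: generate a native statement that the target
    database (or Hadoop engine) executes itself -- no rows ever flow
    through a middle-tier engine."""
    return f"INSERT INTO {target} SELECT {expr} FROM {source}"

def etl_engine(rows, transform):
    """Classic ETL style: rows are pulled into an intermediate engine,
    transformed there, and pushed back out."""
    return [transform(r) for r in rows]

# Pushdown: the work is expressed as code shipped to the data.
sql = elt_pushdown("staging.orders", "dw.orders", "order_id, amount * 1.1")
print(sql)

# Engine-based: the data is shipped to the code instead.
print(etl_engine([{"amount": 100}], lambda r: {"amount": r["amount"] * 1.1}))
```

Pushing the transformation down avoids moving every row across the network twice, which is where the efficiency gain comes from.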

Cognos:

IBM has introduced a new suite, BigInsights, for big data and analytical reporting. Big SQL allows Cognos to configure Hadoop as a data source and can access Hive, HBase, and Spark concurrently through a single database connection.

Business analysts and executives can experience visually enhanced big data reports from the Cognos presentation service, which is a valuable aid to understanding big data. With BigInsights and Big SQL, IBM provides the tools for enabling Hadoop operations, including the ability to exchange components with the existing infrastructure and functionality of Cognos.

DataStage:

IBM's DataStage platform has engineered easy integration of heterogeneous data, covering both big data at rest (data that is stored and then analyzed, e.g. conventional data warehousing) and big data in motion (dynamic data in real-time or operational-intelligence architectures, e.g. trading, fraud detection).
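The at-rest versus in-motion distinction can be sketched in a few lines of Python (a toy, hypothetical illustration; the fraud-style threshold and field names are made up):

```python
def batch_process(stored_rows):
    """Data at rest: everything is already stored; analyze it in one pass."""
    return sum(r["amount"] for r in stored_rows)

def stream_process(events, threshold):
    """Data in motion: inspect each event as it arrives and react
    immediately (e.g. flag a suspiciously large trade)."""
    for event in events:
        if event["amount"] > threshold:
            yield event["id"]

rows = [{"id": 1, "amount": 50}, {"id": 2, "amount": 5000}, {"id": 3, "amount": 75}]
print(batch_process(rows))                     # 5125
print(list(stream_process(iter(rows), 1000)))  # [2]
```

The batch path tolerates latency but sees the whole data set; the streaming path must decide per event, before the data ever lands in a warehouse.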

In its newer versions, DataStage includes components such as new Big Data File stages for reading and writing files in HDFS, Hive stages, and stages that automatically generate MapReduce programs.

Talend Studio for Data Integration:

The Talend Data Fabric solution delivers high-scale, fast, in-memory data processing. It leverages Hadoop's parallel environment to generate native Spark and MapReduce code.

Since Talend Open Studio is an open-source solution, it can be downloaded at no cost, but support is provided only for the subscription products. The subscription products offer more functionality, such as a shared repository, versioning, and dashboards.

PowerCenter Informatica:

Informatica launched the Informatica Big Data Edition (BDE), which can be used for ETL in a Hadoop environment alongside an RDBMS. BDE is available in versions 9.6 and later.

BDE runs in two modes: Native mode for normal PowerCenter ETL, and Hive mode to additionally support big data. Mappings moved to Hive are executed in the Hadoop cluster using Hadoop's parallelism (via its MapReduce capability).

SQL Server Integration Services (SSIS):

Microsoft's new Visual Studio 2015 tooling contains new SQL Server Integration Services (SSIS) tasks. These provide ETL options on Apache Hadoop: Sqoop for data import/export, Hive for SQL queries, the MapReduce distributed programming infrastructure, and ODBC drivers to connect to data in HDFS from tools such as Excel and SQL Server.

JaspersoftETL:

Jaspersoft amended its OEM agreement with Talend to use native connectors to Apache Hadoop big data environments in Jaspersoft ETL. The integration of Talend into the Jaspersoft BI Suite supports all big data use cases.

Talend supports the major big data platforms, including Amazon EMR, Apache Hadoop (HBase, HDFS, and Hive), Cassandra, Cloudera, etc. For robust performance and reliability, the Big Data Edition offers high-availability and load-balancing features for critical reporting and analysis requirements.

List of ETL Big Data solutions, vendor-wise:

Tool        | Big Data                                     | Big Data in Cloud
ODI         | ODI for Big Data                             | Oracle Data Integrator Cloud Service
Cognos      | BigInsights Suite                            | IBM BigInsights on Cloud
DataStage   | Native Big Data file stages                  | IBM Bluemix - IBM InfoSphere DataStage on Cloud
Informatica | Informatica Big Data Edition (BDE)           | Informatica Big Data Edition (BDE)
SSIS        | SQL Server Data Tools for Visual Studio 2015 | Azure Data Factory
Talend      | Talend Big Data Integration platform         | Talend Integration Cloud
Jaspersoft  | Talend native connectors                     | Amazon Redshift

-  Xavier Philip

July 17, 2017

Empower with Hyperion on Mobile

Mobiles and tablets have transformed today's world, enabling access to information and fast decision making with a mere touch. Oracle Enterprise Performance Management entered the mobile space with release 11.1.2.3.500, providing access to Hyperion applications through handheld devices (mobiles and tablets). Access to Hyperion on mobile devices falls broadly into two categories -

  1. A dedicated app on the Android and iOS platforms, enabling on-the-go reviews, approvals, and workflow.
  2. Browser access through a URL for Hyperion Planning artifacts (after configuring tablet access in Workspace) and for Hyperion Financial Reporting reports.

Let us first talk about the EPM mobile app -

EPM Mobile App

Installing the app on your mobile or tablet is pretty straightforward: visit the App Store or Play Store and download the app. (Speaking of a fast-changing world, the app is not yet compatible with the latest iOS 10 and Android 7, but that is a story for another day.)

The primary use of this app is to enable on-the-fly reviews and approvals for -

·   Oracle Hyperion Planning
·   Oracle Planning and Budgeting Cloud Service
·   Close Manager
·   Oracle Data Relationship Governance
·   Oracle Hyperion Financial Management
·   Oracle Hyperion Tax Provision

After installation, open the app; options are available for -

  • Configure Connection - enter the connection URL and credentials.
  • Product Tour - a demo to familiarize yourself with the interface. Clicking Product Tour shows all the available applications for the different Hyperion products: Hyperion Planning, HFM, Tax Provision, Data Relationship Governance, and Financial Close Management.

Hyperion Planning

Select the Scenario and Version for which the workflow has to be reviewed. There is also an option to filter by status (Not Signed Off, Under Review, etc.).

Select the Scenario, Version, and Entity combination to view the history and promotional path details of the planning unit.

Click the Actions button and choose the appropriate action to move the planning unit along the promotional path.

HFM

Select the HFM application from the list of available applications. Promotion units are displayed for the Scenario, Year, Period combination, grouped by Phase and Review Level. Filters are available to select the appropriate promotion units by Scenario, Year, Period, Phase, and Review Level. Pass/Fail status and history can also be viewed.

Financial Close Management

Close Manager displays all the available close tasks; after opening a due task, take action to approve or reject it.

Data Relationship Governance

Data Relationship Governance was not part of the first release of the EPM Mobile application; it was added in the latest release. All available requests in the DRG application can be seen; after selecting an item, click the Action button to take a decision on the request.

 

Browser Access

Before accessing the application on a tablet, tablet access needs to be enabled in Workspace after logging in through a desktop/laptop: go to Administration > Manage > Tablet Access.

Select the artifacts (Forms, Task Lists, and Rules) that should be enabled for tablet access. (Keep in mind that the display of forms might vary with the resolution and screen size of the tablet.)

From the tablet's browser, access the Planning URL http(s)://server:19000/HyperionPlanning and enter your credentials. You can then see the forms, task lists, and rules.

Forms

 

Task Lists


Business Rules


 

When accessing reports from the available icon, only report snapshots can be viewed. To access Financial Reporting reports, visit the URL http://server:19000/hr/mobile/HRMobileLogon.jsp and enter your credentials; after a successful login, the folder structure for reports can be browsed.

A sample report viewed on Tablet



Note:

  1. Refer to the Mobile Certification tab in the Oracle EPM compatibility matrix for compatibility details.
  2. Depending on the deployment, a VPN might be required to connect to the application.



Simplified Interface - Life made simple for Hyperion Planning

·   Can we add Company Logo?

·   Can we change color theme?

·   Our management wants to access EPM application on Tablets, how can we do it?

·   How can we make scheduling jobs easier?


As a Hyperion Planning consultant, these used to be common queries from clients, and for all of the above there was a workaround (not always a simple one). But with the 11.1.2.3.500 release of EPM, an alternative to the vanilla user interface was made available: the Simplified Interface for Hyperion Planning. Its focus was to enable application access on tablets.

It provided an appealing user interface for administrators to create and manage an application, and for planners to budget, analyze, and review. The icon-based UI makes working with Hyperion Planning much easier. It was also a glimpse of the changing user interface of the Oracle EPM Hyperion suite as it moves from on-premise to the cloud.

 

The simplified interface provides -

·   A better look and feel

·   Options to customize the interface (logo, theme, watermark) and formatting on data forms

·   Enhanced performance when navigating forms and entering data, as the simplified interface now uses client-side scripting technologies

·   Flexibility to add a company logo on the home page and an option to add a watermark background

·   Better job scheduling options

·   Dashboards that can include forms, charts, external links, and commentary

·   Tablet friendliness

  

Enable Simplified Interface

The simplified interface has to be initialized from Planning administration.

For a new application, there are three creation options:

1. Sample - the Vision application is created, complete with artifacts and data.
2. Simple - a custom application with one plan type; it allows MDX member formulas only, with no provision for business rules, map reporting, copy data, copy version, exchange rates, or currency conversion. This is useful for creating a simple application, which can always be converted to an advanced application when scaling up.
3. Advanced - member formulas and business rules can be created; such an application cannot be converted back to a simple one.

After the application is created and you log in to Hyperion Planning, a new layout is visible