Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.

« September 2017 | Main | November 2017 »

October 31, 2017

Oracle EBS to ARCS in one click - On Premise to CLOUD Automation made possible


If you are loading data manually from your source system into Oracle's reconciliation system, the process can be cumbersome: receiving the data files over email or on a shared folder, converting them to an ARCS-readable format, loading them into the Data Management folder and running the data load.

We will talk about a "One Click "automated solution for this which we have implemented at several of our ARCS/ARM customers and also leveraged same for integrations of other Cloud modules


What to do?

Let's consider a common scenario where EBS is the source and ARCS is the target.

To automate the data load from the source system to ARCS, the procedure below can be followed:

  • Write a program in EBS (it can be written in PL/SQL), one for each data file, that runs queries to generate the source data files. The program can generate .csv or .txt files based on the import format configuration in Data Management, and it should write the source data files to a shared drive. Data Management comes along with ARCS and is used to import data into ARCS through data files. A minimal extraction sketch appears after this list.
  • For automating the ARCS manual tasks, EPM Automate comes to the rescue. EPM Automate is an Oracle-provided utility to perform administrative tasks using REST APIs within Oracle EPM cloud instances. So, prepare an EPM Automate batch file (sketched below) that has:
    • Commands to upload the source data files from the shared drive to the Data Management inbox folder. In this case, the "epmautomate uploadfile" command can be used (make sure that the user id used to log in to EPM Automate has access to the shared folder).
    • One command to run the data load in ARCS. In this case, the "epmautomate importbalances" command can be used (make sure that the data load definition has already been created in ARCS and that the definition has all desired locations selected).
  • When the batch file runs, the data load, based on the definition, triggers the underlying data load rules to run sequentially in Data Management.
  • The programs in EBS and the EPM Automate batch file can both be scheduled to run daily/bi-weekly/weekly as per the requirement.
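
As a rough illustration of the extract step, here is a minimal PL/SQL sketch. The directory object DATA_EXTRACT_DIR and the source view gl_balances_v are hypothetical placeholders for your own objects:

DECLARE
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  -- DATA_EXTRACT_DIR is a hypothetical directory object mapped to the shared drive
  l_file := UTL_FILE.FOPEN('DATA_EXTRACT_DIR', 'GL_Balances.csv', 'w');
  UTL_FILE.PUT_LINE(l_file, 'Account,Entity,Currency,Amount');
  -- gl_balances_v is a hypothetical view; replace with your own extract query
  FOR r IN (SELECT account, entity, currency, amount FROM gl_balances_v) LOOP
    UTL_FILE.PUT_LINE(l_file, r.account || ',' || r.entity || ',' || r.currency || ',' || r.amount);
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
END;
/

And a minimal sketch of the EPM Automate batch file, with placeholder values for the URL, credentials, file path and data load definition name; verify the exact importbalances arguments against the EPM Automate documentation for your ARCS release:

@echo off
rem Hedged sketch: upload the extracted files and trigger the ARCS data load.
rem url, user, password file, shared path and load definition name are placeholders.
SET url=https://example-arcs.oraclecloud.com
SET user=serviceadmin
SET password=D:\EPMAutomate\password.epw
SET IdentityDomain=exampledomain

call epmautomate login %user% %password% %url% %IdentityDomain%
call epmautomate uploadfile \\sharedrive\extracts\GL_Balances.csv
call epmautomate importbalances "EBS_Daily_Load"
call epmautomate logout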


Below are our metrics from one such automation we implemented.

  • Extracting source data files from EBS through automated programs - 20 mins, a 50% reduction; the manual effort of extracting the source data files one by one, converting each to a .csv file and sending all files through mail is eliminated.

  • Data load to ARCS - 25 mins, a 50% reduction; the manual effort of downloading the source data files from mail to a local machine, uploading each file to the Data Management inbox folder one by one, selecting the appropriate data file for each data load rule in Data Management and running the data load from ARCS is eliminated.

 

Benefits -

No manual errors and high-quality data loads!!

No dependency on the availability of an expert!!

Better performance!!


Unravel the mysteries around Equity Overrides

In this blog, I would like to share one of my experiences with HFM translation rules. It is still engraved in my memory because it was a major issue I faced in my early days, when I had just started playing around with HFM and was keen on unravelling the mysteries that revolve around translation issues. During one of the implementations, I gained a brief understanding of the historical rate concept, which we usually encounter when translating a subsidiary's equity.

So, before I proceed with the problem statement and the resolution, let me define historical rate for beginners in the field of HFM (especially those who, like me, do not have a background in finance). The historical rate is the exchange rate that was prevailing at the time the transaction was consummated. Such transactions (mainly Investment and Equity) have to be translated at the historical rate rather than at the EOM rate or the AVG rate. This requirement usually comes from clients who report in multiple GAAPs, or, I must say, US GAAP in particular.

Let me describe the issue in a practical example:

Org A invests USD 100,000 in Org B, which reports in AUD, at an exchange rate of roughly 0.77 (USD per AUD), hence making the subsidiary receive AUD 130,000. In A's books, the investment is USD 100,000; while in the books of B, there will be equity of AUD 130,000.

Now, suppose the exchange rate later becomes 0.8. Here, the translated equity for the subsidiary, i.e., Org B, becomes USD 104,000; whereas, for Org A, the investment still remains USD 100,000 in the books. Hence, at the time of elimination the plug account will capture an imbalance of USD 4,000, which actually comes in due to the incorrect exchange rate being used to translate the transaction; the underlying transaction itself is nowhere to blame for the mismatch. Hence, there is an urgent need for a solution to report the correct investment and equity in B's books, or else the reported values would be incorrect.
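
To spell out the arithmetic: AUD 130,000 x 0.8 USD/AUD = USD 104,000 of translated equity, against the USD 100,000 actually invested, so USD 4,000 lands in the plug account purely as a translation difference.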

Now, the first thing that struck me was: why don't we capture the translation through a rule that takes care of only the changes in equity and the differences in A's investments during the month? These would then automatically be taken care of by the standard EOM rate, i.e., the Balance Sheet rate, which is pre-defined in the application. But there was a gap here: suppose A invests in B on the tenth working day of the month. At this point of time, the rates are quite different.

Hence, the solution revolves around using Equity Overrides. But how to achieve this was another big question. This approach benefits the users by giving them complete hold of the authentic translated values that need to be shown in the Balance Sheets of the subsidiary organization.

We manually capture the historical rates for conversion through a separate data form defined in the application. The values are then translated using these historical rates through the rule file, overriding the actual rates. The difference arising is captured in a separate account, which we refer to as a plug account for currency overrides, i.e., Foreign Currency Translation Adjustment (FCTA). A sketch of such a translation rule follows.
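
As a hedged sketch of how such an override can look in the HFM rule file: the accounts A#Investment and A#CommonStock and the rate account A#Hist_Rate (populated from the data form) are assumptions for illustration, and the HS.Trans arguments should be verified against the HFM rules documentation.

Sub Translate()
    ' Translate investment and equity accounts at the historical rate
    ' captured on the data form; all other accounts keep the default
    ' EOM/AVG translation.
    Call HS.Trans("A#Investment", "", "A#Hist_Rate", "")
    Call HS.Trans("A#CommonStock", "", "A#Hist_Rate", "")
End Sub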



October 30, 2017

Facing Authentication issue for ADLDS using ActiveDirectoryAuthenticator

 

Some projects use Active Directory Lightweight Directory Services (AD LDS), formerly known as Active Directory Application Mode (ADAM), to set up an external LDAP for authentication in OBIEE.

In one of our projects, we had a scenario to configure LDAP authentication using Active Directory Lightweight Directory Services (AD LDS) in OBIEE. We tried to configure the LDAP provider type as ActiveDirectoryAuthenticator and faced an issue authenticating to the application. Below I share my experience of configuring LDAP as ActiveDirectoryAuthenticator and the solution to resolve the issue.

ActiveDirectoryAuthenticator as the Authenticator:

When ActiveDirectoryAuthenticator is configured in WebLogic to access Active Directory Lightweight Directory Services (AD LDS) for authentication, an authentication error occurs for valid user ids, even though the AD LDS users and groups are visible in the security realm.

Error:

The ActiveDirectoryAuthenticator uses attributes which are incompatible with Active Directory Lightweight Directory Services (AD LDS). Specifically, the ActiveDirectoryAuthenticator uses the attribute User-Account-Control (UAC), which is used in the full version of Active Directory but is not used with the lightweight version. Therefore, the default ActiveDirectoryAuthenticator cannot be used with AD LDS.

Generic LDAPAuthenticator as Authenticator:

Microsoft ADAM is a lightweight implementation of Active Directory which does not provide all the services of the complete Active Directory product. So we should not use the Active Directory Authentication Provider to configure ADAM.

The solution for the above issue is to use the generic LDAPAuthenticator with AD LDS instead of the ActiveDirectoryAuthenticator. However, after configuring the generic LDAPAuthenticator, the Admin Server fails to start with the error below.

Error: a JPS-related error is logged during Admin Server startup (screenshot not available).

Solution:

JPS does not support the generic LDAPAuthenticator by default, which results in the server startup failure and the JPS error.

Add the idstore.type property in the jps-config.xml file located under <domainhome>/config/fmwconfig so that the generic LDAPAuthenticator can integrate WLS with the AD LDS server, and then restart the Admin Server service.
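
A minimal sketch of the change, assuming the default identity store service instance name idstore.ldap; the right idstore.type value depends on your directory (OPSS accepts values such as ACTIVE_DIRECTORY, OPEN_LDAP and CUSTOM, with CUSTOM commonly used for a generic LDAP server):

<!-- jps-config.xml: add idstore.type to the LDAP identity store instance -->
<serviceInstance provider="idstore.ldap.provider" name="idstore.ldap">
    <!-- the value shown is an assumption; pick the type matching your LDAP server -->
    <property name="idstore.type" value="CUSTOM"/>
</serviceInstance>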

 

Note: take a backup of jps-config.xml before modifying it.

Analytics and the APP!

 

Welcome back!!! In parts 1 and 2 we set out to understand the concept of analytics and the app (or analytics on a mobile platform) and reviewed a few case studies from different leading products - Kronos, Oracle, and SAP. In this concluding part we will look at the significance of these case studies and draw inferences as to how they impact the world of analytics...

 

Inferences:

 

We have seen three case studies across different verticals with varying backgrounds and use case scenarios. However, all have the common feature of using an analytics tool on a mobile platform, showcasing the versatility of this combination of Analytics and the App!

 

When organizations go mobile with analytics, they are able to extend the reach of information and empower people from every aspect of their business with the facts to make better, more informed decisions.

This is evident from the 2015 Mobile Business Intelligence Market Study by Dresner Advisory Services:


  • Mobile business intelligence (BI) is the third most common use case for mobile business workers, next to e-mail and contact information

  • 90% of study participants said mobile BI is important to their business

  • In a surprising find from the Dresner market survey (*), business intelligence ranks as the 3rd highest priority among mobile applications, higher than social media and even personal banking, coming in below only email and basic phone services.

 

*SOURCE - Wisdom of Crowds ® Mobile Computing/ Mobile Business Intelligence Market Study 2015

 

Trends observed during the research on these case studies indicate the growing importance of mobile analytics in different verticals - with IT being the prominent horizontal across most industries. Some of the reasons for this are listed below:


  • Exploding usage of 'smart' mobile devices in general - personnel, org-wide, technological leap

  • Growing use of BYOD among enterprise employees - personnel get more opportunity to tap into the client systems and data as organizations open up accesses to employees.

  • Rapid use of mobile devices for other aspects of daily life - communication, mails, social media, entertainment - to make a convenient platform for including analytics.

  • Flexibility of usage and availability on-the-go. From being a straight-line process to being agile.

  • Advanced functionality of apps and devices - inducing enhanced parts and software.

  • Technology growth to aid predictive analysis and user data customization.

 

Suggestions/Future Prep: 


  • It is seen that the concept of mobile analytics is well known but applied only negligibly. This could be leveraged further to achieve customer delight.

  • The analytics functionality on ERP systems remains a niche area. Consultants could be empowered with training on this module to also include the mobile apps that are usually pre-built for such applications.

  • Another option to be explored would be the provision of sample tablet devices (iPad or Android) to the respective practices so as to enable learning, hands-on experience and PoC processes.

  • From the case studies and also from project experience, it is observed that even though customers may be aware of the implications of mobile analytics on their processes, a PoC is helpful in all cases to create the right focus to open up further avenues of engagement.


Conclusion:

 

The advent of the mobile platform has been another epoch-making event, probably making it into the top 20 inventions/events that changed lifestyles across the world significantly. Added to this, parallel advancements in related areas like data analysis, cloud computing and big data, to name a few, have been instrumental in converging the big with the best, giving rise to a concept such as mobile analytics. Since this concept is still in its nascent stage, it provides great potential for further exploration to discover the myriad use case scenarios and adaptability, which could lead to several success stories of - Analytics and the App!


 

End of part 3...hope you found this interesting - Please do leave your valuable feedback!

Part1 :  http://www.infosysblogs.com/oracle/2017/09/analytics_and_the_app_1.html

Part2 :  http://www.infosysblogs.com/oracle/2017/10/analytics_and_the_app_2.html

 

Analytics and the APP!

 

Welcome back!!! In part 1 we saw an example of analytics being used on a mobile platform - a tablet - to realize retail store objectives and gain the advantage of real-time data updates. In part 2 let us take a look at more case studies across similar leading products...

 

Case Study 2:

 

Scenario - The client is a US-based Fortune 500 energy and utilities company with approximately 10 million customers. Their integrated business model provides a strong foundation for success in this vertical, which is experiencing dramatic changes. They strongly believe in capitalizing on emerging trends and technologies to provide insight into operations that will drive value for their end customers.

 

Background - The organization uses Oracle - one of the top ERP applications - for their myriad business processes. As part of this PoC, the Infosys team set up custom analytics solutions for the client. Oracle's business intelligence tool OBIEE 12c is used here to showcase the length and breadth of the analytics tooling available as part of the wide array of modules in Oracle.

 

Problem Statement - The client needed to do a comparative evaluation between two mobile analytics applications as part of their PoC to be reviewed by their senior management.

 

PoC details - The PoC was aimed at the OBIEE module's ability to work on a mobile platform. It also aimed to provide a comparative demo of features between MicroStrategy (another analytics tool) and Oracle tools (apps). A set of commonly identified features was compared; in most cases the feature was available in both tools, but the way it was enabled differed between OBIEE and MicroStrategy.

 

Pilot & Feedback - For the pilot, the app was shared only among the senior management in the organization. The focus group was impressed to see that OBIEE could provide the features needed and appreciated the way it is achieved in OBIEE, which was different from their current applications. Further using OBIEE on mobile presented a very unique but scalable scenario as it proved to be a seamless extension to the existing suite of oracle products and which meant lesser chance of data integrity issues. Post the successful demo, client is now evaluating an option of a complete migration to OBIEE with preference to the analytics app as it aligns successfully with their established principles.

 

Being an energy and utilities company, it is essential for the organization to possess the latest status and forecasts in a rapidly changing environment with unpredictable trends. The analytics tool on mobile has brought the leadership very close to data and trends that were hitherto not feasible to access. Management can now make informed decisions much faster and just as easily track the results through OBIEE. The time and effort savings are also huge, since stakeholders can pull their own graphs and data analysis first hand and without chance of error. As the gap between technology, user and data/solution is greatly reduced, leadership is now very keen on applying this model to other areas of analytics.

 

Case Study 3:

 

Scenario - The client is a global tool manufacturing corporation with interests in construction, heavy equipment and technological solutions. They excel through outstanding innovation, top quality, direct customer relations and effective marketing. The client also has its own production plants as well as research and development centers in Europe and Asia. They pride themselves on integrating the interests of all their partners - customers, suppliers and employees - into their growth and sustenance strategies.

 

Background - The client uses the SAP package and tools for running their analytics platform, integrating the various aspects of their business from planning to customer feedback & support. Combining areas like technology, design, real-time feedback and automated order processing with metrics like quantity, geographical location and customer database, the analytics tool (SAP's BI system) provides the necessary inputs for stakeholders to catch up on the latest available information/trends.

 

Problem Statement - The client needs an on-the-go platform for their analytics solution that enables salesforce and service personnel to meet client demands as and when they arise, in an automated fashion.

 

Introduction of Mobile Analytics - The organization has about two-thirds of its workforce employed directly for their customers in sales organizations and in engineering, averaging about 200,000 customer contacts every day. This entails a constant need to be up to speed with the latest and greatest end customer data (or detail). A ready reckoner for this situation is SAP mobile analytics (otherwise known as Roambi), which most employees in the organization use on a daily basis. Further, the entire solution is a cloud-based model, so they have the best of both worlds - cloud computing and mobile application. This has proved very advantageous to their on-the-job salesmen, technicians, customer support and even the top executives discussing an org-level strategy.


A real-life scenario involves the following situation:


  • A critical time bound customer order is not received at site on time.

  • However, the automated tracking tool, looking for the delivery report, has sensed it and raised an alert to the support center of the tools manufacturer.

  • This triggers the next set of established workflows in order to compensate for the delay in delivery.

  • Alerts sent to the nearest customer support personnel through a geo fencing feature enables the employee to locate the nearest outlet/warehouse/distribution center for the right part.

  • The support person raises a request under the right priority and is able to head over to the site to personally supervise the final delivery.

All this has actually taken place on-the-go using the mobile device loaded with the BI tools and supporting applications to augment the corrective actions.


In this particular scenario, even customer delight can be captured on the same mobile device as feedback and, back at the corporate HQ, top management can view a real-time heat map/graph showing customer satisfaction survey results that have been processed seamlessly through the cloud.

 


End of part 2... in part 3 we will review the inferences and conclusion.

Part1 :  http://www.infosysblogs.com/oracle/2017/09/analytics_and_the_app_1.html

Part3 :  http://www.infosysblogs.com/oracle/2017/10/analytics_and_the_app_3.html



October 29, 2017

Migrate Oracle Hyperion Cloud Applications(PCMCS) on Autopilot!!! Using EPMAutomate




What is EPMAutomate?

The EPMAutomate utility helps automate administrators' activities for EPM Hyperion cloud products.


What is Migration and why is it required?

Migration of an application in the cloud is required to move the application from a Test instance to a Production instance and vice versa. Manual migration of an application across instances can take hours; it can be automated using the EPMAutomate utility, which reduces the time from hours to minutes. Migration of an application includes Data, Metadata, Security, Data Management, Reports, etc., i.e., every artifact of the application is migrated using the EPMAutomate utility without manual intervention. Migration can be server to cloud or cloud to cloud. It is always preferable to move the backup artifact from the server to the cloud. Here, the example is demonstrated with a PCMCS application.

Migration methods:


  1. Server to Cloud

  2. Cloud to Cloud


  A) Steps to automate Server to Cloud Migration from the daily backups process using the EPMAutomate utility in PCMCS


  1. Log in to the PCMCS Workspace by entering the Identity Domain, User Name and Password, and click Sign In.



 

2. Delete the existing application, if any, from the instance into which the new application will be migrated and imported from the other instance.

Click Application->Application


3. Click 'X' to delete the application and click Yes.


 

4. Now, modify the attached reusable sample batch script with the relevant URL and user credentials to automate the migration process using the EPMAutomate utility.

Sample Script:


@echo off
rem This script is used to perform on-demand migration of an Artifact Snapshot using
rem server backups, i.e., server to cloud migration
rem Update the parameters: url (target url), Folder (source folder) and SnapshotName as per requirement
rem Please make sure the application has been deleted from the target instance before importing the snapshot into it
SET url=
SET user=abc
SET password=D:\Oracle\password.epw
SET IdentityDomain=
SET Folder=D:\Oracle
SET SnapshotName=Artifact_Snapshot_09_13_2017_1322
SET UploadedSnapshot=%SnapshotName%.zip

call epmautomate login %user% %password% %url% %IdentityDomain%
timeout 10
call epmautomate uploadfile %Folder%\%UploadedSnapshot%
timeout 8
call epmautomate importsnapshot %SnapshotName%
timeout 10
call epmautomate logout



5. Trigger the .bat script: it uploads the relevant snapshot to the cloud and imports it.


 

 

6. Once the migration completes, check the migration report in the PCMCS workspace.

Click Application->Migration->Reports->Migration Status

  

B) Steps to automate Cloud to Cloud Migration from the daily backups process using the EPMAutomate utility in PCMCS



  1. Follow steps 1 to 3 from section A

  2. Use the attached script to migrate the artifact from one cloud instance to another.

     

    The 'copysnapshotfrominstance' command is used to move an artifact snapshot across instances in the cloud.

     

    Sample Script:

    @echo off
    rem This script migrates an Artifact Snapshot from the Test to the Production instance
    rem Update the following parameters based on requirement
    rem (replace mdeshmukh in the copysnapshotfrominstance call with the source-instance user name)
    SET url=
    SET FromURL=
    SET user=
    SET password=D:\Oracle\password.epw
    SET IdentityDomain=
    SET SnapshotName=Artifact Snapshot

    call epmautomate login %user% %password% %url% %IdentityDomain%
    timeout 10
    call epmautomate copysnapshotfrominstance "%SnapshotName%" mdeshmukh %password% %FromURL% %IdentityDomain%
    timeout 8
    call epmautomate importsnapshot "%SnapshotName%"
    timeout 10
    call epmautomate logout


3. The rest of the steps are the same as in Section A.
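
To run either migration unattended, the batch file can be scheduled with the Windows Task Scheduler; a minimal sketch, where the task name, script path and start time are placeholders:

schtasks /create /tn "PCMCS_Nightly_Migration" /tr "D:\Oracle\migrate_pcmcs.bat" /sc daily /st 02:00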

Reference(s)

https://docs.oracle.com/cloud/latest/epmcommon/CEPMA/epm_automate_command_links_pcmcs.htm#CEPMA-GUID-6BC610D3-03F0-41C4-8C52-FA1EE972D03F



 


 


October 27, 2017

Replace Query tracked Aggregate views of an ASO cube to another cube and aggregate

An Essbase Aggregate Storage (ASO) database is a multidimensional database. Aggregate storage enables dramatic improvements in database aggregation.

Aggregation:
Aggregation is the process of calculating and storing data for aggregate views to enhance retrieval performance.
For an ASO cube we can perform default aggregation and query tracking based aggregation.
Aggregate view:
An aggregate view is a set of higher-level intersections of dimensions in an ASO cube.

Aggregate views can be seen using the Design Aggregation Wizard or by a MaxL command:
query database AppName.DBName list existing_views;
AppName is the application name and DBName is the Essbase cube name.
The views will appear as below:

View ID    View levels    Outline ID
0          0,0,0,1/2      1
2          0,0,2/3,0      1

View ID - a numeric identification of a view
View levels - the level combination for each aggregate view
Outline ID - a numeric ID of the ASO outline associated with a view

Query tracking aggregation for Essbase ASO:
Query tracking is the process of tracking cube usage patterns (aggregate views), allowing for aggregations that are more efficient for the users' query patterns.
This blog explains how to capture and aggregate the views on an ASO cube using the view file that was tracked on another ASO cube.

Tracking Queries:
Enable query tracking on the database prior to usage of the cube for reporting. It can be done in two ways:
1. In the Essbase Administration Services console, select the Essbase cube on which you have to track queries, right-click on the database and enable query tracking.
2. Use the MaxL command: alter database AppName.DBName enable query_tracking;

Generating the view file:
Force-dump the tracked usage patterns (queries) to a file, giving an appropriate file name of not more than 8 characters.
This can be done using the Design Aggregation Wizard or by a MaxL command: execute aggregate selection on database AppName.DBName based on query_data force_dump to view_file SAMPLE;

View file structure:

Number of views
Outline ID
View ID - default level 0 view
View size
View ID1
View size1
View ID2
View size2
.
.
View IDn
View sizen

Consider aggregate view file below:
5
2984637188
0
1.000000000000000
3773467
0.000000383467902
3
0.289642138867758
8
0.686095536512878
11
0.115267836885590

In this view file, line 1 represents the total number of aggregate views (in the above example, 5).
Line 2 represents the outline ID, line 3 represents the default aggregate view (i.e., the view that is created when all the dimensions are kept at level 0) and line 4 represents its view size.

Add views of one DB to another:
To aggregate/add views from one DB to another, the outlines (total number of dimensions and the hierarchies) should be the same.
In this case we have chosen SAMPLE1 and SAMPLE2, both having the same outline structure.
Consider the below view files,

View file of SAMPLE1 cube: SAMPLE1.csc

2
2984637188
0
1.000000000000000
3773467
0.000000383467902

View file of SAMPLE2 cube: SAMPLE2.csc

4
2984899332
0
1.000000000000000
3
0.289642138867758
8
0.686095536512878
11
0.115267836885590

Adding views from one (cube's) view file to another (cube's):
Retain the outline ID from the SAMPLE1 file, add the three views from SAMPLE2 (View IDs: 3, 8 and 11) and update the total number of views in the file. Below is the new view file having the views from both SAMPLE1 and SAMPLE2.
SAMPLE1.csc
5
2984637188
0
1.000000000000000
3773467
0.000000383467902
3
0.289642138867758
8
0.686095536512878
11
0.115267836885590

Aggregating the Essbase database using the newly created view file:
Place the modified view file in the application folder, and select the modified view file for aggregation using the Design Aggregation Wizard.
Or use the below MaxL command to aggregate:
execute aggregate build on database SAMPLE1.SAMPLE1 using view_file 'SAMPLE1.csc';
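
Putting the pieces together, a minimal MaxL sketch of the end-to-end flow using the sample names above (run the statements in a MaxL session):

/* enable query tracking before users start reporting */
alter database SAMPLE2.SAMPLE2 enable query_tracking;
/* dump the tracked query patterns to a view file (name of 8 characters or fewer) */
execute aggregate selection on database SAMPLE2.SAMPLE2 based on query_data force_dump to view_file SAMPLE2;
/* after merging the views into SAMPLE1.csc, build the aggregations on the target cube */
execute aggregate build on database SAMPLE1.SAMPLE1 using view_file 'SAMPLE1.csc';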

Auto Finance - Transforming the Collection Process

Traditionally, auto-finance (or, for that matter, any!) organizations start with multiple function-specific applications recommended by the technology and business stakeholders, who are influenced by current fads and trends, budget, sales pitches, relations, etc. With an ever-changing technology landscape, changes in business processes and new decision makers coming in over time, multiple systems make their way into the IT landscape of the company, leading to the challenges below:


  • Multiple systems used for the Collection process

  • Manual processes impacting user efficiency and productivity

  • Higher turnaround time for collection of dues

  • Inability to handle large volumes

  • Issues with reconciliation of data across different stages of the process

  • Difficulty consolidating information for reporting and analytics, impacting assessment of business process efficiency

  • Comprehensive information not available to service representatives, leading to high response times


 


Any auto-finance organization ought to have a robust solution to support its Collection processes. Agnostic of technology, the system supporting the Collection process ought to provide:


  • Capability for agents to contact customers to execute collection activities

  • Tracking the outcome

  • Viewing work summary and account information

  • Managing customer preferences

  • Recording account conditions pertaining to legal proceedings

  • Skip tracing

  • Repossession

  • Managing documentation and correspondence related to the account

  • Adherence to government (statutory and state-specific) rules and regulations



A verticalized solution offering provides the technology backbone to address the single integrated view problem and is customized for specific business scenarios.


 


 

Irrespective of the underlying technology, like Microsoft Dynamics, Salesforce etc., the basic architecture of a Collection process system is as below:

 


The efficiency and effectiveness of Collection processes in auto-captive finance directly impact the bottom line of the organization. Identification of delinquent accounts at an early stage and handling them with the appropriate collection activity can avoid charging off the loan at a later stage. Proper automated processes can enable:


  • Improved penetration into workable collection account population in alignment with the strategic vision

  • Single system for assigning and working on accounts & monitoring the progress / updates from External systems / agencies

  • Increased Agent efficiency due to process enhancements and availability of additional information

  • Higher customer satisfaction due to better First Call Resolution supported by additional data availability

  • Less Manual errors due to process automation

     

    Case Study

  • A large auto-captive financial services company based out of North America proposed to redesign its collection processes to leverage maximum output using a CRM application. The existing CRM application being used by the Collections department was conceptualized and designed in the last decade, and since then there has been huge growth in the portfolio and changes in market dynamics. In order to address these changes, a redesign of the Collection processes and application was pursued. The platform has resulted in annual benefits to the tune of USD 1.1 million, with the following gains:


  • Improved penetration into workable collection accounts population leading to increased collection rate by 2%-5%

  • Single system for assigning and working on accounts & monitoring the progress / updates from External systems / agencies

  • Repossession vendor integration: Annual Opex savings estimated to be $230K

  • Reduction in callable volume, leading to lower agent hours and headcount

  • Decrease in outsourcing skip agency fees

  • Availability of consistent transaction history information across systems 

  • Siebel data used by Dialer resulting into more effective collection calls



October 24, 2017

Customizations in BI Oracle cloud - OTBI

 

OTBI (Oracle Transactional Business Intelligence) is a real-time transactional reporting solution for Oracle Fusion applications. Oracle HCM Cloud, CX Cloud, SCM Cloud and Cloud ERP all have Oracle Transactional Business Intelligence (OTBI) built into the application. OTBI leverages advanced functionality of both BI and Oracle ADF to analyze data in real-time databases for reporting and other BI requirements.

 

OTBI Subject Areas:

OTBI subject areas are prebuilt metadata content for BI reporting, seamlessly integrated into Oracle Fusion applications. OTBI organizes BI reporting data elements such as dimensions and facts based on the business function into RPD subject areas. Subject areas are made up of logical data structures combined together, and each of them is associated with specific application modules (such as General Ledger, Payroll, Payables, etc.).

Since OTBI analyzes and reports on real-time transactional data, users will notice "... Real Time" suffixed to most of the RPD subject area names. Also, by default, caching is disabled in the OTBI environment in order to support real-time reporting. Even though cross-subject-area reporting is possible with OTBI, it is not recommended in view of report performance on real-time data.

 

OTBI Security:

OTBI inherits the Cloud apps' user roles and security profiles. Which OTBI subject areas and what data are accessible is determined by the Oracle user security profile. Also, the data that can be viewed is based on the transactional database schema with user-level data security applied.

 

Customizations in OTBI:

Oracle allows the following customizations in OTBI:

 

  • Create custom subject areas (only for Sales Cloud)

  • Oracle Flexfields



(Figure: OTBI customization flow)

Custom subject areas:

Users can create a custom subject area for building reports using Application Composer. Similar to on-premise OBIEE subject areas, custom subject areas in Fusion apps can consist of entities, columns and measure columns as per the business requirements and the domain of the client's business. Users can group components from a prebuilt subject area to form a custom subject area.

The creator of such a subject area can decide the list of columns, data leveling and security of the subject area. The published CSA can be deactivated or activated later on, based on business needs. The CSA will be listed with the other predefined subject areas in the analysis creation list.

Published custom subject areas can be edited and then republished whenever changes are required. Modifying a custom subject area will not impact the reports created on top of it unless specific columns used in the reports are modified.

Custom subject areas are secured by access rights based on the Fusion apps user roles. For the role name added, read access is granted by default. If read access needs to be revoked, "No Access" has to be selected explicitly.

 

Note: Oracle facilitates the custom subject area creation option only for Oracle Fusion Sales Cloud as of now. Unlike traditional OBIEE, modifying the predefined RPD subject areas is restricted in OTBI.



Flexfields:

Flexfields in Oracle Cloud are placeholder fields that hold key information; they can be preconfigured to capture additional information on a page or field and are configurable during implementation. Regardless of type, flexfields have a similar structure. Flexfields are also preserved during application upgrades.

Flexfields are formed of sequenced segments. A segment has a value set, which stores the values allowed to be used in the segment, and the value set stores the data from transactions. Flexfields are integrated with the Fusion application's transactional database, so security is governed by the Oracle user security profile.

All flexfields required to be available in the BI subject areas need the "BI Enabled" property checked. The RPD subject areas for these flexfields differ based on their type; for example, GL-related flexfields can be found only in General Ledger subject areas. Users can customize and configure on which presentation table of a specific subject area these new flexfield columns are to be listed. Upon publishing, these flexfields become available for analysis creation like other predefined columns.

Some of the Fusion Applications' columns are not exposed for analysis or reporting purposes. Only common business-process-related columns are exposed, and the predefined analyses and reports are built based on these. For additional customizations over and above those discussed here:


  • Users can reach out to Oracle through a Service Request for customizations.

  • Additionally, BI Publisher can be used to build a SQL-based data model for report building. The SQL is executed against the Fusion Apps database, which requires knowledge of the underlying Fusion schema but allows more control over data reporting (a hedged example follows).
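
As an illustration of the BI Publisher data model approach, here is a minimal SQL sketch; the views and columns (hr_employees_v, hr_departments_v, etc.) are hypothetical placeholders, not actual Fusion Apps objects:

-- hr_employees_v and hr_departments_v are hypothetical views
SELECT emp.employee_number,
       emp.full_name,
       dept.department_name
  FROM hr_employees_v   emp
  JOIN hr_departments_v dept
    ON emp.department_id = dept.department_id
 WHERE emp.status = 'ACTIVE';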


- Xavier N Philip

October 20, 2017

Multi-Currency Reconciliation made easier in ARCS

 

Account Reconciliation Cloud Service (ARCS) is a tool that helps streamline a company's account reconciliation process by enforcing standardization and operational efficiency through a centralized repository of account reconciliations across all segments and accounts.

 

ARCS supports both single-currency and multi-currency environments. A single-currency setup is common in ARCS; hence, we will talk about the multi-currency environment setup.

 

Enabling an environment for multi-currency is done using the currency bucket setup options. Three currency buckets are available: Entered, Functional, and Reporting.

  • Entered--Currency in which the transactions have occurred (Posted Currency)

  • Functional--Currency in which the company operates.

  • Reporting--Currency in which the financial statements are reported

 

Scenario 1: All three buckets are enabled

The legal entity is in the UK, the transaction happens in Germany and the company's headquarters is in the USA. So, the functional currency would be GBP, the entered currency would be EUR and the reporting currency would be USD.

 

Scenario 2: Functional and Reporting Buckets are enabled

A non-US legal entity and the company's headquarters in the USA. In this case the functional currency would be the same as the local currency and the reporting currency would be USD.

 

Scenario 3: Functional and Reporting Buckets enabled for one profile and only Reporting bucket enabled for another

One GL account is associated with a US entity and the same GL account is associated with a non-US entity (assuming that the reconciliations are being performed at the account-entity combination). In this case, the GL account-US entity profile will have only the reporting bucket enabled, while the latter will have both the functional and reporting buckets enabled.

 

A legal entity has to reconcile in the reporting currency since the financial statements are reported in this currency, but before that it has to be made sure that the balances reconcile in the functional currency as well (or in both the entered and functional currencies in case both buckets are enabled), as these are converted to the reporting currency at a later stage for financial reporting. Also, in some cases, reconciling in the functional/entered currency is required to fulfill local regulations and statutory requirements.

 

ARCS allows all three buckets to be loaded from the GL through data integration. The currency rates are loaded after that. While performing reconciliations, the amounts for transactions explaining the balance are entered in the lowest enabled currency bucket (the currency in which the transaction actually occurred). The functional and reporting currency balances are then calculated based on the currency rates loaded in the system for that month.

 

In a multi-currency environment, the Balance Summary for a reconciliation looks like the below. It can be considered as multiple representations of the same balance.

 

 

Stepwise Process:

  1. Enable the applicable currency buckets in ARCS: Tools->System Settings->Currency->Currency Bucket

     

    Enabling multiple currency buckets in system settings enables those currencies (Currency Bucket 1 for Entered, Currency Bucket 2 for Functional and Currency Bucket 3 for Reporting) for all profiles by default. If they are not required for all profiles, the setting has to be explicitly disabled for each ineligible profile by editing the profile.

     

  2. Set up data integration for the entered currency data files:

  • Add Category Mapping for Entered bucket

  • Setup Import Format

  • Setup Location

  • Create Data Load Rule for Entered Category

     

  3. Load data in ARCS.

     

  4. Load currency rates in ARCS.

     

     

Challenges:

  • Check with your source system (EBS, Oracle Fusion, etc.) admin whether the local currency and reporting currency sets of books are different. Also, check if there are standard reports available for extracting local currency data files.

  • System restrictions: the system does not allow editing the reporting currency field when adding an amortizing/accreting transaction.

 

Tips:

  • If there is a plan to upgrade an existing single-currency ARCS environment to multi-currency at a later stage, make sure your data files for the existing currency have a currency column (see the sample layout below). This will avoid the extra effort of setting up data integration again for multi-currency data files.

  • While adding mapping details under the import format for the data file, add a 'currency row' instead of a currency dimension.
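
For illustration, a balance file with the currency column might look like the following; the column order and sample values are assumptions and must match your own import format:

Account,Entity,Currency,Amount
1110,UK01,GBP,125000.00
1110,DE01,EUR,98000.00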

October 16, 2017

How to handle Foreign Currency Translation Reserve (FCTR) in HFM - PART I


One of the functionalities that gives a tough time to many who work on consolidation and reporting is the calculation of the Foreign Currency Translation Reserve, or FCTR for short. It is important to understand what FCTR is, how it is calculated and the reporting requirement, so it can be properly handled in HFM.

For better understanding, and to spend appropriate time on the topic, I will break it into 2 parts.

  • In part I, I will cover the functional part of FCTR and try to give you a simplified version of the functionality of FCTR,

  • while in part II, I will cover how it can be handled in HFM.

(In the current discussion I will refer to the IAS 21/IFRS requirements. Most of the other accounting standards will have more or less similar requirements, with or without some local flavor.)

What is FCTR and why is it required to be calculated?

When a company reports its results which:

  • includes transactions in a foreign currency (a foreign currency can be defined as a currency different from the reporting/presentation currency, also referred to as FC in further sections),

  • or consolidates the results of its branches/subsidiaries/associates or JVs which record transactions in a currency different from the currency in which the reporting entity reports its transactions,

then such transactions are required to be translated in the reporting currency.

These FC transactions might be carried out throughout the year, and there will also be some FC balances carried forward from previous years. So, on the reporting date, the question arises: at what rate should the transactions/balances be converted?

As per IAS 21, a foreign currency transaction should be translated at the spot rate as on the date of the transaction. Further, for subsequent period reporting, IAS 21 (sec. 38 onwards) states that:

  • Revenue and expenses accounts SHALL be translated at average rate.

  • All monetary items SHALL be translated at EOM rates, e.g., cash/bank balances, receivables/payables, etc.

  • All non-monetary items are translated at the rate prevailing on the date of the transaction. But for translating to the presentation currency, the EOM rate SHALL be used, e.g., fixed assets, long-term borrowings, etc.

The first 2 points are pretty straightforward, as the translation/reporting rate and the transaction rate are more or less the same. (Any deviation from this is kept out of the discussion to avoid confusion.)

The issue arises when some items are translated into the reporting currency at the rate of exchange as on the date of the transaction (also referred to as the historical rate), as this rate will be different from the rate as on the reporting date (the EOM rate). This difference needs to be identified and accounted for through the income statement/equity.

This difference, which arises due to different rates applied to a single transaction/item/account, is nothing but the Foreign Currency Translation Reserve, or FCTR. To put it in the simplest words possible, FCTR is the difference between the translated values of any asset/liability at the EOM rate and at the historical rate.

Example: Let us take an example to understand FCTR further.

  • You will observe that in block 1 the asset has been purchased in 2 parts: 1000 INR in the previous year and 500 INR in the current period.

  • The total assets at the EOM rate come to 29.56 AUD.

  • While if the same is valued at the rate on the date of purchase, the value is 29.90 AUD.

  • The difference of 0.34 AUD is due to the change in the translation rate.

  • In the example the value of the asset has gone down, as the AUD has gone weak compared to the INR.

  • So the difference of 0.34 is FCTR and will appear as a liability in the balance sheet.
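
In formula form, using the figures above:

FCTR = value at EOM rate - value at historical rate = 29.56 AUD - 29.90 AUD = -0.34 AUD

which is consistent with the 0.34 appearing on the liability side of the balance sheet.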

I hope I have been able to simplify what FCTR is, so I can use the above example in part II of this blog to explain how to handle it in HFM. Please do leave your comments and suggestions so I can incorporate them in the next part.

Until then, keep consolidating! :)


BICS - A Cloud Offering from Oracle


What is BICS?

We are all familiar with traditional BI services, and our businesses are rapidly increasing their investments in cloud services. To stay abreast of changing business demands, Oracle launched its cloud offering, known as BI Cloud Service.

Now, the question that will come to everyone's mind is: what is Oracle BI Cloud Service (BICS)?

Here it is,


"Oracle BI Cloud Services, an eminent cloud offering from Oracle, a platform for creating powerful business intelligence applications, thereby enabling users from the workgroup to the enterprise."


So, is BICS in any way related to on-premise OBIEE? The answer is yes: BICS is an evolution of the on-premise Oracle Business Intelligence Enterprise Edition (OBIEE) solution, a new platform built around standard OBIEE.


Features of BICS?

  • Query on premise database and other cloud databases
  • Offers presentation services to create analysis, dashboards and visualizations
  • Embed JavaScript/HTML into Analyses and Dashboards
  • Enhanced data visualization
  • Multiple data loading options with connections to on premise database
  • It can be accessed through mobile without any extra programming


Why Choose BICS?

Oracle's business intelligence technology was brought to the cloud with BICS: no capital expense, a modest monthly subscription, and quick implementation. BICS has an edge over on-premise business intelligence technology in that it is the first to receive new and updated features and enhancements, as Oracle follows a cloud-first approach. Using BICS is advantageous for functional users because of its easy self-service administration, while analysts can simply upload a spreadsheet from the desktop without any assistance from IT or an enterprise-wide application.


A comparison of BICS with on-premise OBIEE:

BICS                                                        | On-premise OBIEE
Does not have features like Agents, BI Publisher            | Agents (iBots) and BI Publisher features are available
BICS version of the RPD does not support physical           | Supports features like physical source aliasing
source aliasing                                             |
Two environments available: Pre-Prod and Prod               | No limit on environments
Does not support LDAP or MSAD                               | Supports integration with LDAP or MSAD


Is it beneficial to migrate from on premise OBIEE to BICS?

Migration will allow administrators as well as customers to enjoy the new features and functionality. But the following needs to be taken care of while migrating from on-premise OBIEE to BICS:

  1. When migrating to BICS from OBIEE 11g, note that subject areas and reports cannot be disabled on BICS alone; they also need to be disabled in on-premise OBIEE.
  2. In order to match the subject areas and reports on premise with those available in BICS, it is advisable to upgrade to OBIEE 12c, since the two share a lot of similarities:
  • The version of the repository used is the same
  • The web catalog, reports and dashboards can be migrated using a single file (with the .bar extension)
  • Application roles follow the same naming convention, avoiding the need to maintain a separate application role list

Overall, it is a sensible approach for on-premise OBIEE users to move regular reporting to the cloud in order to make use of new functionality and features, data visualization, and reductions in infrastructure and license costs.



October 10, 2017

An account of Contract based Project Billing in Fusion PPM Cloud & its comparison with EBS Projects

1.      Introduction

One of the most significant changes in Oracle Fusion Project Portfolio Management (PPM) Cloud compared to Oracle E-Business Suite (EBS) Projects is the shift from a project-driven billing model to a contract-driven billing model.

This new contract-based project billing involves integration between Oracle Fusion Project Billing and Oracle Fusion Project Contracts. While Fusion Project Contracts takes care of the contract terms, conditions and billing requirements, the Fusion Project Billing piece handles invoice and revenue generation and processing.

So a contract now becomes mandatory in Fusion Cloud to do billing and revenue recognition. In EBS, it was optional to use the Project Contracts module, and agreements were used in place of contracts.


2.      Contract Structure and its components

A contract in Fusion Cloud is divided into contract header and contract lines.

2.1  Contract Header

The header contains

  • Basic contract information: Contract Number, Name, Description, Start Date, End Date, Currency, Status, Amount, etc.

  • Party information: Customer and Supplier and their Contacts, Bill-to & Ship-to accounts, etc.

  • Billing information

The billing information contains bill and revenue plans, which are a new concept introduced in Fusion Cloud. These are user-defined and contain a set of instructions on how to bill and recognize revenue for a customer.

Bill plans are associated to predefined billing methods - Amount (Event) Based Invoice, Bill Rate Invoice, Burden Schedule Invoice, Cost Reimbursable Invoice, Percent Spent Invoice and Percent Complete Invoice.

Similarly, revenue plans are associated to seeded revenue methods - Amount (Event) Based Revenue, Bill Rate Revenue, Burden Schedule Revenue, Cost Reimbursable Revenue, As Billed Revenue, As Incurred Revenue, Percent Spent Revenue and Percent Complete Revenue.

The availability of such a wide range of billing and revenue methods makes the billing functionality very strong in Fusion PPM Cloud.

In EBS, the same was done using Distribution Rules - Event/Work/Cost combinations.

 

2.2 Contract Lines

Below the contract header there can be one or multiple contract lines. Each contract line can select its own bill and revenue plan combination. This makes it possible to process invoicing and revenue in different ways by using multiple lines on the same contract.


2.3 Project/Task/Funding Association

The association to project/task and funding also happens at the line level, making it possible to implement various scenarios such as one contract-one project, one contract-multiple projects (multiple lines, one project per line) and multiple contracts-one project.

In EBS Projects, agreements were linked to projects/tasks in different ways: (One Customer-One Agreement), (Multiple Customers, One Agreement per Customer), (One Customer, Multiple Agreements), (Multiple Customers, Multiple Agreements per Customer).

In Fusion Cloud, funding is not mandatory for a project/task except in the Percent Spent case. In EBS, baselining the funding was a compulsory step before one could generate an invoice or revenue.

So Fusion Cloud project billing offers much more flexibility compared to the more rigid project-agreement-funding relationship in EBS.


3.      Important New Features

3.1 Bill Set

In Fusion Cloud, an important new bill set feature on the bill plan enables transactions with the same bill set number belonging to different bill plans to be grouped together into a single invoice. So you can send a single summarized invoice to the customer.

3.2 Release Invoice Automatically

In Cloud PPM, you can now release the invoice automatically by simply enabling a checkbox at the contract header level, so the invoice gets created in Released status directly. In EBS, this was not standard functionality and an Automatic Invoice Release Extension would need to be written.


4.      Contract Approvals

The contract approvals can be configured for auto-approval, serial and parallel approvals using BPM work list in Fusion Cloud.

  • Auto-approval: to bypass approval and approve automatically

  • Serial approval: to route the approval sequentially through a series of people

  • Parallel approval: to route the approval to many people in parallel; it is approved when all people have approved


5.      Contract Amendments

Contract amendments are a full-fledged feature in Fusion Cloud. Retroactive contract amendments are possible using an amendment effective date, with transactions marked for automatic adjustment, thereby reducing manual effort. The amendments can also follow the same approval workflow as the contract.


6.      Project Invoicing

6.1 Draft Invoice Preview/Print

In Fusion Project Billing, you can preview the draft invoice and even print it, so it can be checked in Projects before being sent to Receivables for final billing. There are seeded Business Intelligence Publisher (BIP) reports available to be used directly, or you can create custom BIP reports as well.

6.2 Taxation

Another change in invoicing is on the tax side. In EBS, tax was not calculated on the project invoice; only when the auto-invoice import to AR was done did tax get calculated and appear on the AR invoice.

In Fusion Cloud, tax calculations can be viewed on the project draft invoice itself. The tax rules are set up in Fusion Cloud Tax, and a tax code can be included at the Contract, Contract Line, Expenditure Type and Event Type levels. The tax code can be changed at the invoice line during invoice review, if required.

The process then follows the same route of draft invoice approval, release, transfer to Receivables and update from Receivables after the auto-invoice import.


7.      Project Revenue Recognition

7.1 Revenue Processing

On the revenue side, there is a major change in how revenue is processed in PPM Cloud. In Oracle EBS, you could review and release revenue lines in Projects, and the draft revenue could be deleted and regenerated if it was incorrect. In Fusion Cloud, there is no review/release process any more.

7.2 Accounting

On the accounting side, Oracle Projects AutoAccounting from EBS is completely removed in Fusion Cloud. Fusion Subledger Accounting (SLA) is the only source of accounting rules now. For project revenue, the Projects SLA is used to derive the accounting, while for invoices it is derived from the Receivables SLA when accounting is generated in Receivables for a project-sourced invoice.

8.      Conclusion

This is an attempt to provide an overview of the way contract-based project billing takes place in Fusion Cloud PPM, and to present a relative account against Oracle EBS. There are fundamental changes, process changes as well as many new features in Fusion Cloud PPM, which endeavor to make the functionality stronger and more flexible in handling customers' complex contractual/billing/revenue recognition needs.

 

 

October 5, 2017

Understanding the "Pulse" of HR


Evolution of Human Resources 

In the past, the role of the Human Resources (HR) function was to help people and organizations grow. It revolved around administering the employee life cycle from hire to retire, primarily managing attendance, training and compensation.

The HR team used to perform functions that added limited value to the organization: mundane tasks and operational reporting, such as the percentage of performance reviews completed, the number of employees hired, the number of managers trained, etc. In large or global organizations where HR data resides in various disparate systems, HR at times faced issues due to inaccurate reporting of headcount data - one of the important decision data points for business and finance. All of this resulted in HR being treated as a merely administrative function.

Over a period of time, the job of HR evolved into managing employee aspirations, organizational behavior, employee motivation, etc. Performance objectives and measurable assessments became part of the KPIs. In recent times, there is an expectation for HR to be a "Business Partner"; HR professionals must see themselves as business people who specialize in HR and as partners who can significantly contribute towards the organization's business goals.



Expectation from Modern HR

It is time for HR to reassess its role and contribute to the successful organizations of the future. In today's world, HR partners need to be agile, creative, and innovative. An HR Business Partner is not only expected to be aware of the organization's vision and align the workforce to its goals, but also to have an end-to-end understanding of the organization's business. Beyond knowing the specifics of how the business works, they are expected to understand financials and be aware of the performance of peers and competitors. This understanding is vital for HR to be able to work closely with business leaders.



Getting ready for Next Gen HR

In today's world, there is a change in outlook to recognize employees as Human "Capital" instead of Human "Resources". HR needs to be more digital than manual, more proactive than reactive, and more strategic than operational. The expectation is to be predictive and analytical, and to analyze the data behind the data. To be ready for the Next Gen HR business partner role, it is critical to identify key performance indicators, measure them effectively, and assess them vis-à-vis industry standards.


Today, HR partners need to ask a few strategic questions, such as:

- Are we looking at the data beyond the HR reports and dashboards?
- How am I doing vis-à-vis my peers and competition?
- How do I keep track of my performance compared with industry benchmarks?
- How do I quantify performance across various HR functions?
- How do I identify problem areas and measure actual improvement?



The Problem - What to measure and How to measure?

As mentioned by HR Folks International, human capital is largely intangible and difficult to measure as a component of a company's business success. It is therefore important to measure and analyze the priorities so you can recognize areas of improvement and identify opportunities.

In my opinion, effective measurement of HR KPIs is critical to the success of any organization. When it comes to measuring outcomes or KPIs, my observation has been that HR has limited tools compared to other divisions within the organization. There are well-established frameworks to measure the performance of departments in an organization (e.g., Finance, Sales, or a production unit) and correlate their impact to the business. However, when it comes to measuring the performance of HR, it becomes subjective and quite a challenge to quantify the outcome and its tangible impact to the business.

As a result it becomes imperative to not just measure and quantify, but also identify those indicators for measurement which actually contribute towards the organization's vision and goals.



Treat the cause and not the symptom

It is not only important to look at the scorecard and analyze the results; it is also important to look at the underlying "data behind the data" to trace the root cause, identify areas of improvement, and take corrective action.

For example, let's take the business function of Talent Acquisition and analyze the recruitment metrics and scorecard for KPIs like "Time to Hire" and "Cost to Hire".

Key Questions -

Is my organization taking more time to hire employees, thereby having a direct impact on employee productivity and top-line revenue?

Is there a higher cost to onboard an employee, thereby impacting the company's bottom line and eroding its edge over the competition?

Conclusion - After analyzing the scorecard in the HR Pulse solution, it looks like I may have to review the end-to-end hiring process and remediate the bottlenecks that are leading to higher turnaround times to onboard employees and, thereby, lost competitive advantage.

Impact - After the re-engineered hiring business process is implemented, HR Pulse provides the ability to generate "Before" and "After" scorecards for the same KPI and demonstrate continuous performance improvements and tangible impact to the business.




Infosys HR Pulse - An Infosys HCM Scorecard Solution

Infosys HR Pulse is an analytics solution that provides quantitative and visual measures of performance indicators in the form of a scorecard. The solution defines the metrics and KPIs that are critical for an organization, with the ability to track the indicators against the organization's targets and industry benchmarks. It helps measure and track progress, highlights improvement and deterioration trends, and gives HR leadership an accurate picture of their performance. The purpose of Infosys HR Pulse is to enable HR partners to monitor what matters most, so that they can focus on meeting strategic objectives.

The key features of HR Pulse are:

- Generation of metrics pertaining to relevant HR KPIs, empowering HR partners to formulate strategy, take corrective action, and improve measures

- Logical grouping of HR metrics and key performance indicators within business functions

- Ability to slice and dice the data and display trend analysis across entities (business units, locations, departments, etc.)

- Comparison of various HR metrics against the organization's internal targets and industry benchmarks

- Introspection and alignment of actions with strategy for continuous performance improvement

- Primary focus on improving the performance and contribution of the unit HR manager, all the way up to the macro-level decision making of senior HR management

- Enhanced decision-making capability for leadership by highlighting areas that need attention

- Enablement of HR to tie corporate objectives to the performance of every employee and department in the workforce

 


HR Analytics as the foundation - Infosys HR Pulse

HCM analytics is the foundation of all strategic HR initiatives; it enables the important trend analysis that supports decision making. In today's digital world, one would think that HR analytics would be a regular feature by now, but contrary to this belief, many organizations are still exploring the possibilities of using HR analytics.

The Infosys HR Pulse offering can be part of an HCM implementation on the cloud or on premise. Deployed along with the HCM implementation, it lays the foundation for HR analytics.

The solution can also be deployed for clients who have already implemented HCM on the cloud or on premise, enhancing their HR analytics capabilities to help identify functions that are performing well and areas that need improvement. The built-in adapters that come with the Infosys HR Pulse solution provide seamless integration with leading on-premise ERP and cloud HR solutions.

I believe that once HR is able to effectively measure and quantify its outcomes, it will make a positive impression within the organization and contribute to the organization's goals and vision. I think this will go a long way toward establishing HR as a key business partner and strengthening its value in the organization - a standing it truly deserves!

  

October 4, 2017

Product Provenance and Supply Chain Transparency using Oracle Blockchain Services - An Infosys offering!

Oracle recently joined the Hyperledger consortium to offer its own blockchain cloud service. In the next few days (Oracle OpenWorld, October 2017), Oracle will bring to the fore the next big offering from its stable: Oracle Blockchain Cloud Services (BCS). The Hyperledger Fabric-enabled Oracle blockchain service on the cloud will help customers build new blockchain-based applications and help existing Oracle customers extend their SaaS, PaaS, IaaS, and EBS applications. As a Cloud Elite partner of Oracle Cloud services, Infosys is delighted to offer the first completely integrated SCM cloud-blockchain solution, solving some of the most complex and challenging supply chain puzzles.

My earlier white paper published on infosys.com discussed at length the challenges blockchain faces today, ranging from integrating blockchain with ERP, to interoperability between established enterprise platforms (ERP systems) and blockchains, to security and compliance for enterprises interacting and sharing solutions and transactions. Oracle Blockchain Cloud Services is a breath of fresh air for these known issues.

 

The Infosys Oracle Supply Chain competency team, along with the Infosys Blockchain competency, researched and identified key supply chain transparency issues plaguing existing CPG and industrial customers and developed a use case to solve them using blockchain. Using Oracle SCM Cloud, PaaS applications, and Oracle Blockchain Cloud Services, Infosys developed a product provenance solution and leveraged smart contracts to improve supply chain compliance. The use case has three main entities (Customer, Distributor, and Manufacturer), all connected to the central Oracle Blockchain Cloud Services network.

 

The use case traces the origin of the product from Manufacturer to Distributor to Customer and tracks the change of ownership using Oracle Blockchain features and the Oracle SCM Cloud and PaaS application architecture. This provides not only cradle-to-grave product provenance for a produced lot, but also complete supply chain transparency and availability visibility across supply chain echelons, which was impossible without blockchain. Product provenance tracking using Oracle BCS helps reduce counterfeit products in the supply chain and helps trace global availability.

The solution also uses a smart contract (chain code) to validate the authenticity and compliance of the product for its country of origin, as required by the customer. PaaS applications developed on the Oracle SCM Cloud architecture provide user interfaces that mock up the Distributors' and Manufacturers' business systems to update product details into the Oracle blockchain, truly using the strengths of Oracle technology: easy integration capabilities with SOA and APIs, and the availability-everywhere promise of the cloud. Before being sent to the customer, the product is inspected in a PaaS application against the blockchain information using the smart contracts; on passing inspection, receiving and putaway complete successfully. If the smart contract compliance check against the blockchain fails, the receipt is rejected by the customer in SCM Cloud. The entire integration and validation is touchless and paperless, based entirely on blockchain information, which is immutable and visible to all network participants. The Oracle solution is simple to use and extends existing cloud investments; it took Infosys less than a week to build this solution using Oracle Blockchain Cloud Services.
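To make the compliance check concrete, here is a minimal Python sketch of the kind of validation the chain code performs. All entity names, fields, and the allowed-country list are hypothetical; real Oracle BCS chain code would be written against the Hyperledger Fabric chaincode APIs, not plain Python.

```python
# Conceptual sketch of the country-of-origin compliance check performed by
# the smart contract (chain code). All names and fields are illustrative.

# Ledger entries keyed by lot number: provenance records written by the
# Manufacturer and updated on each change of ownership.
ledger = {
    "LOT-1001": {
        "product": "Widget-A",
        "country_of_origin": "DE",
        "owner_history": ["Manufacturer-GmbH", "Distributor-Inc"],
    },
}

# Countries the customer accepts for this product (hypothetical contract term).
ALLOWED_COUNTRIES = {"DE", "US", "JP"}


def inspect_receipt(lot_number: str) -> bool:
    """Validate an inbound lot against the blockchain record.

    Returns True (accept receipt, proceed to putaway) only if the lot
    exists on the ledger and its country of origin is compliant.
    """
    record = ledger.get(lot_number)
    if record is None:
        return False  # unknown lot: possible counterfeit, reject receipt
    return record["country_of_origin"] in ALLOWED_COUNTRIES


# The receiving step would run this check before completing putaway.
print(inspect_receipt("LOT-1001"))  # True  -> receive and put away
print(inspect_receipt("LOT-9999"))  # False -> reject in SCM Cloud
```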

The solution has already started garnering rave reviews and customer interest. Please visit us at Infosys booth #1602 to learn more about the solution, its details, and how it can help your company achieve supply chain transparency using Oracle Blockchain Cloud Services.

If carriers use OTM mobile...

 
Picture a carrier's user tweeting and texting, his fingers poised earnestly on his mobile, his eyes awash with colorful imagery, and his mind dizzy with an abundance of sensory stimuli.

Now imagine the same user hunched up at his desk, his shoulders drooping like the wilting branches of a neglected roadside tree, staring at his monitor, responding to tenders in OTM, his brain nearing the point of self-imposed hibernation.

Before presumptuously advocating the use of OTM mobile for everyone - in this case, the carriers - let us look at the benefits, if there are any.

 

1. Shuffling order movements


A carrier receives multiple shipment tenders from his manufacturing or 3PL partners. The carrier proceeds to accept some of these tenders. Once the tenders are accepted, the carrier sends his trucks to fulfill the transportation services that he has hitherto accepted. At this point, the shipment is frozen, that is to say, the shipment cannot be modified by way of adding or removing orders from it.

But of course this is far from practical. Drivers assigned to the trucks can't make any rearrangements, namely swapping a whole order or part of an order with other drivers. Once the tenders are accepted, the process is quite rigid and inflexible. If it is the same carrier operating in the same lane and his fleet comprises multiple trucks, it should be possible to rearrange, shouldn't it?

Consider this - Truck 'A' with shipment 'A' onboard sets off to its destination. But there is a breakdown and the truck is unable to deliver the goods. The driver fishes in his pockets and pulls out his mobile, pinching the screen to locate his truck. Now, on the map, he can see a couple of other trucks just around the corner. He pulls up the equipment utilization report and finds that the other trucks are underutilized and can carry some of his orders, the ones labeled 'overnight' or 'expedited.' He quickly summons the trucks and unloads a few items from his truck, scanning them judiciously as he hands them over to the other drivers.

Wouldn't it be useful to be able to achieve this in a mobile application?

To make this happen, the carrier's transportation system, OTM or otherwise, must be able to send an actual shipment XML to the source OTM instance that holds the planned shipment. The planned shipment is associated with order movements via the shipment equipment. Once the actual shipment XML is received, agents can be used to identify the order movements that were offloaded from the truck that broke down in transit and remove them from the shipment equipment. Similarly, the planned shipment of the truck that carried order movements in addition to what was accepted in the tender has to be modified by adding ship units to its shipment equipment. A sketch of the underlying reassignment logic follows.
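For illustration only, here is a minimal Python sketch of the offload decision described above, assuming hypothetical truck and order structures. In OTM itself these updates would be driven by actual shipment XML and agents acting on shipment equipment, not application code.

```python
# Minimal sketch of the offload/reassignment decision. Truck, capacity, and
# order data are hypothetical stand-ins for shipment equipment and ship units.
from dataclasses import dataclass, field


@dataclass
class Truck:
    truck_id: str
    capacity: float                 # remaining weight capacity, e.g. in kg
    order_movements: list = field(default_factory=list)


def offload_priority_orders(broken: Truck, nearby: list[Truck],
                            priority_labels=frozenset({"overnight", "expedited"})):
    """Move priority order movements off a broken-down truck onto nearby,
    underutilized trucks that still have capacity for them."""
    remaining = []
    for order in broken.order_movements:
        moved = False
        if order["label"] in priority_labels:
            for truck in nearby:
                if truck.capacity >= order["weight"]:
                    truck.order_movements.append(order)  # add ship unit
                    truck.capacity -= order["weight"]
                    moved = True
                    break
        if not moved:
            remaining.append(order)  # stays with the broken truck for recovery
    broken.order_movements = remaining


truck_a = Truck("A", 0.0, [{"id": "OM-1", "label": "overnight", "weight": 120.0},
                           {"id": "OM-2", "label": "standard", "weight": 300.0}])
truck_b = Truck("B", 500.0)
offload_priority_orders(truck_a, [truck_b])
print([o["id"] for o in truck_b.order_movements])  # ['OM-1']
```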


 2. Carrier invoices with delta cost


Some carriers invoice their partners regularly, as and when the shipments are delivered. But most carriers invoice periodically, namely month-end invoicing. Now, a lot can happen between the first and last day of a month - the sun may run out of its hydrogen atoms and the earth may be plunged into eternal darkness! There is very little that OTM can do to handle the sun's demise...

But OTM, of course, can be configured to handle other, more earthly exceptions that arise purely out of the way the logistics industry operates. For instance, by the time the carrier invoices its 3PL or manufacturing partner, the contracts may have been renegotiated or the surcharges updated. The way we handle these delta changes between the invoice and the matched shipment is by configuring the service provider to copy delta costs at the time of approving the invoice.

Now, let's add a bit of flavor to this. Let's say the driver is unable to take the usual route due to unforeseen accidents on that route. He takes a different route and ends up paying to drive through multiple toll gates, and even booking an overnight stay at a roadside inn. These additional expenses are usually 'customer recoverable' and the driver should be able to flag them on his OTM mobile application.

If the driver also decides to get his truck's headlights fixed or change the brake pedals, it is hardly a case for recovering from the customer though. 

At the end of the month, before the invoice is made out to the customer, the drivers' supervisor receives a notification on his mobile while he is on site, busy assigning shipments to his fleet. On his OTM mobile, he gets to review the additional costs incurred against each invoice, with the estimated and actual invoice amounts paired for quick reading.

Other examples of these accessorial costs could involve the original driver enlisting the help of other truck drivers. For instance, the truck has left location A and is on its way to location D via locations B and C - a multi-stop shipment. On his way, the driver is alerted on his OTM mobile of another truck driver in his vicinity who is on his way to location D, this being a direct shipment. The second driver is shipping a return delivery, which happens to be completely unplanned. Owing to the nature of this return delivery, his truck now appears as a notification for other truck drivers near him on the integrated Maps application. The original driver can now choose to offload some of his orders onto the return delivery truck, depending on the other truck's equipment capacity. Now that some of the orders have been offloaded, the original truck may not need to visit a few locations on its route, thereby reducing the overall cost. The original driver should be able to indicate this on his OTM mobile, and it must transpire as a negative cost line item on his truck's invoice. The return delivery driver may incur an additional fuel surcharge, which would correspond to a new accessorial cost on his truck's invoice.

To achieve this, we would have to add invoicing functionality to OTM mobile. Picture this interface alongside the standard set of screens that we are already getting in standard OTM mobile -

Carriers can select/deselect a few order movements with the click of a button and promote the changes all the way to invoices. We can trigger the invoice XML from OTM mobile once the driver makes his edits and logs a delivery event at the destination. This way the information between orders and invoices is always in sync, and fewer invoices would fail auto-approval. Also, invoice generation itself becomes real time, eliminating manual reconciliation, which is taxing and error-prone. A sketch of the delta-cost roll-up appears below.
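As a rough illustration of the delta-cost idea, the Python sketch below rolls hypothetical accessorial line items into an invoice total, keeping only the customer-recoverable ones. Amounts and field names are invented for the example; in OTM these would be cost lines on the invoice matched against the shipment.

```python
# Roll accessorial (delta) cost line items into a carrier invoice total.
base_shipment_cost = 1800.00  # estimated cost accepted at tender time

# Accessorials flagged by drivers on their mobiles in transit. Positive
# amounts are customer-recoverable extras; negative ones are credits
# (e.g., stops skipped after offloading orders to another truck).
accessorials = [
    {"desc": "Tolls on rerouted leg", "amount": 75.50, "recoverable": True},
    {"desc": "Overnight stay at inn", "amount": 120.00, "recoverable": True},
    {"desc": "Headlight repair", "amount": 90.00, "recoverable": False},
    {"desc": "Stops skipped after offload", "amount": -150.00, "recoverable": True},
]

# Only customer-recoverable items make it onto the invoice.
invoice_lines = [a for a in accessorials if a["recoverable"]]
invoice_total = base_shipment_cost + sum(a["amount"] for a in invoice_lines)

for line in invoice_lines:
    print(f"{line['desc']:35s} {line['amount']:10.2f}")
print(f"{'Invoice total':35s} {invoice_total:10.2f}")  # 1845.50
```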

In short, making carriers more inclusive in the digital transformation.

 

Engage with our experts at #OOW17 booth 1602 & learn how you can transform your #digital capabilities infy.com/2vSljwe #InfosysAtOOW


Written by: Kranthi Askani


To Cloud or Not to Cloud: Make the leap of faith logically!

HCM and HRIT functions are veering towards the adoption of cloud applications to effectively address trends such as the mobile workforce, the uptake of digital platforms, and the emphasis on user experience. While the decision to move to the cloud seems a no-brainer, the shape and velocity of this adoption require a more involved approach. CXOs have to balance their current operations, assess the readiness of their organizations, and contend with internal resistance to change when making this decision. This blog examines the dimensions in an enterprise's decision mix, enabling it to choose the right path to HCM cloud solutions.

 What is the trend?


The HR function and practice continue to evolve rapidly in response to the business world as well as to the societal changes from which it derives its capital. The evolution is also being influenced by technology.

In organizations, the focus of the HCM and HRIT functions has been on meeting operational requirements, including compliance and administration, while integrating with the rest of the organization's business systems.

However, with the advent of millennials, trends such as the mobile workforce, attitudinal changes such as an increased desire for collaboration, the uptake of digital platforms, and an emphasis on user experience have come to the fore.

These underlying trends are causing a rapid shift of the HR role from transactional to tactical to strategic. This, in turn, is creating a demand for technology platforms that enable this shift. Foremost among these enablers and shapers are cloud-based HCM applications that offer sophisticated functionality.

Is the Trend your Friend?

Moving to Cloud-based systems at first glance appears to be quite logical and an easy choice to make. Well, as many CHROs, CIOs and CTOs experience, it may not be an easy decision to make. Intrigued?

Imagine you are the CTO of a retail organization with around 30,000 employees. You will most probably have an enterprise-wide HRMS based on a leading package such as PSFT or Oracle HRMS. To manage your needs in recruitment, temp staffing, and time clocking and tracking, the system landscape would include other applications. It is also quite likely that for the core functions you might have customized the enterprise HRMS (bolt-ons, etc.) to meet your unique needs. With a view to gaining richer functionality and potential infrastructure cost savings, you understand that cloud applications may be the way to go.


Then out tumble some nagging but very pertinent concerns: does it offer the same amount of functionality to meet my needs? Will my business stakeholders be fine with what is offered? Can the cloud handle the integrations? What about the customizations? What about my existing infrastructure investments? The list goes on...


 Is that all?

The initial set of problems that come to mind mostly deal with the ease of moving to the cloud. But was there a problem in your current HR applications that was constraining the HR function to begin with? Is there a need for moving to the cloud, or for doing anything at all? If there is a need, how pressing is it? Determining the need is vital not only to make a business case from a technology or business standpoint, but also to prioritize the areas you want to focus on.


A Need could be an expressly Felt Need or an unstated Underlying Need. A Felt Need usually comes in the form of specific feedback from users and stakeholders about the state of the application or function, whereas an Underlying Need is typically a desire for better functioning or functionality, or a general opinion of the application or process. The stronger the Need, the higher the drive or support for remedial action. The proposed action should spell out which functionalities and processes are to be moved to the cloud and the sequence of adoption, clearly articulating how these would address the Need dimension.

 Ease, Need and then...

One aspect that, though known, is more often than not underestimated is gauging whether the organization is ready for cloud adoption. It is important to recognize that the change management process begins even before the decision to move to the cloud is made. The Adoption Readiness of key stakeholders from the HR organization as well as the IT organization needs to be assessed. Readiness to adopt the cloud can be considered in two parts: (a) is there Awareness of the capabilities of cloud applications, and (b) do key stakeholders show sufficient Intent to move to the cloud? An analysis in these terms can help CXOs decide what actions to take to increase Adoption Readiness. If Awareness is found to be low, knowledge sessions on the cloud may be undertaken; in case of low Intent, workshops to understand the underlying concerns may be considered.


Understanding Adoption Readiness across groups is also important. It is quite possible that the IT stakeholders are gung-ho about the move whereas the business stakeholders are not mentally ready. This insight will help CXOs and sponsors get an advance view of which group to work on for buy-in and, in the process, unearth not-so-evident issues in the Cloud Adoption journey.

 What next?

After collecting data pertaining to these dimensions in preparation for cloud adoption, a holistic analysis needs to be performed. It is imperative that the interplay of these dimensions and the state of the organization be clearly understood to chart out the path - a roadmap, if you will - that your organization's cloud journey will take. A well-thought-out plan based on the inputs above will dramatically increase the chances of a cohesive, right-paced, and successful Cloud Adoption.


OTBI ON ORACLE CLOUD FUSION APPS - AN OVERVIEW

Oracle provides multiple ways of reporting from Oracle Cloud (Fusion) Apps; one of them is Oracle Transactional Business Intelligence (OTBI). While OTBI is built on the OBIEE platform, it has a few limitations compared to on-premise OBIEE. This blog gives a high-level understanding of OTBI's capabilities and limitations on the cloud, which is especially helpful during requirements gathering, to see whether the requirements suit OTBI's capabilities, and during design sessions, to map the requirements to the many tools that OTBI provides.

WHAT IS OTBI AND WHY IS IT POPULAR

OTBI is included as part of the Oracle Cloud Apps licenses and does not need any additional setup. It includes out-of-the-box reports and subject areas built to work with the Fusion view objects (VOs). In addition, there are tools for customization, namely OBIEE Answers and BI Publisher. Fusion also allows you to create custom subject areas using the Application Composer tool.

Thus, OTBI aims to make reporting easier for Managers and Analysts by providing tools to simply drag and drop fields from subject areas, while also providing tools for complex customizations when required.

OTBI is attractive for customers for the following reasons:

- All setup, including security, makes use of the Fusion architecture; hence, setup time is low.

- It provides real-time reporting capabilities.

- Report customizations are possible on the OBIEE platform at no extra cost, which translates to huge value for money.

TOOLS AND CAPABILITIES

While OTBI comes with pre-built reports, it also provides tools to create custom reports for requirements that aren't met out of the box.

OTBI comes with the following tools:

OBIEE Answers (also referred to as OTBI in some documentation)

Answers gives users access to the out-of-the-box functional subject areas, where reports are created by simply dragging and dropping columns. In addition, users can add filters and custom formulae and use the different views provided. Data and object security in OTBI uses the Fusion security model, hence no separate configuration is needed.

In cases where the subject areas do not have the fields needed to build reports, Fusion provides a couple of customization options.

Flexfields

Flexfields are placeholder fields in the Fusion application related to business objects. These fields are used to extend business objects and display data on application pages, and they can in turn be enabled for use in OBIEE subject areas.

Once we enable the flexfields in Fusion, we need to associate them with the appropriate dimension or fact so that they appear in the desired subject area.

Custom Subject Areas

Fusion applications also allow us to create custom subject areas using a tool called Application Composer.

To create a custom subject area, we need to identify the Fusion object that is the main focus of the reports and build the subject area on this object. In addition, we can add any related or child objects whose fields are needed for reporting.

In cases where the object we need to report on is not related to any primary objects, we would need to create the relationship between the objects so that the fields are available in the custom subject area.

 

BI PUBLISHER

When reports have specific requirements like pixel-perfect published reporting or splitting of reports (by person, date, or delivery type), or when requirements cannot be delivered using OBIEE Answers, OTBI provides BI Publisher.

Using BI Publisher, we can write custom SQL queries that hit the Fusion tables and produce the required output. BI Publisher also provides a bursting feature, where we can split reports on particular fields and deliver them through channels like email, FTP, or a printer.

Data security can be configured by using the security views on the Fusion tables and passing the built-in BI Publisher session variables for the user or role name.
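To make the bursting idea concrete, here is a small Python sketch of what bursting does in principle: split a result set on a key column and deliver each slice to its own destination. In BI Publisher this is configured declaratively in the data model's bursting definition rather than coded by hand; the rows and email addresses below are hypothetical.

```python
# Conceptual illustration of report bursting: split query output on a key
# column and deliver each slice separately.
from itertools import groupby
from operator import itemgetter

rows = [  # pretend output of the data model's SQL query
    {"department": "Finance", "employee": "Ann", "salary": 5200},
    {"department": "Finance", "employee": "Raj", "salary": 4800},
    {"department": "Sales", "employee": "Mei", "salary": 5100},
]

delivery = {"Finance": "fin-mgr@example.com", "Sales": "sales-mgr@example.com"}

rows.sort(key=itemgetter("department"))  # groupby needs sorted input
for dept, slice_rows in groupby(rows, key=itemgetter("department")):
    body = "\n".join(f"{r['employee']}: {r['salary']}" for r in slice_rows)
    # A real bursting definition would also pick the template, output format,
    # and channel (email/FTP/printer) per key; here we just print the slice.
    print(f"To: {delivery[dept]}\n{body}\n---")
```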

 

LIMITATIONS

While OTBI provides a lot of features to build reports, it still has some limitations in comparison to the traditional way OBIEE is used for reporting.

- OTBI is for Oracle Cloud Apps only; we cannot report on any other sources of data.

- It performs online BI queries directly against the cloud application database. There is no data warehouse with advanced metrics, and it lacks historical data, which limits the type of analytics that can be done.

- The OTBI RPD cannot be customized. Minor customizations can be done using flexfields to expose columns in OTBI subject areas; for major customizations, we need to use BI Publisher.

- Changes to configuration parameters can only be made by submitting an SR to Oracle with a business justification.

- The Fusion Apps database on the cloud cannot be queried using tools like SQL Developer, so the only way to query the database is through BI Publisher. This is mainly a developer pain point.

 

SUGGESTED GUIDELINES DURING REQUIREMENTS AND DESIGN PHASES

OTBI Workflow 2.jpg
The above is the suggested approach for deciding which tool to use to deliver reporting requirements.

It is always preferable to build reports using OBIEE Answers/OTBI subject areas, for two main reasons. First, reports built from the subject areas are interactive: users can drill up or down and navigate to other reports based on actions configured on the report.

Second, the subject areas are accessible to analysts and managers based on the access provided. This allows them to create reports easily by choosing columns from the appropriate subject area using the drag-and-drop approach. The configurations are also reusable for creating more reports.

However, if we cannot configure the subject areas for the reporting requirements, or if the requirements specifically call for published reporting or BIP-specific features like bursting, then we would go with the BI Publisher approach. These reports are static in nature; while you can filter data in them, they are not interactive.

Also, the custom SQL tends to be complex, since the data in the Fusion tables is normalized, unlike in a typical data warehouse. Data needs to be pulled from multiple objects, causing the code to grow and impacting performance due to the large volumes that must be processed.

To summarize, the preferred approach is to build reports using OTBI subject areas, reserving BI Publisher for when all else fails.

Security vulnerabilities in Hyperion EPM

This blog lists the security vulnerabilities and threats present in the EPM system, which can lead to misuse of highly sensitive financial data and information.

Data and information security is an important concern for every organization, and when a system stores, processes, and manages highly sensitive financial data, it becomes even more critical for organizations to opt for the highest controls to ensure system and data security.

The EPM suite delivers a comprehensive, integrated set of applications featuring a common web interface and reporting tools. It stores and processes the financial information of organizations across various fields (for example, banking, manufacturing, medical, and public sectors). Vulnerabilities and threats present in such a system can have a huge negative impact on these organizations.

Below are a few critical security vulnerabilities present in the EPM system:

  • Reflected Cross-Site Scripting: Hyperion EPM processes user inputs on the server without validating them, which makes it vulnerable to reflected cross-site scripting: the malicious input is reflected back in the subsequent HTTP response. With a compromised user session, an attacker can perform unauthorized actions in the system, such as tracking the user's operations, redirecting the user to a fake site, modifying the web page, and exploiting the browser. (A generic illustration of this flaw and the next, with their remediations, follows this list.)

  • Unrestricted File Upload to Hyperion system: The Hyperion web application does not validate the type and content of files before they are uploaded to the server, so executable files can be uploaded to and downloaded from it. This allows an attacker to upload malicious files (including viruses, malware, trojans, or executables) with the intention of their being downloaded by other users.
  • Clear Text Traffic: The Hyperion application servers and infrastructure are not configured to enforce encryption when communicating with other hosts. A highly sensitive system without enforced secure communication can be subjected to a number of passive and active network attacks that may result in the interception and/or modification of the transmitted data.

  • Web Server Version Disclosure: The Hyperion web servers expose sensitive information in their headers.

    As part of the HTTP/1.1 standard, web application servers append information about the software handling the request to the response headers. These headers unintentionally reveal sensitive information such as the server type and version, which allows an attacker to research published vulnerabilities associated with that specific server. The information gained can be used to launch more targeted, sophisticated attacks against the system.

  • Client Side Control Bypass: The Hyperion web application relies on client-side controls to prevent users from accessing certain functionality.

    By modifying HTML elements and JavaScript responses, it is possible for users without authorization to access the Configuration Settings and Credentials Used For Pass-Through functionality.

    A malicious user can modify the configuration settings and pass-through credentials without the required permissions, which may cause integrity and non-repudiation issues.

  • Improper SSO Token Expiration: When a user issues a disconnect command (clicks the disconnect button), the Hyperion SmartView/Disclosure Management plugin does not invalidate the user's SSO token. Attackers can use this opportunity to hijack the session, view sensitive information, and perform actions on behalf of the victim user in the application.
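To make the first two findings concrete, here is a generic Python sketch of their standard remediations: escaping user input before reflecting it, and allowlisting uploaded file types. This is illustrative only and is not Hyperion code; the function names and the allowlist are assumptions.

```python
# Generic illustrations of two remediations (not Hyperion code):
# output-escaping reflected input, and allowlisting uploaded file types.
import html
import os

def render_greeting(user_input: str) -> str:
    """Reflect user input safely by HTML-escaping it first.

    Without html.escape(), input like <script>alert(1)</script> would be
    reflected verbatim into the response: the classic reflected-XSS pattern.
    """
    return "<p>Hello, " + html.escape(user_input) + "</p>"

# Hypothetical allowlist: only document types, never executables.
ALLOWED_EXTENSIONS = {".csv", ".txt", ".pdf", ".xlsx"}

def is_upload_allowed(filename: str) -> bool:
    """Reject uploads whose extension is not on the allowlist."""
    _, ext = os.path.splitext(filename.lower())
    return ext in ALLOWED_EXTENSIONS

print(render_greeting("<script>alert(1)</script>"))  # escaped, rendered inert
print(is_upload_allowed("report.pdf"))   # True
print(is_upload_allowed("payload.exe"))  # False
```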

Some of these vulnerabilities can be remediated by implementing fixes provided by the vendor; hence, it is highly recommended to raise them with the vendor to seek remediation.


 


October 2, 2017

Hyperion Planning Metadata Management using Smartview

From Planning 11.1.2.3 onwards, Oracle has provided a great utility, called the Planning Admin Extension (PAE), to perform bulk changes to Planning metadata using Smartview. You can use this tool for metadata management in Planning.

What do we have to do to use it?

Before using the tool, we first have to install it (if not already installed). The installer can be downloaded from Workspace -> Tools -> Install -> Planning Admin Extension.

PlanningSVExtension.msi will be downloaded, and it is very easy to install.

To confirm that the extension is installed, open Smartview -> Options tab; 'Hyperion Planning Admin Extension' should be listed under Extensions.

What all can we do?
  • Add new members in bulk
  • Edit existing dimensions or members in bulk
  • Move members within a hierarchy
  • Change member properties
  • Create or refresh cubes

All of the above tasks can be done using just Smartview.

How can we do it?

Connect to Hyperion Planning in Smartview (use the 'Oracle Hyperion Planning, Fusion Edition' connection from the drop-down).