Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.


November 9, 2017

Transaction matching of around two million records in under 5 minutes in ARCS

Oracle Account Reconciliation Cloud Service (ARCS) with Transaction Matching is a cloud-based reconciliation platform with pre-built configurations that adhere to industry best practices, making it a recommended solution for your reconciliation and matching needs.

Transaction Matching is a module within ARCS which inherits the features that facilitate preparation and review of reconciliations.

  • Additionally, Transaction Matching adds efficient automation of the detailed comparison of transactions across two or more data sources
  • The calculation engine allows intuitive "netting" of transactions within each data source, producing output that is easy to work with
  • Flexibility in the setup and timing of the process helps minimize effort during "crunch time" and reduces risk


Transaction Matching Use Cases

Typical Transaction Matching Use Cases are shown below.


[Image: Typical Transaction Matching use cases]

Clients often need to match more than a million records between two source systems using complex match-set rules. We have seen clients spend hours trying to match them manually in Excel, or rely on solutions such as an Access database or Oracle tables, which can be very time-consuming and prone to data quality issues. Below we share our experience and some insights on how we successfully loaded and matched two source files of around 2 million records in under 5 minutes using the Transaction Matching feature of ARCS for one of our e-commerce clients.

Idea Inception

The client wanted to match up to 2 million records from their point of sale (POS) system against the details obtained from their merchant transaction system. They had been using an Access database for this activity, which took hours to produce results, and they reached out to Infosys to help them streamline this time-consuming and frustrating process.


Solution and Approach

Source Files.

1. Point of Sale transaction file.

    The POS file had 9 columns and was provided in .txt format (a PDF report converted into a text file). Below is a snapshot of the file.


2. Merchant system transaction file

    The Merchant system transaction file had 21 columns and was in .csv format. Below is a snapshot of the file.


Matching rules

The client wanted the matching rule to compare the card number and amount from the POS transaction file against the cardholder number and amount from the Merchant transaction file. The rule stipulated a many-to-one match: many transactions from the POS system match a single batch transaction (grouped by amount) from the Merchant system file.
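As an illustration of the rule above (not the ARCS matching engine itself — the field names and tolerance here are assumptions), the many-to-one logic can be sketched in Python: sum the POS transactions per card number and compare the total against the Merchant batch amount.

```python
from collections import defaultdict

def many_to_one_match(pos_rows, merchant_rows):
    """Illustrative many-to-one match of POS transactions vs Merchant batches.

    pos_rows:      list of (card_number, amount) tuples from the POS file
    merchant_rows: list of (cardholder_number, batch_amount) tuples
    Returns a list of (card_number, matched_batch_amount) pairs.
    """
    # Sum POS transactions per card number.
    pos_totals = defaultdict(float)
    for card, amount in pos_rows:
        pos_totals[card] += amount

    # A match occurs when the summed POS amount equals the batch amount.
    matches = []
    for card, batch_amount in merchant_rows:
        if abs(pos_totals.get(card, 0.0) - batch_amount) < 0.01:
            matches.append((card, batch_amount))
    return matches

pos = [("4111", 25.00), ("4111", 75.00), ("4222", 40.00)]
merchant = [("4111", 100.00), ("4222", 45.00)]
print(many_to_one_match(pos, merchant))  # only card 4111 matches
```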


Initial Challenges

The initial challenges with this requirement were as follows:

1. Size of File.

    The files provided were huge: with 9 and 21 columns respectively and around 2 million records each, the file sizes exceeded 1 GB per file. Files this large are difficult to read or edit in any ordinary text editor.

2. Formatting

    Another, bigger challenge was formatting the given files to meet ARCS Transaction Matching requirements. The files were provided in text format, and reading and reformatting them at that size was a tough nut to crack.
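One way to tame both the size and the formatting problem is to convert the text extract to CSV as a stream, line by line, so the whole file never needs to be opened in an editor. A minimal Python sketch, assuming a fixed-width report layout (the column widths here are invented for illustration, not the client's actual report format):

```python
import csv

def text_report_to_csv(src_path, dst_path, col_widths):
    """Convert a fixed-width text report to CSV one line at a time.

    col_widths: list of column widths, e.g. [11, 16, 5] -- an assumed
    layout; a real report's widths would come from its specification.
    Streaming keeps memory usage flat even for multi-GB files.
    """
    with open(src_path) as src, open(dst_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        for line in src:
            fields, pos = [], 0
            for width in col_widths:
                fields.append(line[pos:pos + width].strip())
                pos += width
            writer.writerow(fields)
```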


Infosys Solution

We took on this challenge and delivered as promised. The biggest task was to import the files containing about 2 million transactions each from both systems into ARCS Transaction Matching and match them automatically in quick time. Other tools and custom solutions were taking hours for this process, and a 2-million-record CSV is a huge input for any system to ingest; it would typically take anywhere between 15 and 30 minutes just to import one such file. A further challenge was formatting: the file we received was a PDF converted to text, and it had to be converted to .csv to be accepted by ARCS Transaction Matching. We used Oracle ARCS TM, formatting tools, text editors, and the Oracle-provided EPM Automate utility to format the files, then automatically ingest and auto-match the data from the two transactional systems.


The EPM Automate utility enables Service Administrators to remotely perform tasks within Oracle Enterprise Performance Management Cloud instances and to automate many repeatable tasks, such as importing and exporting metadata, data, artifact and application snapshots, templates, and Data Management mappings.


Tips and Lessons Learnt

While implementing this requirement we learned a few lessons; below are some tips for implementing a similar type of solution.

  • ARCS TM also accepts .zip input files, so compress the files into .zip format; they are smaller and quicker to upload to the ARCS cloud.
  • Use powerful text editors such as Notepad++ or TextPad when formatting the files.
  • Create custom attributes that can be used in matching rules for faster auto-matching of transactions.
  • If possible, get the export from the transaction systems in .csv format to reduce conversion time.
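The first tip is easy to script. A hedged Python sketch of compressing a data file before upload (file paths are placeholders):

```python
import os
import zipfile

def zip_for_upload(csv_path, zip_path):
    """Compress a data file before uploading it to the ARCS cloud.

    ZIP_DEFLATED typically shrinks repetitive transaction files
    substantially, cutting upload time to the cloud instance.
    """
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        # Store only the base name so the archive has no local paths.
        zf.write(csv_path, arcname=os.path.basename(csv_path))
    return zip_path
```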

Performance Metrics

Below are our performance metrics while implementing client's requirement of matching around 2 million records using Oracle ARCS Transaction Matching.


Import POS file (around 2 million records) - 27 seconds

Import Merchant file (around 2 million records) - 61 seconds

Run Auto Match - 53 seconds


Complete process - 2 minutes 21 seconds (well under the 5-minute mark)
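The arithmetic is easy to verify: the three steps sum to 141 seconds, i.e. 2 minutes 21 seconds. A trivial Python check using the figures quoted above:

```python
# Verify the end-to-end timing quoted above.
step_seconds = {"Import POS": 27, "Import Merchant": 61, "Auto Match": 53}
total = sum(step_seconds.values())          # 141 seconds in total
minutes, seconds = divmod(total, 60)
print(f"{minutes} minutes {seconds} seconds")  # 2 minutes 21 seconds
```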




Happy client and Happy us.


We deliver! Please visit our company website to learn more about our Account Reconciliation and Transaction Matching solutions.

October 31, 2017

Unravel the mysteries around Equity Overrides

In this blog, I would like to share one of my experiences with HFM translation rules. It is still engraved in my memory, because it was a major issue I faced in my early days with HFM, when I was keen on unravelling the mysteries that revolve around translation issues. During one of the implementations, I gained a brief understanding of the historical rate concept, which we usually encounter when translating a subsidiary's equity.

So, before I proceed with the problem statement and the resolution, let me define historical rate for beginners in the field of HFM (especially those who, like me, do not have a background in finance). The historical rate is the exchange rate that prevailed at the time the transaction was consummated. Such transactions (mainly Investment and Equity) have to be translated at the historical rate rather than the EOM rate or average rate. This requirement usually comes from clients reporting under multiple GAAPs, particularly US GAAP.

Let me describe the issue in a practical example:

Org A invests USD 100,000 in Org B, which reports in AUD, at an exchange rate of roughly 0.77 (USD per AUD), so the subsidiary receives AUD 130,000. In A's books the investment is USD 100,000, while in B's books there is equity of AUD 130,000.

Now suppose the exchange rate later moves to 0.8. The translated equity for the subsidiary, i.e., Org B, becomes USD 104,000, whereas for Org A the investment still stands at USD 100,000 in the books. Hence, at the time of elimination the plug account captures an imbalance of USD 4,000, which arises purely from the incorrect exchange rate used to translate the transaction; the underlying transaction is not to blame for the mismatch. There is therefore a pressing need for a solution that reports the correct investment and equity in B's books, or the reported values will be incorrect.

The first thing that struck me was: why don't we capture the translation through a rule that handles only the changes in equity and the differences in A's investment during the month? These would automatically be taken care of by the standard EOM rate, i.e., the Balance Sheet rate pre-defined in the application. But there was a gap: suppose A invests in B on the tenth working day of the month. At this point in time, the rates are quite different.

Hence, the solution revolves around Equity Overrides; how to achieve this, though, was another big question. The approach benefits users by giving them complete control of the authentic translated values that need to be shown in the subsidiary organization's Balance Sheet.

We manually capture the historical rates for conversion through a separate data form defined in the application. The values are then translated at the historical rates by the rule file, overriding the actual rates. The difference that arises is captured in a separate account, which we refer to as the plug account for currency overrides, i.e., Foreign Currency Translation Adjustment (FCTA).
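The arithmetic behind the override is easy to sketch: translate equity at the historical rate, compare it with the EOM-rate translation, and book the difference to the FCTA plug. An illustrative Python version, using the AUD 130,000 and 0.8 figures from the example above (the implied historical rate is 100,000/130,000 ≈ 0.769):

```python
def equity_override(local_amount, eom_rate, historical_rate):
    """Return the override value and the FCTA plug amount.

    local_amount is the subsidiary's equity in local currency. The
    override keeps it translated at the historical rate, and the gap
    versus the EOM-rate translation goes to the FCTA plug account.
    """
    eom_value = local_amount * eom_rate
    override_value = local_amount * historical_rate
    fcta_plug = eom_value - override_value
    return override_value, fcta_plug

# Example figures: AUD 130,000 equity, historical rate ~0.769,
# EOM rate 0.8 -> override stays ~USD 100,000, plug ~USD 4,000.
override, plug = equity_override(130000, 0.8, 100000 / 130000)
print(round(override, 2), round(plug, 2))
```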


October 30, 2017

Analytics and the APP!


Welcome back! In parts 1 and 2 we set out to understand the concept of analytics and the app (analytics on a mobile platform) and reviewed case studies from different leading products: Kronos, Oracle, and SAP. In this concluding part we will look at the significance of these case studies and draw inferences about how they impact the world of analytics.




We have seen three case studies across different verticals with varying backgrounds and use-case scenarios. All of them, however, share the common feature of using an analytics tool on a mobile platform, showcasing the versatility of this combination of analytics and the app!


When organizations go mobile with analytics, they are able to extend the reach of information and empower people from every aspect of their business with the facts to make better, more informed decisions.

This is evident from the 2015 Mobile Business Intelligence Market Study by Dresner Advisory Services:

  • Mobile business intelligence (BI) is the third most common use case for mobile business workers, next to e-mail and contact information

  • 90% of study participants said mobile BI is important to their business

  • In a surprising finding from the Dresner market survey (*), business intelligence ranks as the third-highest priority among mobile applications, higher than social media and even personal banking, coming in below only e-mail and basic phone services


*SOURCE - Wisdom of Crowds ® Mobile Computing/ Mobile Business Intelligence Market Study 2015


Trends observed during the research on case studies indicate the growing importance of Mobile analytics in different verticals - IT being the prominent horizontal across most of the industries. Some of the reasons for this are listed below:

  • Exploding usage of 'smart' mobile devices in general - personnel, org-wide, technological leap

  • Growing use of BYOD among enterprise employees - personnel get more opportunity to tap into the client systems and data as organizations open up accesses to employees.

  • Rapid use of mobile devices for other aspects of daily life - communication, mails, social media, entertainment - to make a convenient platform for including analytics.

  • Flexibility of usage and availability on-the-go. From being a straight-line process to being agile.

  • Advanced functionality of apps and devices, driven by enhanced hardware and software.

  • Technology growth to aid predictive analysis and user data customization.


Suggestions/Future Prep: 

  • The concept of mobile analytics is well known but still sees almost negligible application; this gap could be leveraged further to achieve customer delight.

  • The analytics functionality on ERP systems remains a niche area. Consultants could be empowered with training on this module to also include the mobile apps that are usually pre-built for such applications.

  • Another option to explore would be providing sample tablet devices (iPad or Android) to the respective practices to enable learning, hands-on experience, and PoC processes.

  • From the case studies and also from project experience, it is observed that even though customers may be aware of the implications of mobile analytics on their processes, a PoC is helpful in all cases to create the right focus to open up further avenues of engagement.



The advent of the mobile platform has been another epoch-making event, probably ranking among the top 20 inventions that changed lifestyles across the world. Added to this, parallel advancements in related areas such as data analysis, cloud computing, and big data have been instrumental in converging the big with the best, giving rise to a concept such as mobile analytics. Since this concept is still in its nascent stage, it offers great potential for further exploration of the myriad use-case scenarios and adaptability, which could lead to several success stories of - Analytics and the App!


End of part 3...hope you found this interesting - Please do leave your valuable feedback!

Part1 :

Part2 :


Analytics and the APP!


Welcome back! In part 1 we saw an example of analytics used on a mobile platform (a tablet) to realize retail store objectives and gain the advantage of real-time data updates. In part 2, let us look at more case studies across similar leading products.


Case Study 2:


Scenario - The client is a US-based Fortune 500 energy and utilities company with approximately 10 million customers. Their integrated business model provides a strong foundation for success in a vertical that is experiencing dramatic changes. They strongly believe in capitalizing on emerging trends and technologies to provide insight into operations that will drive value for their end customers.


Background - The organization uses Oracle - one of the top ERP applications for their myriad business processes. As part of this PoC the Infosys team setup custom analytics solutions for the client. Oracle's business tool OBIEE 12c is used here to showcase the length and breadth of the analytics tool available as part of the wide array of modules in Oracle.


Problem Statement - The client needed to do a comparative evaluation between two mobile analytics applications as part of their PoC to be reviewed by their senior management.


POC details - The PoC was aimed at the OBIEE module's ability to work on a mobile platform. The PoC also aimed to do a comparative demo of features between Microstrategy (another analytics tool) and Oracle tools (apps). A set of commonly identified features was expected to be compared and in most cases, the feature was available within these tools but the enablement of the feature was different between OBIEE and Microstrategy.


Pilot & Feedback - For the pilot, the app was shared only among the organization's senior management. The focus group was impressed that OBIEE could provide the features needed, and appreciated the way they were achieved, which differed from their current applications. Further, using OBIEE on mobile presented a unique yet scalable scenario, as it proved a seamless extension to the existing suite of Oracle products, meaning less chance of data integrity issues. After the successful demo, the client is now evaluating a complete migration to OBIEE, with a preference for the analytics app, as it aligns with their established principles.


Being an energy and utilities company, the organization must always possess the latest status and forecasts in a rapidly changing environment with unpredictable trends. The analytics tool on mobile has brought the leadership close to data and trends that were hitherto not accessible. Management can now make informed decisions much faster and just as easily track the results through OBIEE. The time and effort savings are also huge, since stakeholders can pull their own graphs and data analyses first-hand and without the chance of error. With the gap between technology, users, and data greatly reduced, leadership is now keen to apply this model to other areas of analytics.


Case Study 3:


Scenario - The client is a global tool manufacturing corporation with interests in construction, heavy equipment and technological solutions. They excel through outstanding innovation, top quality, direct customer relations and effective marketing. Client also has their own production plants as well as research and development centers in Europe and Asia. They pride in integrating the interests of all their partners, customers, suppliers and employees - into their growth and sustenance strategies.


Background - The client uses the SAP package and tools to run their analytics platform, integrating the various aspects of their business from planning to customer feedback and support. Combining areas like technology, design, real-time feedback, and automated order processing with metrics like quantity, geographical location, and customer database, the analytics tool (SAP's BI system) provides the necessary inputs for stakeholders to catch up on the latest available information and trends.


Problem Statement - The client needs an on-the-go platform to deploy their analytics solution to enable salesforce and service personnel to meet client demands as and when they arise in an automated fashion.


Introduction of Mobile Analytics - The organization has about two-thirds of its workforce employed directly for their customers, in sales organizations and in engineering, averaging about 200,000 customer contacts every day. This entails a constant need to be up to speed with the latest end-customer data. A ready reckoner for this situation is SAP mobile analytics (otherwise known as Roambi), which most employees in the organization use daily. Further, the entire solution is a cloud-based model, so they have the best of both worlds: cloud computing and mobile application. This has proved very advantageous to their salesmen, technicians, and customer support staff on the job, and even to top executives discussing an org-level strategy.

A real-life scenario involves the following situation:

  • A critical time bound customer order is not received at site on time.

  • However, the automated tracking tool, looking for the delivery report, has sensed it and raised an alert to the support center of the tools manufacturer.

  • This triggers the next set of established workflows in order to compensate for the delay in delivery.

  • Alerts sent to the nearest customer support personnel through a geo fencing feature enables the employee to locate the nearest outlet/warehouse/distribution center for the right part.

  • The support person raises a request under the right priority and is able to head over to the site to personally supervise the final delivery

All this has actually taken place on-the-go using the mobile device loaded with the BI tools and supporting applications to augment the corrective actions.

In this particular scenario, even the customer delight can be captured on the same mobile as feedback and, back at the corporate HQ, the top management will be able to gauge a real time heat map/graph showing customer satisfaction survey results that have been processed seamlessly through cloud.


End of part 2... in part 3 we will review the inferences and conclusion.

Part1 :

Part3 :


October 29, 2017

Migrate Oracle Hyperion Cloud Applications (PCMCS) on Autopilot!!! Using EPMAutomate

What is EPMAutomate?

The EPMAutomate utility helps automate administrators' activities for EPM Hyperion cloud products.

What is Migration and why is it required?

Migration is required to move an application from a Test instance to a Production instance in the cloud, and vice versa. Manual migration across instances can take hours; automating it with the EPMAutomate utility reduces the time from hours to minutes. The migration covers data, metadata, security, Data Management, reports, etc., i.e., every artifact of the application is migrated by the EPMAutomate utility without manual intervention. Migration can be server-to-cloud or cloud-to-cloud, and it is always preferable to move the backup artifact from server to cloud. The example here is demonstrated with a PCMCS application.

Migration methods:

  1. Server to Cloud

  2. Cloud to Cloud

A) Steps to automate Server-to-Cloud migration from the daily backups process using the EPMAutomate utility in PCMCS

1. Log in to the PCMCS Workspace by entering the Identity Domain, user name, and password, and click Sign In.


2. If an application already exists on the instance into which the new application will be migrated and imported from another instance, delete it.

Click Application->Application

3. Click 'X' to delete the application, and click Yes.


4. Now modify the attached reusable sample batch script with the relevant URL and user credentials to automate the migration process using the EPMAutomate utility.

Sample Script:

@echo off

rem This script is used to perform On-Demand migration of Artifact Snapshot using

rem server backups i.e server to cloud migration

rem Update the parameters: url (Target url), Folder (source folder) and SnapshotName as per requirement

rem Please make sure application has been deleted from target instance before importing snapshot into it

SET url=

SET user=abc

SET password=D:\Oracle\password.epw

SET IdentityDomain=

SET Folder=D:\Oracle

SET SnapshotName=Artifact_Snapshot_09_13_2017_1322


call epmautomate login %user% %password% %url% %IdentityDomain%

timeout 10

rem The daily backup file is assumed to be %SnapshotName%.zip inside %Folder%

call epmautomate uploadfile %Folder%\%SnapshotName%.zip

timeout 8

call epmautomate importsnapshot %SnapshotName%

timeout 10

call epmautomate logout           

5. Trigger the .bat script to upload the relevant snapshot to the cloud.



6. Once the migration completes, check the migration report in the PCMCS Workspace.

Click Application->Migration->Reports->Migration Status


B) Steps to automate Cloud-to-Cloud migration from the daily backups process using the EPMAutomate utility in PCMCS

  1. Follow steps 1 to 3 from section A

  2. Use the script below to migrate an artifact from one cloud instance to another.


    The 'copysnapshotfrominstance' command is used to move an artifact snapshot across instances in the cloud.


    Sample Script:

    @echo off

    rem This script is useful to migrate Artifact Snapshot from Test to Production instance 

    rem Update the following parameters based on requirement

    SET url=

    SET FromURL=

    SET user=

    SET password=D:\Oracle\password.epw

    SET IdentityDomain=

    SET SnapshotName=Artifact Snapshot

    call epmautomate login %user% %password% %url% %IdentityDomain%

    timeout 10

    call epmautomate copysnapshotfrominstance %SnapshotName% %user% %password% %FromURL% %IdentityDomain%

    timeout 8

    call epmautomate importsnapshot %SnapshotName%

    timeout 10

    call epmautomate logout

3. The rest of the steps are the same as in Section A.




October 27, 2017

Replace Query tracked Aggregate views of an ASO cube to another cube and aggregate

An Essbase Aggregate Storage (ASO) database is a multidimensional database, and aggregate storage enables dramatic improvements in database aggregation.

Aggregation is the process of calculating and storing the data of aggregate views to enhance retrieval performance.
For an ASO cube we can perform default aggregation and query-tracking-based aggregation.


October 20, 2017

Multi-Currency Reconciliation made easier in ARCS


Account Reconciliation Cloud Service (ARCS) is a tool that helps streamline a company's account reconciliation process by enforcing standardization and operational efficiency through a centralized repository of account reconciliations across all segments and accounts.


ARCS supports both single-currency and multi-currency environments. Single-currency setup is common in ARCS; hence, we will talk about the multi-currency environment setup.


Enabling an environment for multi-currency is done through the currency bucket setup options. Three currency buckets are available: Entered, Functional, and Reporting.

  • Entered--Currency in which the transactions have occurred (Posted Currency)

  • Functional--Currency in which the company operates.

  • Reporting--Currency in which the financial statements are reported


Scenario 1: All three buckets are enabled

The legal entity is in the UK, the transaction happens in Germany, and the company's headquarters is in the USA. So the functional currency would be GBP, the entered currency EUR, and the reporting currency USD.


Scenario 2: Functional and Reporting Buckets are enabled

A non-US legal entity with company headquarters in the USA. In this case the functional currency would be the same as the local currency, and the reporting currency would be USD.


Scenario 3: Functional and Reporting Buckets enabled for one profile and only Reporting bucket enabled for another

One GL account is associated with a US entity, and the same GL account with a non-US entity (assuming that reconciliations are performed at the account-entity combination). In this case the GL account-US entity profile will have only the reporting bucket enabled, while the GL account-non-US entity profile will have both the functional and reporting buckets enabled.


A legal entity has to reconcile in the reporting currency, since the financial statements are reported in this currency; but before that it must be ensured that the balances also reconcile in the functional currency (or in both the entered and functional currencies, if both buckets are enabled), as these are converted to the reporting currency at a later stage for financial reporting. In some cases, reconciling in the functional/entered currency is also required to fulfill local regulations and statutory requirements.


ARCS allows all three buckets to be loaded from the GL through data integration, after which the currency rates are loaded. While performing reconciliations, the amounts of the transactions explaining the balance are entered in the lowest enabled currency bucket (the currency in which the transaction actually occurred). The functional and reporting currency balances are then calculated based on the currency rates loaded in the system for that month.
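That bucket derivation can be sketched as follows — illustrative Python only; the rate structure, currency codes, and function names are assumptions for the sketch, not ARCS internals:

```python
def derive_buckets(entered_amount, entered_ccy, functional_ccy,
                   reporting_ccy, rates):
    """Derive Functional and Reporting amounts from an Entered amount.

    rates maps (from_ccy, to_ccy) -> the monthly rate loaded for the
    period. Identity conversions (same currency) are implied.
    """
    def convert(amount, src, dst):
        return amount if src == dst else amount * rates[(src, dst)]

    functional = convert(entered_amount, entered_ccy, functional_ccy)
    reporting = convert(functional, functional_ccy, reporting_ccy)
    return {"Entered": entered_amount,
            "Functional": functional,
            "Reporting": reporting}

# Scenario 1 shape: entered EUR, functional GBP, reporting USD
# (the rate values below are made up for illustration).
rates = {("EUR", "GBP"): 0.88, ("GBP", "USD"): 1.31}
print(derive_buckets(1000.0, "EUR", "GBP", "USD", rates))
```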


In a multi-currency environment, the Balance Summary for a reconciliation looks like the one below. It can be considered as multiple representations of the same balance.



Stepwise Process:

  1. Enable applicable Currency buckets in ARCS. Tools->System Settings->Currency->Currency Bucket


    Enabling multiple currency buckets in system settings enables those currencies (Currency Bucket 1 for Entered, Currency Bucket 2 for Functional, and Currency Bucket 3 for Reporting) for all profiles by default. If a bucket is not required for all profiles, the setting has to be explicitly disabled for each ineligible profile by editing the profile.


  2. Setup Data Integration for the Entered Currency Data files

  • Add Category Mapping for Entered bucket

  • Setup Import Format

  • Setup Location

  • Create Data Load Rule for Entered Category


  3. Load data in ARCS.


  4. Load currency rates in ARCS.




  • Check with your source-system admin (EBS, Oracle Fusion, etc.) whether the local currency and reporting currency sets of books are different. Also check whether standard reports are available for extracting local currency data files.

  • System restriction: the system does not allow editing the Reporting Currency field when adding an amortizing/accreting transaction.



  • If there is a plan to upgrade an existing single-currency ARCS environment to multi-currency at a later stage, make sure the data files for the existing currency include the currency column. This avoids the extra effort of setting up data integration again for multi-currency data files.

  • While adding mapping details under the import format for the data file, add a 'currency row' instead of a currency dimension.

October 16, 2017

BICS - A Cloud Offering from Oracle

What is BICS?

We are all familiar with traditional BI services, and businesses are rapidly increasing their investments in cloud services. To keep pace with changing business demands, Oracle launched its cloud offering known as BI Cloud Service.

Now, the question that comes to everyone's mind: what is Oracle BI Cloud Service (BICS)?

Here it is,

"Oracle BI Cloud Services, an eminent cloud offering from Oracle, a platform for creating powerful business intelligence applications, thereby enabling users from the workgroup to the enterprise."

So, is BICS related to on-premise OBIEE? The answer is yes: BICS is an evolution of the on-premise Oracle Business Intelligence Enterprise Edition (OBIEE) solution, a new platform built around standard OBIEE.

Features of BICS

  • Query on-premise databases and other cloud databases
  • Offers presentation services to create analyses, dashboards, and visualizations
  • Embed JavaScript/HTML into analyses and dashboards
  • Enhanced data visualization
  • Multiple data-loading options with connections to on-premise databases
  • Can be accessed on mobile without any extra programming

Why Choose BICS?

BICS brings Oracle's business intelligence technology to the cloud with no capital expense, a modest monthly subscription, and quick implementation. BICS also has an edge over on-premise business intelligence in that it is the first to receive new features and enhancements, as Oracle follows a cloud-first approach. For functional users the advantage is easy self-service administration, while analysts can simply upload a spreadsheet from their desktop without assistance from IT or an enterprise-wide application.

The key differences between BICS and on-premise OBIEE are:

  • BICS does not have features like Agents (iBots) or BI Publisher; on-premise OBIEE provides both.
  • The BICS version of the RPD does not support physical source aliasing; on-premise OBIEE supports it.
  • BICS offers two environments, Pre-Prod and Prod; on-premise has no limit on environments.
  • BICS does not support LDAP or MSAD; on-premise supports integration with LDAP or MSAD.

Is it beneficial to migrate from on premise OBIEE to BICS?

Migration allows administrators as well as customers to enjoy the new features and functionality, but the following needs to be taken care of while migrating from on-premise OBIEE to BICS:

  1. Migrating to BICS directly from OBIEE 11g is recommended only when subject areas and reports do not need to be disabled on BICS alone but can be disabled in on-premise OBIEE as well
  2. To match the subject areas and reports in on-premise OBIEE with those available in BICS, it is advisable to upgrade to OBIEE 12c first, as the two have many similarities, such as:
  • The version of the repository used is the same
  • The web catalog, reports, and dashboards can be migrated using a single file (with the .bar extension)
  • Application roles follow the same naming convention, avoiding the need to maintain a separate application role list

Overall, it is a sound approach for on-premise OBIEE users to move regular reporting to the cloud in order to take advantage of new functionality and features, data visualization, and reductions in infrastructure and license costs.


October 4, 2017


Oracle provides multiple ways of reporting from Oracle Cloud (Fusion) Apps; one of them is Oracle Transactional Business Intelligence (OTBI). While OTBI is built on the OBIEE platform, it has a few limitations compared to on-premise OBIEE. This blog gives a high-level understanding of OTBI's capabilities and limitations on the cloud, which is especially helpful during requirements gathering, to see whether the requirements suit OTBI's capabilities, and during design sessions, to fit the requirements to the many tools that OTBI provides.


OTBI is included as part of the Oracle Cloud Apps license and does not need any additional setup. It includes out-of-the-box reports and subject areas built to work with the Fusion view objects (VOs). In addition, there are tools for customization, namely OBIEE Answers and BI Publisher. Fusion also allows creating custom subject areas using the Application Composer tool.

Thus, OTBI aims to make reporting easier for managers and analysts by providing tools to simply drag and drop fields from subject areas, while also providing tools for complex customizations when required.

OTBI is attractive for customers for the following reasons:

  • All setup, including security, makes use of the Fusion architecture, so setup time is low.

  • Provides real-time reporting capabilities.

  • Report customizations are possible on the OBIEE platform at no extra cost, which translates to great value for money.


October 2, 2017

Hyperion Planning Metadata Management Using Smart View

From recent releases of Planning onwards, Oracle provides a great utility to perform bulk changes to Planning metadata using Smart View, called the Planning Admin Extension (PAE). You can use this tool for your metadata management in Planning.

What do we have to do to use it?

Before using the tool, we first have to install it (if not already installed). The installer can be downloaded from Workspace -> Tools -> Install -> Planning Admin Extension.

PlanningSVExtension.msi will be downloaded, and it is very easy to install.

To confirm that the extension is installed, open Smart View -> Options tab and look under Extensions for 'Hyperion Planning Admin Extension'.

What all can we do?
  • Add new members in bulk
  • Edit existing dimensions or members in bulk
  • Move members within the hierarchy
  • Change member properties
  • Create or refresh cubes

All of the above tasks can be done using just Smart View.

How can we do it?

1.       Connect to Hyperion Planning in Smart View (use the 'Oracle Hyperion Planning, Fusion Edition' connection from the drop-down).

2.       Drill down to the required application and then to Dimensions. Right-click the dimension you want to update and click 'Edit Dimension'.


On clicking Edit Dimension, a new ribbon (the Planning Ad Hoc ribbon) appears with extra buttons, and the ad-hoc grid is retrieved at the root of the dimension.

Now we can zoom in or out to the members we want to update and fetch the member properties below for the required dimension. Here I have selected the Version dimension.


So, let's try modifying/adding a couple of members in a dimension.

I have to perform three tasks in the Version dimension:

  • Task 1: Change Data Storage for 'Draft version' to 'Never Share'

  • Task 2: Add new versions Draft1, Draft2, and Draft3_new

  • Task 3: Move Draft1 and Draft2 under the parent 'Draft version'
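Conceptually, the grid edits in these three tasks amount to bulk updates of member rows, where each row carries the columns you see in the Smart View grid: member name, parent member, and Data Storage. A minimal Python sketch of the three tasks, using a plain dictionary as a stand-in for the dimension (illustrative only; this is not an Oracle API, and the root member name "Version" is an assumption):

```python
# Illustrative model of the Version dimension grid: each member row
# holds the Parent Member and Data Storage columns from Smart View.
version_dim = {
    "Version":       {"parent": None,      "storage": "Label Only"},
    "Draft version": {"parent": "Version", "storage": "Store"},
}

# Task 1: change Data Storage for 'Draft version' to 'Never Share'.
version_dim["Draft version"]["storage"] = "Never Share"

# Task 2: add new members Draft1, Draft2, and Draft3_new (initially
# added at the end of the grid, i.e. under the root member).
for name in ("Draft1", "Draft2", "Draft3_new"):
    version_dim[name] = {"parent": "Version", "storage": "Store"}

# Task 3: move Draft1 and Draft2 by editing their Parent Member column.
for name in ("Draft1", "Draft2"):
    version_dim[name]["parent"] = "Draft version"

# Members now parked under 'Draft version' after the reparenting step.
children = sorted(m for m, row in version_dim.items()
                  if row["parent"] == "Draft version")
print(children)
```

In the real extension, nothing is pushed to Planning until Submit Data (or Submit without Refresh) is clicked; the dictionary edits above correspond to the pending grid state before that submit.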


Task 1: Edit member property

  • Choose the required Data Storage value from the drop-down for the member whose property you want to change.


Task 2: Add new members

To add new members, we have two options:

  • Dimension Editor mode

  • Submit without Refresh mode

The Dimension Editor mode option is faster than the other, but it requires a refresh after each member addition; you could call it a two-step process, whereas Submit without Refresh is a one-step process.

In the Dimension Editor process, just add the member name you want at the end of the grid and press Refresh; after the refresh, you will see an * (asterisk) on the new member. At this point the member has not yet been pushed to Planning.

All other properties will be populated automatically in Smart View. Change whatever properties you want and click Submit Data. Once you click Submit Data, you will see the members in Planning.

To use Submit without Refresh mode, add the property SMART_VIEW_DIMENSION_EDITOR_PARITY_MODE in Planning and set it to true.

Once that is done, come back to Smart View, add a new member with all its information, and finally click Submit without Refresh. You will see the member in Planning.

Task 3: Move member

To move a member within the hierarchy, come back to Smart View and select the member you want to move. Under the Parent Member column, enter the new parent; I entered 'Draft version' as per my requirement. Click Submit Data from the Planning Ad Hoc ribbon. You are done 😊