Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.

November 15, 2017

Oracle Data Visualization Cloud Service

 


 

Overview

Oracle Data Visualization Cloud Service (DVCS) is Oracle's visual analysis and self-service data discovery cloud service. It has the following main features:

  • Easy Upload

Upload data from a variety of sources (for example, spreadsheets, CSV files, Fusion Applications, and many databases) to your system and model it in a few easy steps

  • Simple Mash-up

            Data from different sources is automatically connected

  • Easy Exploration

Create visualizations and projects that reveal trends in your company's data, and turn them into insights and stories

  • Visual Experience

Auto-visualization, brushing, intuitive filtering, auto-coloring, and built-in maps

 

Data Sources

DVCS can use data from the following sources for creating stunning visualization projects:

  • Spreadsheet /CSV (<= 50 MB in size)

  • Data from Oracle Applications data sources

  • Oracle Transactional Business Intelligence (OTBI) - Oracle BI EE analyses and subject areas

  • Data from a database as a service or an on-premises database, using Remote Data Connector

  • Data from various on-premises data sources (CSV files, relational databases, SQL queries) can be uploaded to and used in DVCS via Oracle Data Sync

  • The Oracle Data Visualization Cloud Service REST API can also be used to programmatically load on-premises data into a data set that you can explore

 

Sample Hybrid Cloud Architecture with OBIA, OTBI and DVCS

(Architecture diagram: OOB and custom OBIEE business logic and reports, with connectivity via Remote Data Connector or RDP.)

 

  • DVCS can be used to create visualization stories and projects directly from Oracle Applications data sources, through Oracle Transactional Business Intelligence (Oracle BI EE analyses and subject areas)

  • Data extracted from the cloud ERP into an on-premises data warehouse can then be used to create data visualization analyses and projects

 

DVCS Security Features

  • Identity domain

    • Identity Domain is a construct for managing certain features of Oracle Cloud including DVCS

    • The identity domain controls the authentication and authorization of users who sign in to Oracle Cloud services including DVCS

    • Several predefined roles and user accounts are available when DVCS is provisioned in an identity domain

    • If one has EBS in the cloud and uses DVCS to connect directly to OTBI subject areas, the two cloud services can share the same identity domain or use separate ones. The choice of identity domain should be specified in the contract, but in theory both options are possible

    • The roles in the identity domain are recognized in WebLogic

  • SSO

    • SSO is a token service for authentication while an identity domain allows you to manage users and roles.

    • One can integrate existing Oracle SSO with the identity domain of DVCS

  •  Application Roles

    • An application role comprises a set of privileges that determine what users can see and do after signing in to Oracle Data Visualization Cloud Service

    • There are two types of application roles:

      • Predefined

      • User Defined

  • Data Level Security

One cannot control data visibility within DVCS, only object visibility based on application roles/users: you can control which reports a user can see, but not what data the user sees in them. This is because no modeling is done within DVCS, so no data filters can be applied. Data visibility is driven by the username used to connect to OTBI - whatever data that user sees will be visible within DVCS.

 

Conclusion

DVCS is a powerful addition to the Oracle BI tools. It provides self-service data discovery features with a rich UI and strong visualization capabilities, and it is a step in the right direction towards incorporating modern BI and analytics features. It supports cloud and hybrid deployments. DVCS can be used for root-cause-analysis data visualization projects. Backed by Oracle's strong infrastructure, it provides an attractive option for Oracle's install base. It is not typically used for creating reports with large data volumes or large numbers of columns; using BICS, OBIEE, or OAC, one can develop tabular reports with 500+ or even 1,000+ columns. In its current release, it is not possible to print or share (for example, create a PDF and then send it via email) analyses created from the catalog; those buttons are disabled.

 

 

References

https://docs.oracle.com/en/cloud/paas/data-visualization-cloud/bidvc/getting-started-data-visualization.html

 

November 9, 2017

Transaction matching of around two million records in under 5 minutes in ARCS

Oracle Account Reconciliation Cloud Service (ARCS) with Transaction Matching is a cloud-based reconciliation platform with pre-built configurations and adherence to industry best practices - a recommended solution for your reconciliation and matching needs.

Transaction Matching is a module within ARCS which inherits the features that facilitate preparation and review of reconciliations.

  • Additionally, Transaction matching adds efficient automation of the detailed comparison of transactions in two or more data sources
  • The calculation engine allows for intuitive "netting" of transactions within each data source to provide output which is easy to work with
  • Flexibility in the setup and timing of the process allows you to minimize the effort during "crunch time" and reduce risk

 

Transaction Matching Use Cases

Typical Transaction Matching Use Cases are shown below.

 

(Figure: typical Transaction Matching use cases)

Often clients need to match more than a million records between two source systems with complex match-set rules. We have seen clients spend hours trying to match them manually in Excel, or using solutions like an Access database or Oracle tables, which can be very time consuming and prone to data quality issues. We will share our experience and some insights on how we successfully loaded and matched two source files with around 2 million records in less than 5 minutes using the Transaction Matching feature of ARCS for one of our e-commerce clients.

Idea Inception

The client wanted to match up to 2 million records from their point of sale (POS) system against the details obtained from their merchant transaction system. They were using an Access database for this activity, which took hours to produce results, and they reached out to Infosys to help them streamline this time-consuming and frustrating process.

 

Solution and Approach

Source Files.

1. Point of Sale transaction file

    The POS file had 9 columns and was provided in .txt format (a PDF report converted into a text file). Below is a snapshot of the file.

(Figure: snapshot of the POS transaction file)

2. Merchant system transaction file

           The Merchant system transaction file had 21 columns and was in .csv format. Below is a snapshot of the file.

(Figure: snapshot of the Merchant system transaction file)

Matching rules

The client wanted the matching rules to be based on the card number and amount from the POS transaction file matching the cardholder number and amount from the Merchant transaction file, with a many-to-one stipulation: many transactions from the point of sale system match a single batch transaction (grouped by amount) from the Merchant system file.

 

Initial Challenges

The initial challenges with this requirement were as follows:

1. Size of File.

    The files provided were huge: there were 9 and 21 columns respectively, and both files had around 2 million records, resulting in file sizes of over 1 GB per file. A file this large is difficult to read and edit in any text editor.

2. Formatting

    Another, bigger challenge was formatting the given files to meet ARCS Transaction Matching requirements. The files provided were in text format, and reading and reformatting them given their size was a tough nut to crack.

 

Infosys Solution

We took this challenge and delivered as promised. The biggest challenge was to import the files containing about 2 million transactions into ARCS Transaction Matching from both systems and match them automatically in quick time. Other tools and custom solutions were taking hours for this process. A csv file with 2 million records is a huge input for any system to ingest; it would typically take anywhere between 15 and 30 minutes just to import one such file. We had a further challenge in formatting the files, because the files we received were PDF reports converted into text format, and they needed to be converted to .csv to be accepted by ARCS Transaction Matching. We used Oracle ARCS TM, formatting tools, text editors, and the Oracle-provided EPM Automate utility to format the files, then automatically ingest and auto-match the files from the two transactional systems.

 

The EPM Automate utility enables Service Administrators to remotely perform tasks within Oracle Enterprise Performance Management Cloud instances and to automate many repeatable tasks, such as importing and exporting metadata, data, artifact and application snapshots, templates, and Data Management mappings.

 

Tips and Lessons Learnt

While implementing the above requirement we learned a few lessons; below are some tips for implementing a similar type of solution.

  • ARCS TM also accepts .zip input files, so compress the files into .zip format; they are smaller in size and quicker to upload to the ARCS cloud (see the sketch after this list).
  • Use a powerful text editor such as Notepad++ or TextPad when formatting the files.
  • Create custom attributes that can be used in matching rules for faster auto-matching of transactions.
  • If possible, try to get the export from the transaction systems in .csv format to reduce conversion time.
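
As a rough illustration of the tips above, a minimal batch sketch that zips a formatted extract, uploads it, and runs auto-match via EPM Automate. The URL, credentials, file paths, 7-Zip location, and match type name are placeholder assumptions, and the runautomatch arguments should be verified against your EPM Automate version.

@echo off
rem Hypothetical upload-and-match flow for ARCS Transaction Matching (all names and paths below are placeholders)
SET url=https://example-arcs.oraclecloud.com
SET user=svc_arcs
SET password=D:\Oracle\password.epw
SET IdentityDomain=exampledomain

rem Compress the formatted CSV so the upload is smaller and faster
"C:\Program Files\7-Zip\7z.exe" a D:\Data\pos_transactions.zip D:\Data\pos_transactions.csv

call epmautomate login %user% %password% %url% %IdentityDomain%
rem Upload the compressed transaction file to the service
call epmautomate uploadfile D:\Data\pos_transactions.zip
rem Run auto-match for the match type ("POS to Merchant" is a placeholder name)
call epmautomate runautomatch "POS to Merchant"
call epmautomate logout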

Performance Metrics

Below are our performance metrics from implementing the client's requirement of matching around 2 million records using Oracle ARCS Transaction Matching.

 

Import of POS file (a million records) - 27 seconds

Import of Merchant file (a million records) - 61 seconds

Run Auto Match - 53 seconds

 

Complete Process - 2 minutes 21 seconds (Less than half of 5 minutes)
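
For the record, the three step timings sum exactly to the quoted total:

\[
27\,\text{s} + 61\,\text{s} + 53\,\text{s} = 141\,\text{s} = 2\ \text{minutes}\ 21\ \text{seconds}
\]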

 

Result?

 

Happy client and Happy us.

 

We deliver!!!! - Please visit our company website to learn more about our Account Reconciliation and Transaction Matching solutions.



November 8, 2017

Emerging Trend in SSHR- PART3

Continue reading " Emerging Trend in SSHR- PART3 " »

October 31, 2017

Oracle EBS to ARCS in one click - On Premise to CLOUD Automation made possible


If you are manually loading your data from a source system into the Oracle reconciliation system, the process can be cumbersome: receiving the data files by email or on a shared folder, converting them to an ARCS-readable format, loading them to the Data Management folder, and running the data load.

We will talk about a "One Click" automated solution for this, which we have implemented at several of our ARCS/ARM customers and have also leveraged for integrations of other Cloud modules.


What to do?

Let's consider a common scenario where EBS is the source and ARCS is the target.

To automate the data load from the source system to ARCS, the procedure below can be followed:

  • Write a program in EBS (for example, in PL/SQL), one for each data file, to run queries that generate the source data files. The program can generate .csv or .txt files, based on the import format configuration in Data Management, and should place the source data files on a shared drive. Data Management comes with ARCS and is used to import data into ARCS through data files.
  • For automating the ARCS manual tasks, EPM Automate comes to the rescue. EPM Automate is an Oracle-provided utility for performing administrative tasks using REST APIs within Oracle EPM cloud instances. So, prepare an EPM Automate batch file (see the sketch after this list) to have
    • Commands to upload the source data files from the shared drive to the Data Management inbox folder. In this case, the "epmautomate uploadfile" command can be used (make sure that the user ID used to log in to EPM Automate has access to the shared folder).
    • One command to run the data load in ARCS. In this case, the "epmautomate importbalances" command can be used (make sure that the data load definition has already been created in ARCS and that the definition has all desired locations selected).
  • When the batch file runs, the data load, based on the definition, triggers the underlying data load rules to run sequentially in Data Management.
  • The programs in EBS and the EPM Automate batch file can both be scheduled to run daily, bi-weekly, or weekly as required.
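
A minimal sketch of such a batch file, using only the two commands named above; the URL, user, password file, share path, and data load definition name are placeholder assumptions, so adjust them (and verify the command arguments against your EPM Automate version) before use.

@echo off
rem One-click load from the shared drive into ARCS via EPM Automate (all values below are placeholders)
SET url=https://example-arcs.oraclecloud.com
SET user=svc_arcs
SET password=D:\Oracle\password.epw
SET IdentityDomain=exampledomain

call epmautomate login %user% %password% %url% %IdentityDomain%
rem Upload each extract produced by the EBS program (repeat this command per data file)
call epmautomate uploadfile \\sharedrive\arcs\gl_balances.csv
rem Trigger the predefined data load definition in ARCS ("Daily GL Load" is a placeholder name)
call epmautomate importbalances "Daily GL Load"
call epmautomate logout

The same file can then be scheduled with the Windows Task Scheduler to achieve the daily/bi-weekly/weekly cadence described above.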


Below are our metrics from one such automation implementation.

Extract source data files from EBS through automated programs - 20 minutes; reduced by 50%. The manual effort of extracting the source data files one by one, converting each to a .csv file, and sending all files through mail is eliminated.

Data load to ARCS - 25 minutes; reduced by 50%. The manual effort of downloading the source data files from mail to a local machine, uploading each file to the Data Management inbox folder one by one, selecting the appropriate data file for each data load rule in Data Management, and running the data load from ARCS is eliminated.

 

Benefits -

No manual errors and high-quality data loads!!

No dependency on the availability of an expert!!

Better Performance!!


Unravel the mysteries around Equity Overrides

In this blog, I would like to share one of my experiences with HFM translation rules. It is still engraved in my memory because it was a major issue I faced in my early days, when I had just started playing around with HFM and was keen on unravelling the mysteries that revolve around translation issues. During one of the implementations, I gained a brief understanding of the historical rate concept, which we usually encounter when translating a subsidiary's equity.

So, before I proceed with the problem statement and the resolution, let me define the historical rate for beginners in HFM (especially those who, like me, do not have a background in finance). The historical rate is the exchange rate that prevailed at the time the transaction was consummated. Such transactions (mainly investment and equity) have to be translated at the historical rate rather than at the EOM or average rate. This requirement usually comes from clients reporting under multiple GAAPs - US GAAP in particular.

Let me describe the issue in a practical example:

Org A invests USD 100,000 in Org B, which reports in AUD, at an exchange rate of roughly 0.77 (USD per AUD), so the subsidiary receives AUD 130,000. In A's books, the investment is USD 100,000, while in the books of B there will be equity of AUD 130,000.

Now suppose the exchange rate later moves to 0.8. The translated equity of the subsidiary, i.e., Org B, becomes USD 104,000, whereas for Org A the investment still remains USD 100,000 in the books. Hence, at the time of elimination, the plug account captures an imbalance of USD 4,000, which arises purely from the incorrect exchange rate being used to translate the transaction; the actual transaction is nowhere to blame for the mismatch. There is therefore an urgent need for a solution that reports the correct investment and equity in B's books, or else the reported values would be incorrect.
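
Written out, with the historical rate pinned by the original transaction (100,000 / 130,000 ≈ 0.7692 USD per AUD), the plug is just the rate difference applied to the equity balance:

\[
130{,}000 \times 0.8 \;-\; 130{,}000 \times \frac{100{,}000}{130{,}000} \;=\; 104{,}000 - 100{,}000 \;=\; \text{USD } 4{,}000
\]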

Now, the first thing that struck me was: why don't we capture the translation through a rule that handles only the changes in equity and the differences in A's investment during the month? These would automatically be taken care of by the standard EOM rate, i.e., the balance sheet rate, which is predefined in the application. But there was a gap: suppose A invests in B on the tenth working day of the month. At that point in time, the rates are quite different.

Hence, the solution revolves around using equity overrides. But how to achieve this was another big question. It would benefit users by giving them complete control over the authentic translated values required to be shown in the balance sheets of the subsidiary organization.

We manually capture the historical rates for conversion through a separate data form defined in the application. The values are then translated at the historical rates by the rule file, overriding the actual rates. The difference that arises is captured in a separate account, which we refer to as the plug account for currency overrides, i.e., Foreign Currency Translation Adjustment (FCTA).


Continue reading " Unravel the mysteries around Equity Overrides " »

October 30, 2017

Facing Authentication issue for ADLDS using ActiveDirectoryAuthenticator

 

Some projects use Active Directory Lightweight Directory Services (AD LDS), formerly known as Active Directory Application Mode (ADAM), to set up external LDAP authentication in OBIEE.

In one of our projects, we had a scenario requiring LDAP authentication against AD LDS in OBIEE. We first configured the LDAP provider type as ActiveDirectoryAuthenticator and faced an authentication issue. Below I share our experience with that configuration and the solution that resolved the issue.

ActiveDirectoryAuthenticator as the authenticator:

When ActiveDirectoryAuthenticator was configured in WebLogic to authenticate against Active Directory Lightweight Directory Services (AD LDS), valid user IDs received authentication errors, even though the AD LDS users and groups were visible in the security realm.

Cause:

The ActiveDirectoryAuthenticator uses attributes that are incompatible with Active Directory Lightweight Directory Services (AD LDS). Specifically, it relies on the User-Account-Control (UAC) attribute, which is used in the full version of Active Directory but not in the lightweight version. Therefore, the default ActiveDirectoryAuthenticator cannot be used with AD LDS.

Generic LDAPAuthenticator as Authenticator:

Microsoft ADAM is a lightweight implementation of Active Directory that does not provide all the services of the complete Active Directory provider, so the Active Directory authentication provider should not be used to configure ADAM.

The solution for the above issue is to use the generic LDAPAuthenticator with AD LDS instead of the ActiveDirectoryAuthenticator. However, after configuring the generic LDAPAuthenticator, the Admin Server would not start, failing with the error below.

Error: a JPS error at Admin Server startup (screenshot not reproduced here).

Solution:

By default, JPS does not support the generic LDAPAuthenticator, which results in the server startup failure and the JPS error.

To enable the generic LDAPAuthenticator to integrate WLS with the AD LDS server, add the idstore.type property to the jps-config.xml file located under <domainhome>/config/fmwconfig, then restart the Admin Server service.
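
As a sketch of the change (the serviceInstance layout shown is the typical jps-config.xml shape, and the property value is an assumption - confirm the correct idstore.type value for AD LDS with Oracle Support):

<!-- jps-config.xml: add the property to the LDAP identity store service instance -->
<serviceInstance name="idstore.ldap" provider="idstore.ldap.provider">
    <!-- keep the existing properties, and add: -->
    <!-- assumed value; verify the value appropriate for your AD LDS setup -->
    <property name="idstore.type" value="ACTIVE_DIRECTORY"/>
</serviceInstance>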

 

Note: take a backup of jps-config.xml before modifying it.

Analytics and the APP!

 

Welcome Back!!! In parts 1 and 2 we started out to understand the concept of analytics and the app (or analytics on a mobile platform) and review a few case studies from different leading products - Kronos, Oracle, and SAP. In this concluding part we will look at the significance of these case studies and draw inferences as to how they impact the world of analytics...

 

Inferences:

 

We have seen three case studies across different verticals with varying backgrounds and use case scenarios. However, all share the common feature of using an analytics tool on a mobile platform, showcasing the versatility of this combination of Analytics and the App!

 

When organizations go mobile with analytics, they are able to extend the reach of information and empower people from every aspect of their business with the facts to make better, more informed decisions.

This is evident from the 2015 Mobile Business Intelligence Market Study by Dresner Advisory Services:


  • Mobile business intelligence (BI) is the third most common use case for mobile business workers, next to e-mail and contact information

  • 90% of study participants said mobile BI is important to their business

  • In a surprising finding of the Dresner market survey (*), business intelligence ranks as the third-highest priority among mobile applications, higher than social media and even personal banking, below only e-mail and basic phone services

 

*SOURCE - Wisdom of Crowds ® Mobile Computing/ Mobile Business Intelligence Market Study 2015

 

Trends observed during the research on case studies indicate the growing importance of Mobile analytics in different verticals - IT being the prominent horizontal across most of the industries. Some of the reasons for this are listed below:


  • Exploding usage of 'smart' mobile devices in general - personnel, org-wide, technological leap

  • Growing use of BYOD among enterprise employees - personnel get more opportunity to tap into the client systems and data as organizations open up accesses to employees.

  • Rapid use of mobile devices for other aspects of daily life - communication, mails, social media, entertainment - to make a convenient platform for including analytics.

  • Flexibility of usage and availability on-the-go. From being a straight-line process to being agile.

  • Advanced functionality of apps and devices - inducing enhanced parts and software.

  • Technology growth to aid predictive analysis and user data customization.

 

Suggestions/Future Prep: 


  • The concept of mobile analytics is widely known but still sees almost negligible application in practice. This gap could be leveraged further to achieve customer delight.

  • The analytics functionality on ERP systems remains a niche area. Consultants could be empowered with training on this module to also include the mobile apps that are usually pre-built for such applications.

  • Another option to explore would be providing sample tablet devices (iPad or Android) to the respective practices, to enable learning, hands-on practice, and PoC processes.

  • From the case studies and also from project experience, it is observed that even though customers may be aware of the implications of mobile analytics on their processes, a PoC is helpful in all cases to create the right focus to open up further avenues of engagement.


Conclusion:

 

The advent of the mobile platform has been another epoch-making event, probably ranking among the top 20 inventions/events that have significantly changed lifestyles across the world. Added to this, parallel advancements in related areas like data analysis, cloud computing, and big data, to name a few, have been instrumental in converging the big with the best, giving rise to a concept such as mobile analytics. Since this concept is still in its nascent stage, it provides great potential for further exploration to discover the myriad use case scenarios and adaptability, which could lead to several success stories of - Analytics and the App!


 

End of part 3...hope you found this interesting - Please do leave your valuable feedback!

Part1 :  http://www.infosysblogs.com/oracle/2017/09/analytics_and_the_app_1.html

Part2 :  http://www.infosysblogs.com/oracle/2017/10/analytics_and_the_app_2.html

 

Analytics and the APP!

 

Welcome Back!!! In part 1 we saw an example of analytics being used on a mobile platform - tablet - to realize the retail store objectives and gain advantage of real time data updates. In part 2 let us take a look at more case studies across similar leading products...

 

Case Study 2:

 

Scenario - The client is a US-based Fortune 500 energy and utilities company with approximately 10 million customers. Their integrated business model provides a strong foundation for success in a vertical that is experiencing dramatic change. They strongly believe in capitalizing on emerging trends and technologies to provide insight into operations that will drive value for their end customers.

 

Background - The organization uses Oracle, one of the top ERP applications, for their myriad business processes. As part of this PoC, the Infosys team set up custom analytics solutions for the client. Oracle's business intelligence tool OBIEE 12c is used here to showcase the length and breadth of the analytics tooling available across Oracle's wide array of modules.

 

Problem Statement - The client needed to do a comparative evaluation between two mobile analytics applications as part of their PoC to be reviewed by their senior management.

 

POC details - The PoC was aimed at demonstrating the OBIEE module's ability to work on a mobile platform. It also aimed to provide a comparative demo of features between MicroStrategy (another analytics tool) and the Oracle tools (apps). A set of commonly identified features was compared; in most cases a feature was available in both tools, but the way it was enabled differed between OBIEE and MicroStrategy.

 

Pilot & Feedback - For the pilot, the app was shared only with the senior management of the organization. The focus group was impressed to see that OBIEE could provide the features needed and appreciated the way they are achieved in OBIEE, which differed from their current applications. Furthermore, using OBIEE on mobile presented a unique but scalable scenario, as it proved to be a seamless extension of the existing suite of Oracle products, which meant a lower chance of data integrity issues. After the successful demo, the client is now evaluating an option of a complete migration to OBIEE, with a preference for the analytics app as it aligns with their established principles.

 

Being an energy and utilities company, the organization must always possess the latest status and forecasts in a rapidly changing environment with unpredictable trends. The analytics tool on mobile has brought the leadership very close to data and trends that were hitherto not accessible. Management can now make an informed decision much faster and just as easily track the results through OBIEE. The time and effort savings are also huge, since stakeholders can pull their own graphs and data analyses first-hand and without chance of error. As the gap between technology, user, and data/solution is greatly reduced, leadership is now very keen on applying this model to other areas of analytics.

 

Case Study 3:

 

Scenario - The client is a global tool manufacturing corporation with interests in construction, heavy equipment, and technological solutions. They excel through outstanding innovation, top quality, direct customer relations, and effective marketing. The client also has their own production plants as well as research and development centers in Europe and Asia. They pride themselves on integrating the interests of all their partners - customers, suppliers, and employees - into their growth and sustenance strategies.

 

Background - The client uses the SAP package and tools to run their analytics platform, integrating the various aspects of their business from planning to customer feedback and support. Combining areas like technology, design, real-time feedback, and automated order processing with metrics like quantity, geographical location, and customer database, the analytics tool (SAP's BI system) provides the necessary inputs for stakeholders to catch up on the latest available information and trends.

 

Problem Statement - The client needs an on-the-go platform to deploy their analytics solution to enable salesforce and service personnel to meet client demands as and when they arise in an automated fashion.

 

Introduction of Mobile Analytics - The organization has about two-thirds of its workforce employed directly for their customers in sales organizations and in engineering, averaging about 200,000 customer contacts every day. This entails a constant need to be up to speed with the latest end-customer data. A ready reckoner for this situation is SAP mobile analytics (otherwise known as Roambi), which most employees in the organization use on a daily basis. Furthermore, the entire solution is a cloud-based model, so they have the best of both worlds - cloud computing and mobile - which has proved to be very advantageous to their on-the-job salesmen, technicians, customer support staff, and even the top executives discussing an org-level strategy.


A real-life scenario involves the following situation:


  • A critical time bound customer order is not received at site on time.

  • However, the automated tracking tool, looking for the delivery report, has sensed it and raised an alert to the support center of the tools manufacturer.

  • This triggers the next set of established workflows in order to compensate for the delay in delivery.

  • Alerts sent to the nearest customer support personnel through a geo-fencing feature enable the employee to locate the nearest outlet/warehouse/distribution center for the right part.

  • The support person raises a request under the right priority and is able to head over to the site to personally supervise the final delivery

All this has actually taken place on-the-go using the mobile device loaded with the BI tools and supporting applications to augment the corrective actions.


In this particular scenario, even customer delight can be captured on the same mobile device as feedback, and back at the corporate HQ, top management can view a real-time heat map/graph showing customer satisfaction survey results processed seamlessly through the cloud.

 


End of part 2... in part 3 we will review the inferences and conclusion.

Part1 :  http://www.infosysblogs.com/oracle/2017/09/analytics_and_the_app_1.html

Part3 :  http://www.infosysblogs.com/oracle/2017/10/analytics_and_the_app_3.html


Continue reading " Analytics and the APP! " »

October 29, 2017

Migrate Oracle Hyperion Cloud Applications (PCMCS) on Autopilot!!! Using EPMAutomate




What is EPMAutomate?

The EPMAutomate utility helps automate administrators' activities for EPM Hyperion cloud products.


What is Migration and why is it required?

Migration of a cloud application is required to move the application from a Test instance to a Production instance and vice versa. Manual migration across instances can take hours; automating it with the EPMAutomate utility reduces that time from hours to minutes. Migration covers every artifact of the application - data, metadata, security, Data Management, reports, etc. - without manual intervention. Migration can be server-to-cloud or cloud-to-cloud, and it is always preferable to move backup artifacts from the server to the cloud. The example here is demonstrated with a PCMCS application.

Migration methods:


  1. Server to Cloud

  2. Cloud to Cloud


A) Steps to automate Server to Cloud Migration from daily backups using the EPMAutomate utility in PCMCS


  1. Log in to the PCMCS Workspace by entering the Identity Domain, User Name, and Password, and click Sign In



 

2. Delete the existing application from the instance, if present; the new application will be migrated and imported there from another instance.

Click Application->Application


3. Click 'X' to delete the application, and click Yes


 

4. Now, modify the attached reusable sample batch script with the relevant URL and user credentials to automate the migration process using the EPMAutomate utility

Sample Script:


@echo off

rem This script is used to perform On-Demand migration of Artifact Snapshot using

rem server backups i.e server to cloud migration

rem Update the parameters: url (Target url), Folder (source folder) and SnapshotName as per requirement

rem Please make sure application has been deleted from target instance before importing snapshot into it

SET url=

SET user=abc

SET password=D:\Oracle\password.epw

SET IdentityDomain=

SET Folder=D:\Oracle

SET SnapshotName=Artifact_Snapshot_09_13_2017_1322

SET UploadedSnapshot=%SnapshotName%.zip


call epmautomate login %user% %password% %url% %IdentityDomain%

timeout 10

call epmautomate uploadfile %Folder%\%UploadedSnapshot%

timeout 8

call epmautomate importsnapshot %SnapshotName%

timeout 10

call epmautomate logout           



5. Trigger the .bat script; it uploads the relevant snapshot to the cloud


 

 

6. Once the migration completes, check the migration report in the PCMCS Workspace.

Click Application->Migration->Reports->Migration Status

  

B) Steps to automate Cloud to Cloud Migration from daily backups using the EPMAutomate utility in PCMCS



  1. Follow steps 1 to 3 from section A

  2. Use the script below to migrate the artifact snapshot from one cloud instance to another

    The 'copysnapshotfrominstance' command is used to move an artifact snapshot across instances in the cloud

     

    Sample Script:

    @echo off

    rem This script is useful to migrate Artifact Snapshot from Test to Production instance 

    rem Update the following parameters based on requirement

    SET url=

    SET FromURL=

    SET user=

    SET password=D:\Oracle\password.epw

    SET IdentityDomain=

    SET SnapshotName=Artifact Snapshot

    call epmautomate login %user% %password% %url% %IdentityDomain%

    timeout 10

    call epmautomate copysnapshotfrominstance "%SnapshotName%" %user% %password% %FromURL% %IdentityDomain%

    timeout 8

    call epmautomate importsnapshot "%SnapshotName%"

    timeout 10

    call epmautomate logout


3. The rest of the steps are the same as in section A

Reference(s)

https://docs.oracle.com/cloud/latest/epmcommon/CEPMA/epm_automate_command_links_pcmcs.htm#CEPMA-GUID-6BC610D3-03F0-41C4-8C52-FA1EE972D03F



 


 


October 27, 2017

Replace Query tracked Aggregate views of an ASO cube to another cube and aggregate

An Essbase Aggregate Storage (ASO) database is a multidimensional database. Aggregate storage enables dramatic improvements in database aggregation.

Aggregation:
Aggregation is the process of calculating and storing the data of aggregate views to enhance retrieval performance.
For an ASO cube we can perform default aggregation and query tracking aggregation, as illustrated in the sketch below.
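
For readers new to the two aggregation modes, here is a minimal MaxL sketch; the application and database names (ASOSamp.Sample) and the stopping threshold are illustrative assumptions, so verify the statements against your Essbase version.

/* Default aggregation: let Essbase select and build views up to a size limit */
execute aggregate process on database ASOSamp.Sample stopping when total_size exceeds 1.2;

/* Query-tracking aggregation: record actual retrieval patterns, then select views based on them */
alter database ASOSamp.Sample enable query_tracking;
/* ...run a representative set of user queries here... */
execute aggregate selection on database ASOSamp.Sample based on query_data;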

Continue reading " Replace Query tracked Aggregate views of an ASO cube to another cube and aggregate " »
