Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.

December 17, 2018

Cross Pollination of Solutions across Industry Verticals

 

During a discussion at work, a friend and I got to thinking about how solutions created for one industry can be leveraged in other industry verticals. I would like to share some thoughts on this in this blog.

The typical evolution of COTS applications is as follows:

BlogImage2.png







Most mature COTS packages are past the first two stages of evolution and are looking at maximizing their investment by expanding the applicability of their solutions across industry verticals.  

Let us look at how Billing Solutions that were created for the Telecom industry are relevant in other industries.



Continue reading "Cross Pollination of Solutions across Industry Verticals" »

November 12, 2018

Incremental Loads on In Memory Tables

Preparing the Database for In-Memory Capability:

 

Changing Database Compatibility Level:

 

This is needed for parallel queries and auto-update of statistics in the database.

ALTER DATABASE FACILITIES_FARM_DB SET COMPATIBILITY_LEVEL = 130;

 

Enabling MEMORY_OPTIMIZED_ELEVATE_TO_SNAPSHOT Option:

 

Enable the database option MEMORY_OPTIMIZED_ELEVATE_TO_SNAPSHOT to avoid having to use the WITH (SNAPSHOT) hint in ad hoc queries that access memory-optimized tables.

ALTER DATABASE FACILITIES_FARM_DB
SET MEMORY_OPTIMIZED_ELEVATE_TO_SNAPSHOT = ON;

 

Add a FILEGROUP and CONTAINER:

 

  • Only one memory-optimized filegroup can be created per database. All table data and metadata are stored in this filegroup.
  • Create a filegroup named FACILITIES_FARM_DB_mod using the statement:

ALTER DATABASE FACILITIES_FARM_DB
ADD FILEGROUP FACILITIES_FARM_DB_mod CONTAINS MEMORY_OPTIMIZED_DATA;

 

  • At least one container must be added to this filegroup. We now add a container to the FACILITIES_FARM_DB_mod filegroup using the statement below:

ALTER DATABASE FACILITIES_FARM_DB
ADD FILE (name='FACILITIES_FARM_DB_mod2', filename='F:\data\FACILITIES_FARM_DB_mod2')
TO FILEGROUP FACILITIES_FARM_DB_mod;

This adds the MEMORY_OPTIMIZED_DATA filegroup. The same can also be done through the UI in SSMS.

Create a schema:

 

CREATE SCHEMA INFY_SCHEMA;

GO

Creating a Linked Server:

 

To load data into the in-memory tables, I chose another database, in which the source tables reside, as my source. This is a transactional (OLTP) database where records arrive every second. Every record in these tables also captures an update_dt, which we will use to build the incremental strategy. To fetch data from the tables in this database, we need to link it to our FACILITIES_FARM_DB using the Linked Server setup in SSMS.

Linked Server can be added from Server Objects in SSMS.

Select SQL Server as the provider, give the linked server a name, and enter the server details of your source database.

Using these steps, I created a linked server named FACILITIES_MASTER_DB_SERVER, which connects to FACILITIES_MASTER_DB and lets me browse and reference the tables in the master database.
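The same linked server can also be created with T-SQL instead of the SSMS dialogs. A minimal sketch, assuming the source instance's host name (SOURCEHOST below is a placeholder) and that the caller's own login is valid on the source server:

```sql
-- Create the linked server (host name is a placeholder for your source instance)
EXEC sp_addlinkedserver
     @server     = N'FACILITIES_MASTER_DB_SERVER',
     @srvproduct = N'',
     @provider   = N'SQLNCLI',
     @datasrc    = N'SOURCEHOST';

-- Map logins: here, connect using the caller's own credentials
EXEC sp_addlinkedsrvlogin
     @rmtsrvname = N'FACILITIES_MASTER_DB_SERVER',
     @useself    = N'True';
```

The security mapping shown is the simplest option; in practice the mapping depends on how authentication is set up between the two servers.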

Creating an In-Memory Table:

 

The points to ponder before we create memory-optimized tables:

  • Changing the database compatibility level to 130 (Microsoft's suggestion for memory-optimized tables)
  • Enabling the database option MEMORY_OPTIMIZED_ELEVATE_TO_SNAPSHOT (prevents locking issues with ad hoc queries)
  • Enabling the MEMORY_OPTIMIZED option on the table
  • Setting DURABILITY to SCHEMA_AND_DATA (persists both the schema and the data of the table to disk)

Let's create a new in-memory optimized table called INFY_PMU_ORDER in schema INFY_SCHEMA.

The CREATE TABLE script for a memory-optimized table is a regular script, but we need to append two options at the end. Also, a nonclustered primary key index is mandatory.

CREATE TABLE [INFY_SCHEMA].[INFY_PMU_ORDER](
    [ROW_WID] [int] NOT NULL PRIMARY KEY NONCLUSTERED,
    [INSERT_DT] [datetime] NULL,
    [UPDATE_DT] [datetime] NULL,
    [PROJECT_ID] [varchar](32) NULL,
    [ORDER_NUMBER] [varchar](4000) NULL,
    [TYPE] [varchar](100) NULL,
    [ORDER_TYPE] [varchar](30) NULL,
    [COMPLETION_DATE] [datetime] NULL
) WITH
    (MEMORY_OPTIMIZED = ON,
     DURABILITY = SCHEMA_AND_DATA);
GO

Creating a Control Table:

 

A control table is used to maintain and monitor the delta records. It typically holds the load start time, the load end time, and the status of the load (running/success). In the control table we use, there will be a record for each table every day with these typical fields plus the counts of records affected by each UPDATE and each INSERT. The structure of our control table is as follows:

CREATE TABLE [INFY_SCHEMA].[CONTROL_TABLE](
    [SNO] [int] NULL,
    [BATCH_ID] [int] NULL,
    [CURR_ETL_RUN_DATETIME] [datetime] NULL,
    [LAST_ETL_RUN_DATETIME] [datetime] NULL,
    [STATUS] [nvarchar](50) NULL,
    [TABLE_NAME] [nvarchar](50) NULL,
    [UPDATE_COUNT] [int] NULL,
    [INSERT_COUNT] [int] NULL,
    [SOURCE] [nvarchar](20) NULL,
    [LOAD_TYPE] [varchar](30) NULL
) ON [PRIMARY]
GO

For every incremental load, we will update the table's record with the insert and update counts, along with the batch id and sno.

Stored Procedure with Incremental Logic:

 

This stored procedure implements the incremental logic for all the in-memory tables. The source is the Datahub database, and a linked server must be created between the Datahub DB and the replicator DB. Once the linked server is established, the incremental strategy is implemented using a set of INSERT and UPDATE statements. The steps for each table are:

1. Insert the full-load record in the control table
2. Get the max start datetime from the control table into a variable, say @UpdateDt
3. Insert a default record into the control table with status 'RUNNING'
4. Update the existing records in the respective table based on @UpdateDt
5. Get the row count using @@ROWCOUNT and update the UPDATE_COUNT field in the control table
6. Insert the new records in the respective table based on @UpdateDt
7. Get the row count using @@ROWCOUNT and update the INSERT_COUNT field in the control table
8. Update the relevant record in the control table with status 'SUCCESS'

The above steps are repeated in a block for each table you need, and thus the procedure is built.

Here is the script for this procedure:

 

CREATE PROCEDURE [dbo].[Sp_facilities_farm_incremental]
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @UpdateDt DATETIME
    DECLARE @RowCount1 INTEGER
    DECLARE @RowCount2 INTEGER

    /************************ INFY_PMU_ORDER ***********************************/
    SET @UpdateDt = (SELECT Max(curr_etl_run_datetime)
                     FROM   infy_schema.control_table
                     WHERE  status = 'SUCCESS'
                       AND  table_name = 'INFY_PMU_ORDER')

    INSERT INTO infy_schema.control_table
                ([sno], [batch_id], [curr_etl_run_datetime], [last_etl_run_datetime],
                 [status], [table_name], [update_count], [insert_count],
                 [source], [load_type])
    VALUES      ((SELECT Max(sno) + 1
                  FROM   infy_schema.control_table
                  WHERE  (status = 'SUCCESS' OR status = 'RUNNING')),
                 (SELECT Max(batch_id) + 1
                  FROM   infy_schema.control_table
                  WHERE  table_name = 'INFY_PMU_ORDER'
                    AND  (status = 'SUCCESS' OR status = 'RUNNING')),
                 Getdate(),
                 (SELECT Max(curr_etl_run_datetime)
                  FROM   infy_schema.control_table
                  WHERE  table_name = 'INFY_PMU_ORDER'
                    AND  status = 'SUCCESS'),
                 'RUNNING',
                 'INFY_PMU_ORDER',
                 NULL,
                 NULL,
                 'DATAHUB',
                 'INCREMENTAL');

    -- Update existing rows that changed since the last successful load
    UPDATE tab1
    SET    tab1.insert_dt = tab2.insert_dt,
           tab1.update_dt = tab2.update_dt,
           tab1.project_id = tab2.project_id,
           tab1.order_number = tab2.order_number,
           tab1.type = tab2.type,
           tab1.order_type = tab2.order_type,
           tab1.completion_date = tab2.completion_date
    FROM   [INFY_SCHEMA].[infy_pmu_order] AS tab1
           INNER JOIN (SELECT [row_wid], [insert_dt], [update_dt], [project_id],
                              [order_number], [type], [order_type], [completion_date]
                       FROM   [FACILITIES_MASTER_DB_SERVER].[FACILITIES_MASTER_DB].[INFY_SCHEMA].[infy_pmu_order]
                       WHERE  update_dt >= @UpdateDt) tab2
                   ON tab1.row_wid = tab2.row_wid;

    SELECT @RowCount1 = @@ROWCOUNT;

    UPDATE infy_schema.control_table
    SET    update_count = @RowCount1
    WHERE  sno = (SELECT Max(sno)
                  FROM   infy_schema.control_table
                  WHERE  table_name = 'INFY_PMU_ORDER'
                    AND  status = 'RUNNING');

    -- Insert rows that are new since the last successful load
    INSERT INTO [INFY_SCHEMA].[infy_pmu_order]
                ([row_wid], [insert_dt], [update_dt], [project_id],
                 [order_number], [type], [order_type], [completion_date])
    SELECT [row_wid], [insert_dt], [update_dt], [project_id],
           [order_number], [type], [order_type], [completion_date]
    FROM   [FACILITIES_MASTER_DB_SERVER].[FACILITIES_MASTER_DB].[INFY_SCHEMA].[infy_pmu_order]
    WHERE  update_dt >= @UpdateDt
      AND  row_wid NOT IN (SELECT row_wid
                           FROM   [INFY_SCHEMA].[infy_pmu_order]);

    SELECT @RowCount2 = @@ROWCOUNT;

    UPDATE infy_schema.control_table
    SET    insert_count = @RowCount2
    WHERE  sno = (SELECT Max(sno)
                  FROM   infy_schema.control_table
                  WHERE  table_name = 'INFY_PMU_ORDER'
                    AND  status = 'RUNNING');

    UPDATE infy_schema.control_table
    SET    status = 'SUCCESS'
    WHERE  sno = (SELECT Max(sno)
                  FROM   infy_schema.control_table
                  WHERE  table_name = 'INFY_PMU_ORDER'
                    AND  status = 'RUNNING');

    RETURN 1
END

To execute the procedure, go to:

FACILITIES_FARM_DB -> Programmability -> Stored Procedures -> Sp_facilities_farm_incremental (right click) -> Execute Stored Procedure
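Equivalently, the procedure can be run from a query window. The follow-up query is an illustrative check (not part of the original steps) to confirm the latest control-table record was marked SUCCESS:

```sql
EXEC [dbo].[Sp_facilities_farm_incremental];

-- Inspect the most recent control-table entry for the table just loaded
SELECT TOP 1 *
FROM   INFY_SCHEMA.CONTROL_TABLE
WHERE  TABLE_NAME = 'INFY_PMU_ORDER'
ORDER  BY SNO DESC;
```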


Scheduling a SQL Agent Job:

 

Now schedule the procedure created in the previous step using SQL Server Agent in SSMS. Once the job is named in the General tab, set the timing and repeat properties you need in the Schedules tab.
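If you prefer scripting the job over the SSMS dialogs, a sketch using the msdb scheduling procedures is below. The job name and the 15-minute cadence are illustrative assumptions, not values from this post:

```sql
USE msdb;
GO
-- Create the job, its single T-SQL step, a recurring schedule, and attach it to this server
EXEC dbo.sp_add_job         @job_name = N'FacilitiesFarmIncremental';
EXEC dbo.sp_add_jobstep     @job_name = N'FacilitiesFarmIncremental',
                            @step_name = N'RunIncremental',
                            @subsystem = N'TSQL',
                            @database_name = N'FACILITIES_FARM_DB',
                            @command = N'EXEC dbo.Sp_facilities_farm_incremental;';
EXEC dbo.sp_add_jobschedule @job_name = N'FacilitiesFarmIncremental',
                            @name = N'Every15Minutes',
                            @freq_type = 4,             -- daily
                            @freq_interval = 1,
                            @freq_subday_type = 4,      -- repeat in units of minutes
                            @freq_subday_interval = 15;
EXEC dbo.sp_add_jobserver   @job_name = N'FacilitiesFarmIncremental';
```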

 

Creating Indexes on Memory Optimized Tables:

 

The usual CREATE INDEX statement doesn't work on in-memory tables. Instead, we need to use the ALTER TABLE ... ADD INDEX syntax.

ALTER TABLE INFY_SCHEMA.INFY_PMU_ORDER ADD CONSTRAINT constraint_Unique_PI
    UNIQUE NONCLUSTERED (UPDATE_DT DESC, PROJECT_ID);

ALTER TABLE INFY_SCHEMA.INFY_PMU_ORDER ADD INDEX hash_projectid
    HASH (PROJECT_ID) WITH (BUCKET_COUNT = 64);  -- Nonunique

Continue reading "Incremental Loads on In Memory Tables" »

November 9, 2018

Importance of CPQ for Industrial Manufacturing

 

Most industrial manufacturing organizations deal with constantly changing market dynamics and variables that make configuring accurate sales quotes a complex process. Existing pricing and quoting tools and processes across industries are disparate, stand-alone custom applications, with repeated manual data entry in spreadsheets and formula-based price estimates, resulting in data duplication, inconsistent BOMs, and inaccurate estimates with no historical insights, reporting, or analytics. This repetitive, manual, and tedious work for Sales to capture product requirements, price them right, and present them to customers often takes weeks or months before converting to an order, which results in lost opportunities and reduced sales efficiency.

A robust Configure-Price-Quote (CPQ) system is thus essential in today's dynamic and competitive markets to provide faster, consistently accurate estimates to your customers and to improve sales efficiency by automating the quoting process, capturing and processing all product configurations and pricing details in a single system.

The most important cog wheels to achieve a unified and harmonized CPQ process are:


  • Configurator: Industries today want to cater to specific business needs of their customers and thus need to provide their Sales with highly configurable products to generate unique sales BOM that can serve the purpose effectively. The configurator must be able to support complex catalog hierarchies and product selections in a guided flow bound by various rules to ensure accurate sales BOM is generated automatically. The configurator must also support logical bundling of products & associated components & services to allow different selling models to be readily adopted by Sales, distributors, resellers and partners.
  • Price Engine: A competitive and dynamic pricing strategy with pre-defined rules based on product attributes, service plans, price contracts or agreements etc. will improve the win probability and provide edge over market competition. It also ensures that discounts & margins are in control and go through systematic reviews and approvals to create accurate and winning quotes to end customers.
  • Quote Life-cycle Management: A complete opportunity-quote-order (Quote to Cash) process that manages and tracks quotes or contracts all the way to order creation and fulfillment through ERP systems, fully integrated end-to-end with an upstream CRM system for opportunity management. The system must be able to automatically generate quotes from pre-defined dynamic templates that can be tailored to present quote information according to the currency and language preferences of different regions of the world.


Benefits of CPQ for Industrial Manufacturers

  • Improved Time to Market: Product catalogs and bundles are constantly changing to keep up with customer demands. CPQ enables you to logically organize, expand, and update product catalogs and complex configuration and pricing rules, ensuring your sales teams are always building accurate proposals and closing deals on time.
  • Multiple Sales Channels: CPQ creates more valuable customer experience through guided selling and dynamic pricing and also brings internal sales teams, partners, distributors and resellers onto a common selling platform improving margins by identifying more upsell and cross-sell opportunities.
  • Improved Sales Efficiency: CPQ equips sales teams with product knowledge and engineering expertise, eliminating rework and the other costs that come with an inability to deliver as promised. It also removes the need to manually enter data into multiple systems, freeing sales reps to concentrate on selling products and services to customers.
  • Maximize Revenues with Improved Margins: CPQ ensures optimal pricing and allows you to better manage promotions and discounts, achieving maximum revenue and margin on every deal. It also provides an easier, faster buying experience for your customers, with shorter turnaround times and improved sales velocity.


In the digital era, where customer experience (CX) is a top priority for industrial manufacturing organizations, CPQ technology is a must-have to drive sales productivity and customer engagement while reducing costs, helping you edge ahead of the competition in disruptive markets.

 

Other useful links:

http://www.infosysblogs.com/oracle/2017/03/cpq_for_industrial_manufacturi.html

Continue reading "Importance of CPQ for Industrial Manufacturing" »

October 26, 2018

Workforce management (WFM) -There is an app for it!!!

"There is an app for it" - more often than not, this is the response to a question.

As the workforce continues to evolve and transform, workforce management tactics and processes also need to embrace the coming trends. Mobility was the buzzword a few years back, and it is now being augmented by a new trend: employee experience (EX).

EX can be summarized as Self-service with a personalized experience on a mobile device.

Questions that an organization needs to address to be able to create the ultimate EX app:

  • How do we empower our employees/managers to be more independent (so that they spend less time managing the business and more time growing it)?
  • How can we reduce the time spent on mundane daily tasks?
  • How can we help our managers plan better and spend more time with their teams than with systems?

Some key tenets of an ultimate EX app:


dt1.png

Source: created by author

"Less is more"! It is not about making one great all-in-one product. In fact, it's about how seamless the transactions are across products.

If an organization can enhance the employee experience, it will be able to create a more involved and productive workforce that helps the business achieve its goals.

A study from IBM and Globoforce found a connection between a positive employee experience and more engaged work performance and reduced attrition. Companies with engaged employees outperform those without by 202%; customer retention rates are 18% higher when employees are highly engaged; and organizations with more than 50% employee engagement retain more than 80% of their customers. Engagement is key to every company's success, and employee detachment amounts to a loss of over $500 billion per year in the US economy.

dt2.png

How does one get started on the EX journey?

The primary goal of the employee experience app is to create supplementary value for its users, which is why the use cases being rolled out are of utmost importance.

Training is extremely important. While millennials will love to use a "cool" app, the entire workforce should be able to see value in adopting newer tools. Hence the message of 'simplification and acceleration of administrative tasks' becomes critical.

The app itself needs to be highly interactive so that all user transactions flow seamlessly across systems.


The cherry on top for a successful experience is the customization capability of the app, which allows users to define their own look and feel.


dt3.png


Source: created by author


Let's look at couple of examples that would help understand this better.

Example 1: Daily punches - a no-click solution!

Solution: no-click punches via the mobile app. The tech in the background would leverage geofencing + biometrics (voice/face recognition) + the employee's registered mobile phone number.

Detail: The employee walks into the office -> gets a push notification "Hey, you have reached the office. Would you like to punch IN?" -> the employee responds via voice "Yes, please" -> the punch is authenticated and recorded.

Flow:


dt4.png


Source: created by author


Example 2: Applying time off - talk to the app!

Solution: voice-enabled time-off requests. The tech in the background would leverage a voice-enabled chatbot.

Detail: Employee opens the app and conversation would be as below:

Employee: Please apply Vacation for tomorrow

Bot: Would you like to apply full day or half day vacation?

Employee: Full day

Bot: Please confirm with a yes or no. Applying full day vacation for *insert date*

Employee: Yes

Bot: Vacation request successfully submitted


Flow:


dt5.png


Source: created by author


This is a classic "less is more" example, allowing the user to have a seamless transaction while the tech running in the background is a complex combination of algorithms.

In addition to considering the framework/technology, it's also important to think about what goes inside the app. A mobile solution requires a new approach to content. When thinking about the app, remember that it shouldn't be a dumping ground for old webpages.

Personalized Mobility is the future of the workforce, and an app will enable employers to connect to their employees in new and exciting ways.

As a result, they get an engaged and empowered workforce that spends less time managing the business and more time growing it, thereby increasing customer retention rates by ~18%.

"App" IT - now!!!

October 24, 2018

OAC Data Visualization (DVD/DVCS) New Features

Oracle Analytics Cloud (OAC) is maturing in the industry and gaining customer trust, so Oracle keeps adding new and innovative features to make the product more usable and more compatible with today's ever-changing market.

A number of features have been added in OAC DV versions 4 and 5 that provide a whole new set of functionalities, enabling OAC users to harness features and functionality like OBIEE's. We discuss a few of the new and exciting features in the article below, with some live examples:

  • Drill via Existing Column in Same Visualization
  • Drill to Another Canvas or URL
  • Use One Analysis as a Filter for the Whole Canvas
  • Pattern Brushing
  • Synchronized Visualization
  • Explain Feature

Continue reading "OAC Data Visualization (DVD/DVCS) New Features" »

October 22, 2018

Supply Chain Planning on Oracle Cloud for Industrial Manufacturers

 

Digitalization is changing our world in profound ways. Product development cycles are decreasing and delivery models are dramatically evolving. Software-enabled innovations are creating new service-based business models that are replacing existing products and re-ordering industry dynamics seemingly overnight. Industrial manufacturers of all kinds share a common characteristic: globally distributed supply networks that are very complex. Process manufacturers these days face problems related to high fixed costs and a relatively inflexible manufacturing footprint. Discrete manufacturers typically face challenges of high demand volatility and short product lifecycles. Therefore, all types of industrial manufacturers are feeling the heat, and it's imperative that they undergo a transformation to meet the requirements of the digital age.

The existing supply chain planning process of large industrial manufacturers is characterized by rudimentary applications that involve a lot of manual data entry, and number crunching is an even more tedious task. There is a lack of transparency, which leads to inconsistencies in data analysis and reporting. As a result, planners and users waste a lot of time trying to organize data into a format that is readable and actionable. This leads to increasing costs.

 

Moving on from the earlier disjointed planning systems, in which Demand Planning, Supply Planning, Inventory Planning and Sales & Operations Planning operated separately, the need of the hour is to have all the planning systems on a single platform. The day in the life of a planner currently involves analyzing loads of data, modeling multiple scenarios and then acting on recommendations based on that analysis. This is a time-consuming process and needs to run on a single engine to assist the planner and make him more responsive. He should be able to simulate and run analytical models with a single click so that decisions can be made efficiently and quickly.

 



Infosys has worked with several industrial manufacturers over the years, and many projects are still under way. Infosys consultants have years of experience across multiple client engagements. Infosys has the foundation in place and is ready to help clients navigate to their next. For industrial manufacturers, the next step is an integrated suite of products with smart dashboards and intuitive UIs that help planners perform their day-to-day jobs efficiently, with minimal hassle of navigating across multiple ERP screens and applications.


 

Modern Best Practice

Outsourced, In-House Manufacturing, Make-To-Order, Configure-To-Order and Make-To-Stock




The Infosys Industrial Manufacturing Solution is the first of its kind, fully integrated and configurable on Cloud, optimizing the day-to-day tasks and processes of a planner. It assists the planner in accurately forecasting demand and scheduling production plans effectively. The Demand & Supply Planning solution is fully configurable for Make-To-Stock, Configure-To-Order, Make-To-Order and OSP-related workflows, and it allows planners to quickly adapt and modify their production plans and sourcing details according to changing demand patterns.

The in-built smart dashboards and analytical models help the planner identify process-related bottlenecks, eliminate waste and optimize production, operating on the lean supply chain planning principle of minimizing waste. Supply Chain Planning on Cloud enables the planner to plan efficiently, simulate multiple scenarios with a single click and make decisions at a much faster pace. Constraint-based and capacity-based planning can be easily modeled with this solution, making other rudimentary applications redundant.

For an industrial manufacturer, the complex one-to-many bills of materials currently configured across multiple warehouses can be easily modeled in Supply Chain Planning Cloud as a supply network, wherein multiple 'Make At', 'Buy From' and 'Transfer From' sourcing rules can be configured and assigned to the supply plan via an assignment set. Configuration of the supply network and supply plan is thus also an easy task with the step-by-step task pane available. Demand Fulfillment dashboards, Demands-At-Risk and other custom exception messages can be easily configured in this Cloud solution, based on which the planner can make decisions faster and more efficiently.

Supply Planning can help you simulate multiple business scenarios, be it capacity constraints, demand volatility or even shorter product lifecycles, by making the plan flexible and agile.


Kanban planning, min-max based planning and service parts planning are already planned for future releases of Oracle Cloud, which will further extend the flexibility of Planning Central Cloud to other process and discrete manufacturers.




October 18, 2018

Oracle Master Data Management - OCH

Introduction

Trending technologies and methodologies have led to tremendous growth in the IT landscape, increasing the agility of processes. In the course of adapting to these technologies, customer data integration and management becomes a key aspect of maintenance. The optimum solution for data integration and management is a Master Data Management (MDM) tool. There are currently numerous MDM tools available in the market, and most of them are in use, depending on the compatibility of the applications being integrated.

MDM Overview

Oracle is one of the leading enterprises providing master data management capabilities.

Oracle Master Data Management drives results by empowering enterprises and their users with the ability to cleanse, govern and manage the quality and lifecycle of master data. Oracle's MDM is a comprehensive platform designed to reduce time to market, reduce the cost of compliance and improve the customer experience by optimizing marketing efforts, providing scalable ways to align with the latest technologies in the market, and delivering the answers customers need.

Oracle Customer Hub (OCH), also known as Siebel Universal Customer Master (Siebel UCM), is one of Oracle's leading Master Data Management (MDM) solutions. OCH's packaged functionality at the enterprise level enables management of customer data over the complete customer business cycle, i.e., a 360-degree view of the customer: capturing data; standardizing and correcting names and addresses; identifying and merging duplicate records; enriching the customer profile; ensuring governance of compliance and risk policies; and maintaining a "single source of truth" so that the best version of the customer profile is available to consumers.

For platforms other than Siebel, the respective MDM utilities/tools can be used depending on client requirements.

Benefits of MDM

The diagrammatic representation below explains the industry challenges along with MDM's relevance in each area.

 

mdm.jpg

 

 

Salient Features and Approach

As mentioned earlier, Siebel UCM or Oracle Customer Hub (OCH) is the leading Oracle MDM tool proposed, and it provides the salient features below:

  • Fundamental Master Data Management capabilities - consolidation, quality, sharing and trusted data - which play a vital role in overcoming current industry challenges.
  • Compatible with establishing reusable frameworks for optimum and fast delivery of Siebel UCM/OCH services and batch loads.
  • Out-of-the-box (OOTB) Siebel features for real-time services, such as data decay, merge, web services and business services, can be enabled and utilized.
  • Provides batch load features such as IFB, EIM and staging tables to be utilized for batch integration.
  • Data quality features such as cleansing, standardization, matching, survivorship and cross-referencing can be maintained with the help of the Oracle product OEDQ integrated with the OCH component.

Solution Integration Overview

To ensure smooth, fast and optimally scalable delivery, it is important to define appropriate frameworks during the initial setup so that the respective interfaces can be extended to on-going consumers/subscribers of the OCH/UCM application.

The framework can be categorized into Real Time service, Batch Load, Data Quality.

(These frameworks can be established in other MDM tools also, as per the tool compatibility)

 

 

mdm1.jpg

 

Consumers/source systems interact with the UCM/OCH application either through Real Time Services (RTS) or Batch Integration (BI) via the respective middleware(s).

 

Real Time Services (RTS)

In the current world, every customer expects features to be readily available, which is made viable by real-time services. The respective consumer can interact with OCH through middleware (e.g., an Enterprise Service Bus (ESB) or Enterprise Application Integration (EAI)).

The incoming request (inbound service) from the consumer is processed by the MDM tool, which then responds to the consumer with the processed data via an outbound service, completing the transaction.

 

Batch Integration (BI)

A source system may have bulk loads into the MDM component, which are achieved through the batch integration process via the respective middleware (for example, an Enterprise Integration Gateway (EIG)).

The features supported in batch load are:

·         If any data fails to be created or updated via RTS, it can be reprocessed in batch, making batch more of a backup option

·         Bulk data loads are also handled under the batch flow

·         Source systems can retrieve information via data mode
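The first bullet, batch as a backup for failed real-time calls, can be sketched as below. The helper names (`send_rts`, `batch_queue`, `integrate`) are hypothetical, for illustration only.

```python
# Illustrative sketch of the RTS-with-batch-fallback pattern: records
# that fail real-time processing are queued for a later batch run.
# send_rts, batch_queue, and integrate are hypothetical names.

def send_rts(record: dict) -> bool:
    """Simulated real-time call; fails for records missing a name."""
    return bool(record.get("name"))

batch_queue = []

def integrate(records):
    """Try RTS first; queue failures for the batch load (backup option)."""
    sent = 0
    for rec in records:
        if send_rts(rec):
            sent += 1
        else:
            batch_queue.append(rec)  # picked up by the next batch run
    return sent

sent = integrate([
    {"external_id": "A1", "name": "Acme"},
    {"external_id": "B2"},               # no name -> falls back to batch
])
print(sent, len(batch_queue))  # 1 1
```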

 

Oracle Enterprise Data Quality (OEDQ)

Oracle Enterprise Data Quality (OEDQ) is the primary utility for the Oracle data quality management process. It consists of several modules for customer data cleansing and matching. OEDQ capabilities are required to control, correct, and maintain the quality of customer data originating from multiple systems. These capabilities cover functions such as customer search, customer de-duplication, and customer matching, along with data standardization and data profiling.

OEDQ is integrated with the Oracle MDM tool (OCH) using Siebel data quality universal connectors. Since OEDQ is a third-party utility, it can also be integrated with other MDM tools, depending on the prerequisites and the compatibility of the system requirements.
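To make cleansing and de-duplication concrete, here is a minimal sketch of the kind of processing OEDQ performs. This is not the OEDQ API; `standardize` and `find_duplicates` are hypothetical helpers, and real matching uses far richer fuzzy rules than this exact-key match.

```python
# Minimal cleansing + matching sketch (NOT the OEDQ API): standardize
# names, then flag records whose standardized keys collide.
import re

def standardize(name: str) -> str:
    """Cleanse a name: collapse whitespace, trim, upper-case."""
    return re.sub(r"\s+", " ", name).strip().upper()

def find_duplicates(records):
    """Match records on the standardized name (exact-key match)."""
    seen, dupes = {}, []
    for rec in records:
        key = standardize(rec["name"])
        if key in seen:
            dupes.append((seen[key], rec["id"]))  # (survivor, duplicate)
        else:
            seen[key] = rec["id"]
    return dupes

dupes = find_duplicates([
    {"id": 1, "name": "Acme  Corp "},
    {"id": 2, "name": "acme corp"},
    {"id": 3, "name": "Globex"},
])
print(dupes)  # [(1, 2)]
```

Survivorship, in this toy version, is simply "first record wins"; a real OEDQ configuration would apply attribute-level survivorship rules instead.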

 

October 16, 2018

Recurring Journal in Fusion GL Cloud

 

Recurring Journals: Overview


 

Define recurring journal formulas for transactions that you repeat every accounting period, such as accruals, depreciation charges, and allocations. Your formulas can be simple or complex. Each formula can use:

  • Fixed amounts and account balances, including standard, actual amounts, statistics, and period-to-date or year-to-date balances.
  • Amounts from the current period, prior period, or same period last year.
  • Amounts in your formulas, including total balances, entered currency balances, or statistical balances.

You can quickly create recurring formulas by copying and modifying existing formulas. You can:

  • Define single ledger or multiple ledger recurring journal formula batches.
  • Create a recurring journal formula batch that contains recurring journal entries for different ledgers.
  • Define recurring journal formulas for your ledger currencies, entered currencies, and statistical currency.


 


Recurring Journal Types:

You can use the following types of recurring journal entries to reduce data entry time and increase accuracy for journal entries that repeat each period.



  • Skeleton: With no amounts

  • Standard: With set amounts

  • Formula: With variable amounts


Defining Recurring Journals


This example shows how to define and generate formula recurring journals that are automatically generated every period.


You must have:

  • A role that can access the Journals work area in Oracle General Ledger.
  • A duty that can access the Create Allocation Rules task.

The following assumptions apply:

  • The chart of accounts includes segments for company, department, account, subaccount, and product.
  • Account 1210 is the trade receivables account.
  • The period-to-date activity in account 1210 is 100,000 USD.

Here are the goals for this example.

  • Create a journal that populates a monthly allowance for bad debt based on 5% period-to-date activity in the trade receivables account.
  • Account 7730 is the bad debt expense account and account 1260 is the allowance for bad debt account.
  • A formula rule must be defined to generate the following journal entry for the period Apr-17 and thereafter.
    • DR 01.000.7730.0000.000 5,000 USD
    • CR 01.000.1260.0000.000 5,000 USD
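The arithmetic behind the example entry can be checked directly: the allowance is 5% of the 100,000 USD period-to-date activity in trade receivables, booked as a balanced debit/credit pair.

```python
# Check of the example's arithmetic: allowance for bad debt is 5% of
# the period-to-date activity in trade receivables (account 1210).

PTD_ACTIVITY_1210 = 100_000  # USD, from the example assumptions
RATE = 0.05                  # 5% allowance rate

allowance = PTD_ACTIVITY_1210 * RATE

journal = [
    ("DR", "01.000.7730.0000.000", allowance),  # bad debt expense
    ("CR", "01.000.1260.0000.000", allowance),  # allowance for bad debt
]
debits = sum(amt for side, _, amt in journal if side == "DR")
credits = sum(amt for side, _, amt in journal if side == "CR")
print(allowance, debits == credits)  # 5000.0 True
```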

This worked example includes details for the following tasks:

  1. Configuring the Formula Rule
  2. Creating the Runtime Prompt Variable
  3. Creating the Rule Set
  4. Generating the Allocation Journal


Configuring the Formula Rule


Create a formula rule.


  1. Navigate to the Journals work area.
  2. From the Tasks pane, click Create Allocation Rules.
  3. Navigate to the Administer menu option and then select Calculation Manager. Calculation Manager opens in a new browser window and a cube is highlighted based on the data access set selected in the Journals work area.
  4. Expand Essbase.
  5. Expand VF_USA_Accounting_Flexfield (your cube).
  6. Expand db.
  7. Highlight the Rules row, right-click, and select New from the menu.
  8. Enter the Rule Name: Special Bad Debt Allocation, accept the other defaults, and click OK.
  9. The Rule Designer opens in a new tab. Under New Objects, click, hold, and drag the Point of View object. Place it between the Begin and End nodes in the Rule Designer.
  10. Enter a Caption: Point of View.
  11. Perform the following steps to enter a Variable Dimension Value:
    • Click the Value field for Accounting Period.
    • Click the Actions icon and select Variable from the drop-down list. A new window opens.
    • Under Category, select Database from the drop-down list.
    • Click Accounting_Period.
    • Click OK.

  12. Perform the following steps to enter other member dimension values:
    • Click the Value field for another dimension.
    • Click the Actions icon and select Member from the drop-down list.
    • Select a member and click the Add icon to move the member to the Selections panel.
    • Click OK. Repeat for all dimensions to include in the Point of View.
    • In this scenario, the following are fixed dimension values:
      • Company: 01
      • Department: 000
      • Subaccount: 0000
      • Product: 000
      • Currency: USD
      • Currency Type: Total

    • Under New Objects, click, hold, and drag the Formula component and place it between the Point of View nodes in the Rule Designer.
    • Enter a Caption: Bad Debts Calculation.
    • Enter the offset member.
    • Click Next.


    In this scenario, the offset is defined as account 1260, the allowance for bad debt. The offset is child combination 01.000.1260.0000.000 when combined with the fixed member dimension values in the Point of View.


    Perform the following steps to enter the Formula member dimension value. In this scenario, the formula member dimension value is defined as account 7730. The bad debt expense is charged to child combination 01.000.7730.0000.000 and combined with the fixed member dimension values in the Point of View.

      • Click the icon for the formula field and select Member from the drop-down list.
      • Select the Account dimension value, highlight the row, and click the Select icon to move the value to the Selections panel.


      In this scenario, the goal is to calculate an allowance for bad debt based on the period-to-date activity in trade receivables account 1210. Trade receivable is child combination 01.000.1210.0000.000 when combined with the fixed member dimension values in the Point of View.

        • Repeat for the other formula member values and click OK when all formula members are selected.

      In this scenario, the following dimension values are selected. Selection of members for the following dimensions is required for the source in a formula component.

          • Scenario: Actual
          • Balance Amount: Period Activity
          • Amount Type: PTD




        • Multiply the formula expression by .05.

        • Click the Save icon.

        • Click the Validate and Deploy icon.

      Creating the Runtime Prompt Variable


      Create a runtime prompt variable as an optional component of a rule. When you generate an allocation based on a rule with a defined runtime prompt variable, you are prompted to specify a dimension member for the variable. The variable is used in the allocation calculation.


      For example, use a runtime prompt variable of Accounting Period, which prompts you to specify the period to use in the allocation calculation. A runtime prompt variable can be created once and used in multiple rules.


      1. Navigate to the Journals work area.
      2. From the Tasks pane, click Create Allocation Rules.
      3. Once the Calculation Manager opens in a new browser window, a cube is highlighted based on the data access set selected in Journals work area. To define the runtime prompt, select Variables under the Tools menu.
      4. Expand to the db under the cube, highlight the row, right-click the row, and select New from the menu.
      5. The Variable Designer opens in a new tab. A default value must be entered and the variable name can't contain any spaces.
      6. Complete the fields as follows:

        • Name: AccountingPeriod
        • Type: Member
        • Dimension: AccountingPeriod
        • Default Value: Apr-17
        • RTP: Selected
        • RTP Text: Enter Accounting Period

      7. Click the Save icon. The runtime prompt variable is ready for use.

      Creating the Rule Set


      Rule sets are created by combining two or more related rules to enable sequential allocation of balances.


      1. Navigate to the Journals work area.
      2. From the Tasks pane, click Create Allocation Rules.
      3. After the Calculation Manager opens in a new browser window, expand to Rule Sets under the highlighted cube. Highlight the row, right-click the row, and select New from the menu.
      4. Enter the rule set name and click OK.
      5. The Ruleset Designer opens in a new tab. Expand to the db under the cube for which the rule set is created, expand the rules, and drag the rules under the rule set.
      6. Click the row for the rule set, click the Variables tab, and select Merge Variables.
      7. Merge variables means that common variables among all of the rules in the rule set are merged. You only have to select the runtime prompt value once when submitting the Generate General Ledger Allocations process.

      8. Click the Save icon.
      9. Click the Validate and Deploy icon.

      Generating the Allocation Journal


      Start the allocation process to create the journal entries that populate the account balances.


      1. Navigate to the Journals work area.
      2. From the Tasks pane, click Generate General Ledger Allocations.
      3. Select a rule or rule set and enter any runtime prompt values.
      4. Click Submit.
      5. The Generate General Ledger Allocations task submits four processes consecutively (three, if the Post Allocations option isn't selected). The processes calculate the allocation, write the results to the GL_INTERFACE table, import the journal batches, and post the journal batches to the General Ledger.


      Scheduling Recurring Journals: Examples


       


      You can create processing schedules for recurring journal entries that have been defined in the Calculation Manager. Scheduling automates the monthly generation of the entries and speeds up the close process.


      You can define multiple schedules for each calendar in General Ledger. These schedules can increment by accounting period, based on any calendar defined. Schedules are shared across ledgers.


      Scenario


      In this example, you have created a reserve for bad debt recurring journal entry in the Calculation Manager. Now, add a recurring schedule to the entry to generate the entry once a month on the last day.


      1. From the Journals work area, select the Generate General Ledger Allocations task. The Generate General Ledger Allocations page opens.
      2. Select the rule: Reserve for Bad Debt.
      3. Specify Accounting Period: Blank
      4. Note: The Accounting Period field appears if you use a runtime prompt in your rule and select Accounting Period as the runtime variable.

      5. Select the Post Allocations option.
      6. Click Advanced.
      7. Click the Schedule tab.
      8. Select the Using a schedule option.
      9. Select a frequency of Monthly.
      10. Select Repeat: By date.
      11. Enter start and end dates.
      12. Click Submit.
      13. The following figure shows the Schedule tab on the Generate General Ledger Allocations process page. The options on the page are set to the values as directed in steps 7 through 10.

      14. The generation process runs at the scheduled time.

Oracle GL Account Reconciliation Cloud


Reconciliations ensure that a company's financial accounts are validated by checking to see if the balance in the account is correct. Oracle Account Reconciliation Cloud Service makes this process simpler and faster for companies by automating the process and helping users involved in the process collaborate effectively.



Because account balances are valid at a point in time, and business conditions change, it is critical that reconciliations occur. In addition, companies face stiff penalties for failing to reconcile.


Oracle Account Reconciliation Cloud consists of two modules: Reconciliation Compliance and Transaction Matching.


Reconciliation Compliance


Reconciliation Compliance helps you manage account reconciliation processes, including balance sheet reconciliations, consolidation system reconciliations, and other reconciliation processes in effect.


Reconciliations can be performed at whatever level makes sense for the business. For example, you could perform some reconciliation by business unit or company code, while performing other reconciliations at the department level. An administrator can create mapping rules to assign the account balances to the reconciliations, and when balances are imported, ensure they appear in the correct reconciliation based on these rules.
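The mapping idea can be sketched as a small rule table: each imported balance is routed to its reconciliation by the first matching rule. The rule format and the `assign_reconciliation` helper are hypothetical, not the actual Oracle Account Reconciliation Cloud configuration.

```python
# Illustrative sketch of mapping rules that assign imported account
# balances to reconciliations. Rule format is hypothetical; rules are
# ordered most-specific first, and the first match wins.

MAPPING_RULES = [
    ("1210", "Trade Receivables Reconciliation"),
    ("1260", "Bad Debt Allowance Reconciliation"),
    ("12",   "Other Receivables Reconciliation"),
]

def assign_reconciliation(account: str) -> str:
    """Route a balance to a reconciliation via first-match prefix rules."""
    for prefix, recon in MAPPING_RULES:
        if account.startswith(prefix):
            return recon
    return "Unmapped"

print(assign_reconciliation("1210"))  # Trade Receivables Reconciliation
print(assign_reconciliation("1250"))  # Other Receivables Reconciliation
```

Ordering the rules most-specific first is the design choice that lets a broad catch-all prefix like "12" coexist with exact account mappings.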


The administrator sets up the reconciliation lists that contain the balances to be reconciled, as well as account descriptions, instructions, due dates, and completed dates. Email notifications are sent, reminding other users that due dates are approaching, or that reconciliations can be acted upon.


Transaction Matching


Transaction Matching is an integrated module of Oracle Account Reconciliation Cloud and the perfect complement to the existing Reconciliation Compliance feature set.


With Transaction Matching, companies can automate performance of high volume/labor intensive reconciliations, and seamlessly integrate those results into the tracking features within Reconciliation Compliance.


With the addition of this powerful new module, companies save additional time on the performance of reconciliations, while improving quality and reducing risk.


October 14, 2018

eSIM and its impact on the Customer Experience

 

Apple recently announced that the iPhone XS will support electronic SIMs (eSIMs). The eSIM significantly alters the customer's experience of interacting with their device. Before I get into the details, let me explain what an eSIM is.

An eSIM is a small piece of hardware embedded within the phone that can hold and manage the profiles needed to connect to a mobile network. eSIMs have distinct advantages over physical SIMs:

  • An eSIM can be provisioned electronically by downloading a profile onto the user's device. Unlike with a physical SIM, the user won't have to go to a store, buy the SIM, insert it into their device, and wait for it to be activated. They can manage the whole process online.
  • A user can have multiple eSIM profiles and switch between them. While roaming, the user won't need to buy and swap physical SIMs; they can seamlessly switch profiles between their local and roaming providers.
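The multi-profile behavior above can be sketched as a tiny data model: the device stores several profiles and flips the active one with no physical swap. The `ESim` class and its methods are hypothetical, purely for illustration.

```python
# Illustrative sketch of multi-profile eSIM behavior: profiles are
# provisioned over the air and switched without a physical swap.
# The ESim class is hypothetical, for illustration only.

class ESim:
    def __init__(self):
        self.profiles = {}   # profile name -> carrier settings
        self.active = None

    def download_profile(self, name: str, carrier: str):
        """Electronic provisioning: add a profile over the air."""
        self.profiles[name] = {"carrier": carrier}
        if self.active is None:
            self.active = name  # first profile becomes active

    def switch(self, name: str):
        """Switch between stored profiles, e.g. local vs. roaming."""
        if name not in self.profiles:
            raise KeyError(f"profile {name!r} not provisioned")
        self.active = name

sim = ESim()
sim.download_profile("home", carrier="LocalTel")
sim.download_profile("travel", carrier="RoamCo")
sim.switch("travel")
print(sim.active)  # travel
```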

Continue reading " eSIM and its impact on the Customer Experience " »
