Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.

April 2, 2019

Uncover Dun & Bradstreet Data in Oracle Cloud with DaaS and CDM Cloud

A few weeks ago, we had to respond to a client requirement: updating the Dun & Bradstreet scores for their customers in Oracle Cloud Financials. It was not a straightforward requirement, as a lot of the functionality has changed in Oracle Cloud Financials. I came across the "DaaS" (Data as a Service) concept in Cloud, which allows a user to consume the entire historical data of a customer into their own ERP system.

Continue reading " Uncover Dun & Bradstreet Data in Oracle Cloud with DaaS and CDM Cloud " »

March 15, 2019

Oracle's Marketing Cloud Analytics Tool - Oracle Infinity

For any successful business, the customer is king. In recent times, however, it has become imperative to reinvent the customer experience, without which a business cannot survive, especially in sectors like Retail, Telecom, Hospitality, and Tourism. Research by American Express found that 60% of customers are willing to pay more for a better experience.

Optimal customer experience is achieved when a business remembers its customers and treats them with attention and respect throughout their unique customer journey. Businesses should leverage customer data to improve their customer experience strategy: the data should be diligently analyzed to understand customer interactions in real time, capture customer feedback, and predict appropriate actions that help the business yield higher growth and a better customer view. According to research cited by McKinsey, organizations that leverage customer behavior data to generate behavioral insights outperform peers by 85 percent in sales growth and more than 25 percent in gross margin.

Continue reading " Oracle's Marketing Cloud Analytics Tool - Oracle Infinity " »

February 4, 2019

From Assisted-Care to Self-Care

A few weeks ago, I read the book 'Delivering Happiness' by Tony Hsieh, the founder of Zappos. Zappos was a pioneer in retailing footwear online. It was acquired in 2009 by Amazon for ~USD 1.2 Billion.

Zappos had a manic focus on customer service. Every customer call went to a service rep, and the reps went to great lengths to help customers get what they wanted. There was even an instance of a rep helping a customer who called the Zappos number to order a pizza.

At the other extreme, there are organizations with highly complicated IVR menus that hide access to an agent behind multiple layers, presumably to save customer interaction costs. However, this significantly degrades the experience for the customer who really needs to reach an agent.

Most organizations would like to be in the middle of the customer care spectrum, so they could service the customer while meeting targets such as:

·         Reduced cost per customer care interaction

·         Scalability to handle higher volume of customer interactions

·         Reduced average handling time

·         Deflection of calls to digital channels (chatbots, virtual assistants, mobile apps, etc.)

·         First contact resolution


Continue reading " From Assisted-Care to Self-Care " »

January 25, 2019

To dynamically insert blank rows in RTF template


1)      Why do we require dynamic blank rows in an RTF template?
Some business requirements demand a fixed BIP report tabular layout irrespective of the number of rows displayed per page. For example, if a query returns 32 rows and the layout is fixed to show 10 rows per page, the report spans four pages. The layout should remain intact even though only 2 rows appear on the last page. The row filler concept can be used to fill the table in the RTF with blank lines dynamically.

2)      Steps to insert Blank rows in the template

·         Create a variable at the table header to select all the rows from the group element.

·         Write logic in the end-for-each tag to fix how many records should appear on each page.

·         Add an extra blank row in the table for the row filler functionality.

·         Write logic to find how many blank rows need to be inserted in the table, based on the number of rows in the group element.

·         Write logic to draw a straight line at the end of each page when data flows to the next page.


o   Create a variable at Table header to select all the rows from the Group element.

[Screenshot: variable creation]

In the above screenshot, we have created a variable "inner_group" to select all the records from the group element, using the formula below.

<xsl:variable xdofo:ctx="incontext" name="inner_group" select=".//G_1"/>

This formula gives us the total number of records that we need to display in the report.

o   Write logic in the end-for-each tag to fix how many records should appear on each page.

Based on the requirement, if we need to show only 10 lines on each page, the formula below fulfills the requirement (position() mod 10 matches every 10th row):

<?if:position() mod 10=0?><xsl:attribute name="break-before">page</xsl:attribute> <?end if?><?end for-each?>

[Screenshot: fixed rows to print per page]

o   Add an extra blank row in the table for the row filler functionality

In order to use the row filler functionality, we need to insert an extra blank row in the table, as below.

[Screenshot: row filler blank row]


o   Write logic to find how many blank rows need to be inserted in the table, based on the number of rows in the group element

Here comes the main part of the logic: finding how many blank rows to insert in the table, based on the total number of records available in the group element.

Considering the earlier scenario, we have 32 records in the group element and only 10 rows to be printed on each page. The formula below calculates how many blank records need to be printed on the last page.

<?for-each:xdoxslt:foreach_number($_XDOCTX,1,10 - (count($inner_group) mod 10),1)?>

[Screenshot: number of dynamic rows formula]


In the above formula, $inner_group holds the total records present in the group element. In the earlier scenario, only 2 of the 10 rows are printed on the last page, so the formula inserts the remaining 8 blank rows (10 - (32 mod 10) = 8).

o   Write logic to draw a straight line at the end of each page when data flows to the next page

If there is an additional requirement to turn cell borders on and off when data flows to the next page, the formula below can be used.

<?attribute@incontext:border-bottom;'0.01pt solid #000000'?>

Here 0.01pt is the thickness of the line; the larger the value, the thicker the line.

To achieve this, create a variable in the row filler section (the extra blank row) and use the above formula.

[Screenshot: cell border formula in the row filler]

The cell borders will appear in the report as below.

[Screenshot: report output with cell borders on]
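Putting the steps together, the row filler markup spread across the RTF form fields looks roughly like this. This is a sketch only: G_1 and the 10-rows-per-page limit come from the example above, while the position()=last() check for the border is an assumption about where the border variable is placed.

```xml
<!-- Table header cell: capture all rows of the data group G_1 -->
<xsl:variable xdofo:ctx="incontext" name="inner_group" select=".//G_1"/>

<!-- Data row: inside the group loop, break to a new page after every 10th row -->
<?for-each:G_1?>
    <!-- ... data cells ... -->
    <?if:position() mod 10=0?><xsl:attribute name="break-before">page</xsl:attribute><?end if?>
<?end for-each?>

<!-- Row filler row: pad the last page with blank rows up to the 10-row limit -->
<?for-each:xdoxslt:foreach_number($_XDOCTX,1,10 - (count($inner_group) mod 10),1)?>
    <!-- blank cells; draw the closing line only under the last filler row -->
    <?if:position()=last()?><?attribute@incontext:border-bottom;'0.01pt solid #000000'?><?end if?>
<?end for-each?>
```

Note one edge case: when count($inner_group) is an exact multiple of 10, the expression 10 - (count mod 10) yields 10, producing a full page of filler rows; guard the filler loop with an <?if?> if that matters for your layout.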




December 17, 2018

Cross Pollination of Solutions across Industry Verticals


During a discussion at work, a friend and I got thinking about how solutions created for one industry can be leveraged in other industry verticals. I would like to share some thoughts around the same in this blog.

The typical evolution of COTS applications is as follows:


Most mature COTS packages are past the first two stages of evolution and are looking at maximizing their investment by expanding the applicability of their solutions across industry verticals.  

Let us look at how Billing Solutions that were created for the Telecom industry are relevant in other industries.

Continue reading " Cross Pollination of Solutions across Industry Verticals " »

November 12, 2018

Incremental Loads on In Memory Tables

Preparing the Database for In-Memory Capability:


Changing Database Compatibility Level:


This is needed for parallel queries and auto-update of statistics in the database.
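The statement itself did not survive in the post; on SQL Server 2016 the standard form is below (the database name FACILITIES_FARM_DB is taken from the later sections):

```sql
-- Raise the database compatibility level to 130 (SQL Server 2016)
ALTER DATABASE FACILITIES_FARM_DB
SET COMPATIBILITY_LEVEL = 130;
```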





Enable the database option MEMORY_OPTIMIZED_ELEVATE_TO_SNAPSHOT to avoid the need for the WITH (SNAPSHOT) hint on ad hoc queries accessing memory-optimized tables.
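The corresponding statement is missing from the post; it would typically be:

```sql
-- Treat ad hoc access to memory-optimized tables as SNAPSHOT isolation
ALTER DATABASE FACILITIES_FARM_DB
SET MEMORY_OPTIMIZED_ELEVATE_TO_SNAPSHOT = ON;
```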






  • Only one filegroup can be created for memory-optimized tables in a database. All the table data and metadata are stored in this filegroup.
  • Now, I am creating a filegroup named FACILITIES_FARM_DB_mod using the statement below:




  • At least one container should be added to this filegroup. We now add a container to the FACILITIES_FARM_DB_mod filegroup using the statement below:



This adds the MEMORY_OPTIMIZED_DATA filegroup. It can also be done through the UI in SSMS.
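Both statements were lost in the post; the standard T-SQL form is below (the file path is an illustrative assumption):

```sql
-- Create the single memory-optimized filegroup for the database
ALTER DATABASE FACILITIES_FARM_DB
ADD FILEGROUP FACILITIES_FARM_DB_mod CONTAINS MEMORY_OPTIMIZED_DATA;

-- Add a container (a directory, not a regular file) to that filegroup
ALTER DATABASE FACILITIES_FARM_DB
ADD FILE (NAME = 'FACILITIES_FARM_DB_mod_1',
          FILENAME = 'C:\Data\FACILITIES_FARM_DB_mod_1')
TO FILEGROUP FACILITIES_FARM_DB_mod;
```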

Create a schema:
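The statement did not make it into the post; given the schema name used throughout the rest of the article, it is simply:

```sql
-- Schema that will hold the memory-optimized tables and the control table
CREATE SCHEMA INFY_SCHEMA;
```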




Creating a Linked Server:


To load the in-memory tables, I chose as my source another database in which the tables are present. This is a transactional (OLTP) database where records come in every second. All the records in these tables also capture an update_dt, which we will use to build the incremental strategy. To fetch data from the tables in this database, we need to link it to our FACILITIES_FARM_DB using the Linked Server setup in SSMS.

Linked Server can be added from Server Objects in SSMS.

Select the provider as SQL Server, give the linked server a name, and enter the server details of your source database.

Using these steps, I created a linked server named FACILITIES_MASTER_DB_SERVER which would connect me to FACILITIES_MASTER_DB and browse/refer the tables in the master db.
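The same setup can be scripted instead of using the SSMS dialog; the provider and data source below are illustrative of what the dialog fills in:

```sql
-- Create the linked server pointing at the source (master) instance
EXEC sp_addlinkedserver
     @server     = N'FACILITIES_MASTER_DB_SERVER',
     @srvproduct = N'',
     @provider   = N'SQLNCLI',
     @datasrc    = N'<source-server-name>';   -- replace with the actual source server
```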

Creating an In-Memory Table:


The points to ponder before we create memory optimized tables:

  • Changing the database compatibility level to 130 (Microsoft's suggestion for optimized tables)
  • Enabling the database option MEMORY_OPTIMIZED_ELEVATE_TO_SNAPSHOT (prevents locks when using ad-hoc queries)
  • Enabling MEMORY_OPTIMIZED option on the table
  • Set DURABILITY to SCHEMA_AND_DATA (persists both the schema and the data of the table to disk)

Let's create a new in-memory optimized table called INFY_PMU_ORDER in schema INFY_SCHEMA.

The CREATE TABLE script for a memory-optimized table is a regular script, but we need to add two options at the end. Also, a nonclustered primary key index is mandatory.



CREATE TABLE [INFY_SCHEMA].[INFY_PMU_ORDER] (

                [ROW_WID] [int] NOT NULL PRIMARY KEY NONCLUSTERED,  -- mandatory nonclustered PK; column name taken from the procedure below

                [INSERT_DT] [datetime] NULL,

                [UPDATE_DT] [datetime] NULL,

                [PROJECT_ID] [varchar](32) NULL,

                [ORDER_NUMBER] [varchar](4000) NULL,

                [TYPE] [varchar](100) NULL,

                [ORDER_TYPE] [varchar](30) NULL,

                [COMPLETION_DATE] [datetime] NULL

                ) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);




Creating a Control Table:


A control table is used to maintain and monitor the delta records. It typically holds the load start time, the load end time, and the status of the load (running/success). In the control table we use, there is one record per table per day with those fields plus the counts of updated and inserted records for each UPDATE and INSERT. The structure of our control table is as follows:


CREATE TABLE [INFY_SCHEMA].[CONTROL_TABLE] (

                [SNO] [int] NULL,

                [BATCH_ID] [int] NULL,

                [CURR_ETL_RUN_DATETIME] [datetime] NULL,

                [LAST_ETL_RUN_DATETIME] [datetime] NULL,

                [STATUS] [nvarchar](50) NULL,

                [TABLE_NAME] [nvarchar](50) NULL,

                [UPDATE_COUNT] [int] NULL,

                [INSERT_COUNT] [int] NULL,

                [SOURCE] [nvarchar](20) NULL,

                [LOAD_TYPE] [varchar](30) NULL

                );



We will update each table's control record with the insert and update counts for every incremental load, along with the batch id and sno.

Stored Procedure with Incremental Logic:


This stored procedure implements the incremental logic for all the in-memory tables. The source is the transactional (OLTP) database described earlier, so the linked server between it and FACILITIES_FARM_DB must already exist. Once the linked server is established, the incremental strategy is a set of INSERT and UPDATE statements. The steps for each table are:

1.     Insert the full load record in the control table

2.     Get the max start datetime from the control table into a variable say @UpdateDt

3.     Insert a default record into control table with status 'RUNNING'

4.     Update the existing records in respective table based on the value @UpdateDt

5.     Get the rows count using @@ROWCOUNT and update the UPDATE_COUNT field in control table

6.     Insert the new records in respective table based on the value @UpdateDt

7.     Get the rows count using @@ROWCOUNT and update the INSERT_COUNT field in control table

8.     Update the relevant record in control table with status 'SUCCESS'

The above steps are repeated as a block for each table you need, and thus the procedure is created.

Here is the script for this procedure:


CREATE PROCEDURE [dbo].[Sp_facilities_farm_incremental]
AS
BEGIN
SET NOCOUNT ON;

DECLARE @UpdateDt  DATETIME;
DECLARE @RowCount1 INT;
DECLARE @RowCount2 INT;

/************************ INFY_PMU_ORDER ***********************************/
-- Step 2: last successful run time drives the delta window
SET @UpdateDt = (SELECT Max(curr_etl_run_datetime)
FROM   infy_schema.control_table
WHERE  status = 'SUCCESS'
AND table_name = 'INFY_PMU_ORDER');

-- Step 3: insert the 'RUNNING' record for this run
-- (the last six values were lost in the original post and are reconstructed here)
INSERT INTO infy_schema.control_table
VALUES      ((SELECT Max(sno) + 1
FROM   infy_schema.control_table
WHERE  ( status = 'SUCCESS'
OR status = 'RUNNING' )),
(SELECT Max(batch_id) + 1
FROM   infy_schema.control_table
WHERE  table_name = 'INFY_PMU_ORDER'
AND ( status = 'SUCCESS'
OR status = 'RUNNING' )),
Getdate(),      -- CURR_ETL_RUN_DATETIME
@UpdateDt,      -- LAST_ETL_RUN_DATETIME
'RUNNING',
'INFY_PMU_ORDER',
0,
0,
'FACILITIES_MASTER_DB',
'INCREMENTAL');

-- Step 4: update rows changed in the source since the last successful run
-- (the source is reached through the linked server; the table path is assumed)
UPDATE tab1
SET    tab1.insert_dt = tab2.insert_dt,
tab1.update_dt = tab2.update_dt,
tab1.project_id = tab2.project_id,
tab1.order_number = tab2.order_number,
tab1.type = tab2.type,
tab1.order_type = tab2.order_type,
tab1.completion_date = tab2.completion_date
FROM   [INFY_SCHEMA].[infy_pmu_order] AS tab1
INNER JOIN (SELECT [row_wid], [insert_dt], [update_dt], [project_id],
[order_number], [type], [order_type], [completion_date]
FROM   [FACILITIES_MASTER_DB_SERVER].[FACILITIES_MASTER_DB].[dbo].[INFY_PMU_ORDER]
WHERE  update_dt >= @UpdateDt) tab2
ON tab1.row_wid = tab2.row_wid;

SELECT @RowCount1 = @@ROWCOUNT;

-- Step 5: record the update count
UPDATE infy_schema.control_table
SET    update_count = @RowCount1
WHERE  sno = (SELECT Max(sno)
FROM   infy_schema.control_table
WHERE  table_name = 'INFY_PMU_ORDER'
AND status = 'RUNNING');

-- Step 6: insert rows that are new in the source
INSERT INTO [INFY_SCHEMA].[infy_pmu_order]
SELECT [row_wid], [insert_dt], [update_dt], [project_id],
[order_number], [type], [order_type], [completion_date]
FROM   [FACILITIES_MASTER_DB_SERVER].[FACILITIES_MASTER_DB].[dbo].[INFY_PMU_ORDER]
WHERE  update_dt >= @UpdateDt
AND row_wid NOT IN (SELECT row_wid
FROM   [INFY_SCHEMA].[infy_pmu_order]);

SELECT @RowCount2 = @@ROWCOUNT;

-- Step 7: record the insert count
UPDATE infy_schema.control_table
SET    insert_count = @RowCount2
WHERE  sno = (SELECT Max(sno)
FROM   infy_schema.control_table
WHERE  table_name = 'INFY_PMU_ORDER'
AND status = 'RUNNING');

-- Step 8: mark the run as successful
UPDATE infy_schema.control_table
SET    status = 'SUCCESS'
WHERE  sno = (SELECT Max(sno)
FROM   infy_schema.control_table
WHERE  table_name = 'INFY_PMU_ORDER'
AND status = 'RUNNING');
END


To execute the procedure, go to:

FACILITIES_FARM_DB -> Programmability -> Stored Procedures -> Sp_facilities_farm_incremental (right click) -> Execute Stored Procedure

Scheduling a SQL Agent Job:


Now schedule the procedure created in the previous step using SQL Server Agent in SSMS. Give the job a name in the General tab, then set the timing and repeat properties in the Schedules tab as shown below:
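The same job can be scripted through msdb instead of the SSMS dialogs; the job name and the every-15-minutes schedule below are illustrative choices:

```sql
USE msdb;

-- Create the job and attach the incremental-load step
EXEC dbo.sp_add_job @job_name = N'FacilitiesFarmIncremental';
EXEC dbo.sp_add_jobstep
     @job_name      = N'FacilitiesFarmIncremental',
     @step_name     = N'Run incremental proc',
     @subsystem     = N'TSQL',
     @command       = N'EXEC dbo.Sp_facilities_farm_incremental;',
     @database_name = N'FACILITIES_FARM_DB';

-- Run every 15 minutes, every day
EXEC dbo.sp_add_schedule
     @schedule_name       = N'Every15Min',
     @freq_type           = 4,   -- daily
     @freq_interval       = 1,
     @freq_subday_type    = 4,   -- minutes
     @freq_subday_interval = 15;
EXEC dbo.sp_attach_schedule
     @job_name = N'FacilitiesFarmIncremental', @schedule_name = N'Every15Min';
EXEC dbo.sp_add_jobserver @job_name = N'FacilitiesFarmIncremental';
```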


Creating Indexes on Memory Optimized Tables:


The normal CREATE INDEX statement doesn't work on in-memory tables.

Instead, we need to use the ALTER TABLE ... ADD INDEX syntax:




ALTER TABLE [INFY_SCHEMA].[INFY_PMU_ORDER]
                ADD INDEX IX_INFY_PMU_ORDER_PROJECT_ID  -- index name illustrative
                HASH (PROJECT_ID) WITH (BUCKET_COUNT = 64);  -- Nonunique

Continue reading " Incremental Loads on In Memory Tables " »

November 9, 2018

Importance of CPQ for Industrial Manufacturing


Most industrial manufacturing organizations are dealing with constantly changing market dynamics and variables that make configuring accurate sales quotes a complex process. Existing pricing and quoting tools and processes across industries are disparate, stand-alone custom applications, with repeated manual data entry in spreadsheets and formula-based price estimates. The result is data duplication, inconsistent BOMs, and inaccurate estimates with no historical insights, reporting, or analytics. This repetitive, manual, and tedious work for Sales to capture product requirements, price them right, and present them to customers often takes weeks or months before converting to an order, resulting in lost opportunities and reduced sales efficiency.

A robust Configure-Price-Quote (CPQ) system is thus essential in today's dynamic and competitive markets: it provides faster and consistently accurate estimates to your customers and improves sales efficiency by automating the quoting process to capture and process all product configurations and pricing details in a single system.

The most important cogs in achieving a unified and harmonized CPQ process are:

  • Configurator: Industries today want to cater to specific business needs of their customers and thus need to provide their Sales with highly configurable products to generate unique sales BOM that can serve the purpose effectively. The configurator must be able to support complex catalog hierarchies and product selections in a guided flow bound by various rules to ensure accurate sales BOM is generated automatically. The configurator must also support logical bundling of products & associated components & services to allow different selling models to be readily adopted by Sales, distributors, resellers and partners.
  • Price Engine: A competitive and dynamic pricing strategy with pre-defined rules based on product attributes, service plans, price contracts or agreements etc. will improve the win probability and provide edge over market competition. It also ensures that discounts & margins are in control and go through systematic reviews and approvals to create accurate and winning quotes to end customers.
  • Quote Life-cycle management: A complete opportunity-quote-order (Quote to Cash) process to manage and track quotes or contracts all the way to order creation and fulfillment through ERP systems and is fully integrated end-to-end with an upstream CRM system for opportunity management. The system must be able to automatically generate quotes from pre-defined dynamic templates that can be tailored to present the quote information according to the currencies & languages preferences across different regions of the world.

Benefits of CPQ for Industrial Manufacturers

  • Improved Time to Market: Product catalogs and bundles are constantly changing to keep up with customer demands. CPQ enables you to logically organize, expand, and update product catalogs and complex configuration and pricing rules, ensuring your sales teams are always building accurate proposals and closing deals on time.
  • Multiple Sales Channels: CPQ creates a more valuable customer experience through guided selling and dynamic pricing, and also brings internal sales teams, partners, distributors, and resellers onto a common selling platform, improving margins by identifying more upsell and cross-sell opportunities.
  • Improved Sales Efficiency: CPQ equips sales teams with product knowledge and engineering expertise, eliminating rework and other costs associated with an inability to deliver as promised. It also eliminates the need to manually enter data into multiple systems, freeing sales reps to concentrate on selling products and services to customers.
  • Maximized Revenues with Improved Margins: CPQ ensures optimal pricing and allows you to better manage promotions and discounts, achieving maximum revenue and margins on every deal. It also provides an easier and faster buying experience for your customers, with shorter turnaround times and improved sales velocity.

In the digital era, where customer experience (CX) is a top priority for industrial manufacturing organizations, CPQ technology is a must-have to drive sales productivity and customer engagement while reducing costs, helping manufacturers edge ahead of the competition in disruptive markets.



Continue reading " Importance of CPQ for Industrial Manufacturing " »

October 26, 2018

Workforce management (WFM) -There is an app for it!!!

"There is an app for it" - more often than not, that is the response to a question.

As the workforce continues to evolve and transform, workforce management tactics and processes also need to embrace the coming trends. Mobility was the buzzword a few years back and is now being augmented by a new trend: employee experience (EX).

EX can be summarized as self-service with a personalized experience on a mobile device.

Questions that an organization needs to address to be able to create the ultimate EX app:

·         How do we empower our employees/managers to be more independent (so that they spend less time managing the business and more time growing it)?

·         How can we reduce the time spent on mundane daily tasks?

·         How can we help our managers plan better and spend more time with their teams than systems?

Some key tenets of an ultimate EX app:


Source: created by author

"Less is more"!!! It is not about making one great all-in-one product; it's about how seamless the transactions are across products.

If an organization can enhance the employee experience, it will create a more involved and productive workforce that helps the business achieve its goals.

A study from IBM and Globoforce found a connection between a positive employee experience and more engaged work performance and reduced attrition. Companies with engaged employees outperform those without by 202%; customer retention rates are 18% higher when employees are highly engaged; and organizations with more than 50% employee engagement retain more than 80% of their customers. Engagement is key to every company's success, and employee disengagement amounts to a loss of over $500 billion per year to the US economy.


How does one get started on the EX journey?

The primary goal of an employee experience app is to create supplementary value for its users, which is why the use cases being rolled out are of utmost importance.

Training is extremely important. While millennials will love to use a "cool" app, the entire workforce should be able to see value in adopting newer tools. Hence the message of 'simplification and acceleration of administrative tasks' becomes critical.

The app itself needs to be highly interactive, allowing all user transactions to flow seamlessly across systems.

The cherry on top for a successful experience is the app's customization capability, which allows users to define its look and feel.



Source: created by author

Let's look at a couple of examples that help illustrate this.

Example 1: Daily punches - no click solution!

Solution: no-click punches via a mobile app. The tech in the background would leverage geo-fencing + biometrics (voice/face recognition) + the employee's registered mobile phone number.

Detail: The employee walks into the office -> gets a push notification, "Hey, you have reached the office. Would you like to punch in?" -> the employee responds via voice, "Yes, please" -> the punch is authenticated and recorded.




Source: created by author

Example 2: Applying for time off - talk to the app!

Solution: voice-enabled time-off requests. The tech in the background would leverage a voice-enabled chatbot.

Detail: Employee opens the app and conversation would be as below:

Employee: Please apply Vacation for tomorrow

Bot: Would you like to apply full day or half day vacation?

Employee: Full day

Bot: Please confirm with a yes or no. Applying full day vacation for *insert date*

Employee: Yes

Bot: Vacation request successfully submitted




Source: created by author

This is the classic "less is more" example allowing the user to have a seamless transaction while the tech running in the background is a complex combination of algorithms.

In addition to considering the framework/technology, it's also important to think about what goes inside the app. A mobile solution requires a new approach to content. When thinking about the app, remember that it shouldn't be a dumping ground for old webpages.

Personalized Mobility is the future of the workforce, and an app will enable employers to connect to their employees in new and exciting ways.

As a result, they get an engaged and empowered workforce that spends less time managing the business and more time growing it, thereby increasing customer retention rates by ~18%.

"App" IT - now!!!

October 24, 2018

OAC Data Visualization (DVD/DVCS) New Features

Oracle Analytics Cloud (OAC) is maturing in the industry and gaining customer trust, so Oracle keeps adding new and innovative features to make the product more usable and more competitive in today's ever-changing market.

A number of features have been added in OAC DV versions 4 and 5 that provide a whole new set of functionalities, enabling OAC users to harness features and functionality like OBIEE. We discuss a few of these new and exciting features in the article below, with some live examples:

·         Drill via Existing Column in Same Visualization

·         Drill to another Canvas or URL

·         Use one analysis as a filter for whole canvas

·         Pattern Brushing

·         Synchronized Visualization

·         Explain Feature

Continue reading " OAC Data Visualization (DVD/DVCS) New Features " »

October 22, 2018

Supply Chain Planning on Oracle Cloud for Industrial Manufacturers


Digitalization is changing our world in profound ways. Product development cycles are shrinking and delivery models are dramatically evolving. Software-enabled innovations are creating new service-based business models that are replacing existing products and reordering industry dynamics seemingly overnight. Industrial manufacturers of all kinds share a common characteristic: globally distributed, highly complex supply networks. Process manufacturers these days face problems related to high fixed costs and a relatively inflexible manufacturing footprint, while discrete manufacturers typically face high demand volatility and short product lifecycles. All types of industrial manufacturers are therefore feeling the heat, and it's imperative that they undergo a transformation to meet the requirements of the digital age.

The existing supply chain planning processes of large industrial manufacturers are characterized by rudimentary applications that involve a lot of manual data entry, with number crunching an even more tedious task. The lack of transparency leads to inconsistencies in data analysis and reporting. As a result, planners and users waste much of their time trying to organize data into a format that is readable and actionable, which drives up costs.


From the earlier disjointed planning systems, in which Demand Planning, Supply Planning, Inventory Planning, and Sales & Operations Planning operated separately, the need of the hour is to have all the planning systems on a single platform. A day in the life of a planner currently involves analyzing loads of data, modeling multiple scenarios, and then acting on recommendations based on that analysis. This is a time-consuming process that needs to run on a single engine to assist planners and make them more responsive. They should be able to simulate and run analytical models with a single click so that decisions can be made faster and more efficiently.


Infosys has worked with several industrial manufacturers over the years, and many projects are currently under way. Infosys consultants have years of experience across multiple client engagements; the foundation is in place, and Infosys is ready to help clients navigate to their next. For industrial manufacturers, the next step is an integrated suite of products with smart dashboards and intuitive UIs that help planners do their day-to-day jobs efficiently, with minimal hassle of navigating across multiple ERP screens and applications.


[Figure: Modern Best Practice - covering Outsourced, In-House Manufacturing, Make-To-Order, Configure-To-Order and Make-To-Stock flows]

The Infosys Industrial Manufacturing Solution is the first of its kind: fully integrated and configurable on Cloud, it optimizes the day-to-day tasks and processes of a planner. It helps the planner forecast demand accurately and schedule production plans effectively. The Demand & Supply Planning solution is fully configurable for Make-To-Stock, Configure-To-Order, Make-To-Order, and OSP-related workflows, allowing planners to quickly adapt their production plans and sourcing details to changing demand patterns.

The built-in smart dashboards and analytical models help the planner identify process bottlenecks, eliminate waste, and optimize production, following the lean supply chain planning principle of minimizing waste. Supply Chain Planning on Cloud enables the planner to plan efficiently, simulate multiple scenarios with a single click, and make decisions at a much faster pace. Constraint-based and capacity-based planning can be easily modeled with this solution, making other rudimentary applications redundant.

For an industrial manufacturer, the complex one-to-many bills of materials currently configured across multiple warehouses can be modeled in Supply Chain Planning Cloud as a supply network, in which multiple 'Make At', 'Buy From', and 'Transfer From' sourcing rules are configured and assigned to the supply plan via an assignment set. Configuring the supply network and supply plan is also straightforward with the step-by-step task pane. Demand fulfillment dashboards, demands-at-risk, and other custom exception messages can be configured in this Cloud solution as well, helping the planner make decisions faster and more efficiently.

Supply Planning can help you simulate multiple business scenarios, be it capacity constraints, demand volatility, or even shorter product lifecycles, by keeping the plan flexible and agile.

Kanban planning, min-max planning, and service parts planning are already slated for future releases of Oracle Cloud, which will further extend the flexibility of Planning Central Cloud to other process and discrete manufacturers.
