Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.

Main

June 18, 2017

The perennial question - OAF or ADF?

OAF (Oracle Application Framework) or ADF (Application Development Framework)? This is one of the most common questions from customers when we discuss an extension or customization strategy. The answer is context-specific and depends on the solution that needs to be developed.

If we are talking about a solution that does not involve Oracle EBS (E-Business Suite), then the answer is probably ADF.

If the solution involves EBS, then we need to understand how much of the data involved comes from EBS, how much of the solution depends on EBS configuration such as a user's responsibility, profile settings, etc., and whether we want a UI other than what we get in EBS.

The table below compares OAF and ADF and lists the key considerations that should go into selecting the appropriate technology for our needs.

Consideration | Advantage OAF | Advantage ADF

Data involved | The solution is an extension of EBS, reading from EBS tables and inserting/updating them. | The solution reads or writes relatively little EBS data compared with its overall functionality.

Access and control | Needs to use the users and responsibilities from EBS to control functionality. | Not dependent on EBS user access and control; needs its own.

User interface (UI) | Needs the same UI experience as EBS. | Needs a different UI experience.

EBS Application Object Library (AOL) | The solution is heavily dependent on EBS AOL components such as profiles, value sets, lookups, concurrent programs, attachments, etc. | Makes little use of EBS AOL.

 

For example, if we want to develop a custom solution where a payables clerk keys in invoices, with validations against PO and customer data within EBS, and the invoices need to be created in Oracle Payables, then OAF would be appropriate.

If we want to develop a standalone custom solution where a payables clerk keys in invoices, with validations against PO and customer data NOT from EBS (or where perhaps only the customer data comes from EBS), and the invoices need to be created in a non-EBS application, then ADF would be appropriate.
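The considerations above lend themselves to a simple checklist. Here is a minimal, hypothetical scoring helper; the criteria names and the majority-vote threshold are my own illustration, not an Oracle convention:

```python
# Illustrative decision helper for OAF vs ADF based on the four
# considerations in the table above. The majority-vote threshold (3 of 4)
# is a hypothetical simplification for illustration only.

def recommend_framework(extends_ebs_data: bool,
                        uses_ebs_access_control: bool,
                        needs_ebs_ui: bool,
                        depends_on_ebs_aol: bool) -> str:
    """Return 'OAF' or 'ADF' by counting EBS-leaning criteria."""
    oaf_votes = sum([extends_ebs_data, uses_ebs_access_control,
                     needs_ebs_ui, depends_on_ebs_aol])
    return "OAF" if oaf_votes >= 3 else "ADF"

# Payables-clerk example from the text: data, access, UI and AOL all in EBS.
print(recommend_framework(True, True, True, True))      # -> OAF
# Standalone invoice app with little or no dependency on EBS.
print(recommend_framework(False, False, False, False))  # -> ADF
```

In practice the decision is rarely a clean vote; the helper only makes the point that a solution leaning on EBS data, access control, UI, and AOL together belongs in OAF.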

There is a perception that all new solutions in EBS need to be built using ADF because this will make Cloud migration easier down the line. This is simply not true. If the solution is considered part of EBS, then it has to be OAF and not ADF.

Architects from the Oracle Forms era will probably understand this analogy best: using ADF is like developing a standalone application with Oracle Developer Suite and core Forms, while using OAF is like starting from TEMPLATE.fmb and developing to EBS standards. Both have powerful capabilities; the key is to understand when to use which!

June 4, 2017

Oracle GRC - Overview

In most organizations, Oracle ERP implementations bring many security, auditing, and compliance requirements, where you would want to enforce governance, minimize risk, and be compliant with company policies. GRC is the Oracle module that caters to all such needs.

GRC is a solution that manages business processes for greater efficiency, controls user access to reduce risk, and tracks data changes to increase financial integrity.

GRC has 4 products:

Application Access Controls Governor (AACG) - Helps implement segregation of duties within an organization. For example, a user who can create a supplier should not also be able to pay that supplier. This module helps identify responsibilities and access where a user holds both privileges.

Transaction Controls Governor (TCG) - Helps prevent fraudulent business transactions. For example, if you need to identify a user who has both created a supplier and made a payment to that supplier, TCG can help detect such fraudulent activity.

Configuration Controls Governor (CCG) - Maintains an audit trail of configuration changes. For example, if someone changes a billing address or bank account details, CCG detects such setup changes.

Preventive Controls Governor (PCG) - Enforces rules to prevent unauthorized actions or business transactions.
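The segregation-of-duties check that AACG performs can be illustrated with a small sketch. The user names, privilege names, and conflict list below are all invented for illustration; AACG itself works against EBS responsibilities and functions:

```python
# Hypothetical sketch of a segregation-of-duties (SoD) check in the spirit
# of AACG: flag users whose combined privileges violate a conflict rule.

user_privileges = {
    "alice": {"create_supplier", "enter_invoice"},
    "bob":   {"create_supplier", "pay_supplier"},   # conflicting pair
}
conflict_rules = [("create_supplier", "pay_supplier")]

def sod_violations(user_privileges, conflict_rules):
    """Return (user, privilege_a, privilege_b) for every violated rule."""
    violations = []
    for user, privs in user_privileges.items():
        for a, b in conflict_rules:
            if a in privs and b in privs:
                violations.append((user, a, b))
    return violations

print(sod_violations(user_privileges, conflict_rules))
# -> [('bob', 'create_supplier', 'pay_supplier')]
```

The same shape of check, applied to TCG, would run over completed transactions rather than granted access.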


In this article, I will describe the features and capabilities of the Preventive Controls Governor (PCG) module, which is integrated within Oracle EBS.

PCG is a set of applications that run within Oracle E-Business suite as a component of the Governance, Risk, and Compliance Controls Suite (GRC).

 

PCG has several modules:

 

Form Rules: Extend Oracle EBS forms without modifying the seeded code; this does not require much development expertise. Key things that can be performed using form rules:

·         Prevent changes to designated fields

·         Restrict access to LOVs and blocks/fields based on operating unit, responsibility, username, etc.

·         Make fields mandatory, hidden, etc.

·         Show pop-up messages, error messages, etc.

·         Provide options to write SQL queries or call DB procedures and packages

 

Flow Rules: When there is a need to automate a business process, perform a sequence of activities within a process, or enforce an approval process for business transactions/setups, flow rules can be used. There are two steps to define a flow rule.

 

Launch Criteria: A trigger or an event-based subscription are the two options that can be created to initiate the rule.

Process Flow: The process flow is defined according to the requirement. It has several options:

·         Provides SQL rules to perform DML operations, call Oracle APIs, etc.

·         Provides notification rules to perform approval or FYI notifications leveraging GRC workflows

·         Provides options to call concurrent programs or business events

 

Multiple rules can be used together, e.g., a SQL rule, followed by an approval rule, followed by a notification rule. They can be defined sequentially within a process to perform the steps. Dependencies can be created between the steps; for example, an approval rule has approve and reject options, and another process or step can be built based on the approval/rejection.
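The sequential rules and dependencies described above can be sketched as a toy flow engine. The step names and the mini-engine itself are hypothetical; in PCG these flows are configured in EBS, not coded:

```python
# Toy sketch of a PCG-style flow rule chain: a SQL step, an approval step,
# and notification steps that depend on the approval outcome.

def run_flow(steps, context):
    """Run (name, action, condition) steps in order; skip when the
    condition (dependency on an earlier step's outcome) is not met."""
    for name, action, condition in steps:
        if condition is not None and not condition(context):
            continue  # dependency not satisfied; skip this step
        context = action(context)
    return context

steps = [
    ("sql_rule",      lambda c: {**c, "validated": True}, None),
    ("approval_rule", lambda c: {**c, "approved": c["amount"] < 1000}, None),
    ("notify_ok",     lambda c: {**c, "notice": "approved"},
                      lambda c: c["approved"]),
    ("notify_reject", lambda c: {**c, "notice": "rejected"},
                      lambda c: not c["approved"]),
]

print(run_flow(steps, {"amount": 500})["notice"])   # -> approved
print(run_flow(steps, {"amount": 5000})["notice"])  # -> rejected
```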

 

In scenarios where there is no out-of-the-box Oracle option such as Oracle AME, flow rules can be an option to explore that does not require much coding.

 

Audit Rules: Enable an audit trail on Oracle tables. Audit changes to designated fields, monitor them, and send email notifications based on triggers or a pre-defined schedule.

Change Control: Combines the functionality of Form, Flow, and Audit rules.

April 30, 2017

Reverse Logistics For A Forward Thrust To Sustainment Quotient

One of the key technology focus areas in Green Supply Chain Management that enables an organization to transition into a sustainable organization is the supply chain network. Logistics optimization goes hand in hand with this. It is true beyond doubt that a responsive supply chain is also a responsible supply chain: more environmentally and socially responsible. Not only is it plausible, it is also more financially viable, resulting in a higher sustainment quotient as well as a higher benefit factor. Logistics optimization and supply chain network design can reduce empty and circuitous miles and increase warehouse capacity utilization. Organizations need to look at process and operational best practices to improve their sustainment quotient. The higher the quotient, the greener the supply chain, and hence the greener the dividend pastures, as mentioned earlier.


April 19, 2017

Mergers and Acquisitions - Oracle SaaS to the rescue!

Mergers and Acquisitions (M&As) are time-tested strategies adopted by companies small and large to strengthen their market position and beat competition. In this process, parents usually acquire or merge with entities that are smaller, larger, or of comparable size. Companies that get acquired may operate in niche areas, possess rare skills, and be winners in their business arena. For the consolidation to be successful and yield results, business objectives and strategies need to align and complement each other, corporate and work cultures need to harmonize, and business process synergies need to be realized.


March 29, 2017

Cross Docking - An Enabler to Quicken Turn-Around in QSR Industry

QSR Industry and the concept of quick turn around

The success of the QSR industry (also known as the fast food industry) is driven by the timeliness with which products are delivered, without compromising on quality. The name itself, quick service restaurant, suggests that delivery in this industry needs to be quick and the lead time minimal. Organizations need to be ready to fulfill orders at short notice, sometimes within a few hours. Given the dynamics and competition in this sector, along with the strict rules and regulations for food products, organizations face ever-increasing pressure to quicken delivery without any compromise in quality. Long-term sustainment of growth and success is only possible if these criteria are not neglected. Some very familiar QSR names, like McDonald's and Starbucks, have diligently followed these rules to become what they are today.

 

Nature of order placing in QSR industry

Most of the orders placed with warehouses in the QSR sector comprise food products and preparations that are highly perishable in nature: items like burgers, pastries, and sandwiches, which cannot be prepared and stored in the warehouse like other packaged items. These items are made to order and arrive in the warehouse only a few hours before the actual shipment needs to leave to fulfill the orders. Most QSR players have company-owned stores, so the nature of orders is very similar across stores. Ideally, the stores place their orders with the warehouse a day before the actual delivery.

The nature of the items is such that the warehouse does not stock them on the premises beforehand; the order is passed to the manufacturing unit or supplier only after summing the quantities of each item requested by all stores.

There are certain scenarios, especially during festive or holiday seasons when the stores place a lot of emergency orders with the warehouse where the lead time is only a few hours. These are the times when the warehouse has to ensure that there is quick turnaround of the order and needs to facilitate quick supply of the items from the vendor and then quick delivery of the same to the store.

 

Pain Points: How to Reduce Turnaround

Given the nature of the QSR business, efficiency and productivity depend on how quickly orders to the stores are turned around. However much the regular picking process is expedited, a fair bit of delay is expected in picking the items from the receiving area after they arrive and bringing them to the shipping area. This can cause the delivery deadline to slip, and business can suffer. There are also high labor and transportation costs that distribution centers want to reduce, especially for these items, which are not stored but sent directly out for shipping.

 

Recommendation: Cross Docking

An optimum solution for this type of situation can be cross docking. Cross docking means skipping receiving and temporary storage and unloading goods directly in the cross-dock area, from where they are loaded onto the delivery trucks. This is a good opportunity for QSR warehouses, as the items themselves don't require storage and can be sent directly to the cross-dock area. It also enables a quick turnaround of orders from the supplier to the actual store and increases supply chain velocity.


For efficient cross docking, there are a few important points which need a special mention:

·         Physical layout of the warehouse: Cross docking will not make sense if the distance between the receiving and shipping areas is very large. In most warehouses, the layout places receiving and shipping at two ends of the facility, but that suits industries where storage is a major warehouse activity. In the QSR sector, where storage is minimal, it is ideal to have receiving and shipping in close proximity to facilitate easy cross docking.

·         Carrier routing information: As mentioned earlier, the nature of orders from various stores in the QSR industry is very similar to each other. More often than not, the shipments are clubbed based on the carrier's route for the various stores. If this information is available much ahead of time, the items can be bulk picked from the cross dock area and put into the respective trucks as per the routing schedule.

·         Task management can serve as an effective enabler to avoid stock out situations and facilitate cross docking. Task management when linked to employee scheduling helps optimize staff requirements, based on sales history and other factors. This could even be linked to overall employee productivity and a number of reports could be made available to determine it. RF devices when paired with a WMS could be used effectively to avoid inventory shortfall situations for a retailer. What is needed is an inventory source of record, wireless infrastructure and a WMS with a Task Management engine.
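The carrier-route grouping mentioned above, where shipments are clubbed by route so cross-docked items can be bulk-picked per truck, can be sketched as follows. Store, route, and item names are made up for illustration:

```python
# Sketch of grouping store orders by carrier route so that cross-docked
# items can be bulk-picked per truck. All data values are invented.
from collections import defaultdict

orders = [
    {"store": "S1", "route": "R-North", "item": "burger_patties", "qty": 40},
    {"store": "S2", "route": "R-North", "item": "burger_patties", "qty": 25},
    {"store": "S3", "route": "R-South", "item": "pastries",       "qty": 30},
]

# Club orders by the carrier's route.
by_route = defaultdict(list)
for order in orders:
    by_route[order["route"]].append(order)

# Total quantity to bulk-pick for each truck/route.
for route, route_orders in sorted(by_route.items()):
    total_qty = sum(o["qty"] for o in route_orders)
    print(route, total_qty)
# R-North 65
# R-South 30
```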

 

There are a few impediments for the cross docking process as well. These are:

·         Inventory levels in the warehouse system: One major barrier in the cross-dock process is that the items are not actually received in the warehouse's WMS, so inventory levels are not updated to reflect the entry and exit of the items. Since the orders are sent and the pick/ship documents are created a day prior to the actual transit, the warehouse has to find a way to capture information about these cross-dock items arriving and leaving, for auditing and tracking purposes.

 

One way of doing this is to build up the warehouse inventory with dummy values via the receiving screens and then draw it down through the pick screens.

The other way can be to make the cross dock location a valid receiving as well as shipping location of the warehouse. So from one screen, the inventory level in the WMS can be increased by receiving items from the cross dock location and from the other screen, the inventory can be brought down by shipping them out from the same location.

·         Merging cross-dock product with products coming off the picking belts: Though most items ordered by the stores are highly perishable, JIT items, certain items are non-JIT in nature and are stored inside the warehouse, for example packaged foods, liquids, lids, and cups. These are picked and shipped via the conventional picking method and brought to the shipping area via conveyor belts or manually. If the warehouse is cross docking some items that have arrived externally from the supplier while also picking items from the shelves for the same order, care must be taken to pack all the items of a single store's shipment together; nothing should be missed or mixed with other orders. The timing of the arrival of items from the two sources also becomes important, so that the shipment is not delayed.
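The "valid receiving and shipping location" approach described earlier can be sketched as a minimal inventory ledger: cross-docked stock is received into a cross-dock location (raising WMS inventory) and shipped out of the same location (lowering it), leaving an auditable trail without real put-away. The location and item names here are invented:

```python
# Minimal sketch of treating the cross-dock area as a valid receiving and
# shipping location, so the WMS records entry and exit of cross-docked goods.
from collections import defaultdict

inventory = defaultdict(int)  # (location, item) -> on-hand quantity
audit_log = []                # trail of all movements for auditing

def receive(item, qty, location="CROSS_DOCK"):
    """Receive goods into a location, raising on-hand inventory."""
    inventory[(location, item)] += qty
    audit_log.append(("RECEIVE", location, item, qty))

def ship(item, qty, location="CROSS_DOCK"):
    """Ship goods out of a location, lowering on-hand inventory."""
    if inventory[(location, item)] < qty:
        raise ValueError("insufficient stock at location")
    inventory[(location, item)] -= qty
    audit_log.append(("SHIP", location, item, qty))

receive("burger_patties", 65)
ship("burger_patties", 65)
print(inventory[("CROSS_DOCK", "burger_patties")])  # -> 0
print(len(audit_log))                               # -> 2
```

The point of the sketch is that on-hand returns to zero after the cross-dock flow, yet both movements survive in the audit trail.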

 

A New Beginning

Cross docking has yet to gain full popularity in the world of distribution centers and is still a vast area to explore. It can greatly reduce the turnaround time for order delivery and act as a trigger to increase the velocity of the QSR industry's logistics system. No customer likes to hear that the dish they ordered from the menu has still not arrived and will be there in a few minutes. Customer delight, in terms of both time and taste, is the key to success in this highly competitive QSR world and can only be achieved with the combined efforts of all participants in the supply chain. Cross docking can be one such contribution from the distribution center.


March 22, 2017

Chart of Accounts (COA) Design Considerations

The Chart of Accounts (COA) structure is the heart of an ERP implementation, enabling the business to run its day-to-day operations. It strongly influences how an organization records the monetary, contingent, and statistical impact of transactions taking place across its lines of business, reports to external entities to fulfil regulatory and statutory requirements, and leverages the data internally to gain insight into the performance of different departments on both the top and bottom lines. Doing this efficiently requires a modern chart of accounts, mapped to the different business modalities and dimensions, that not only covers the regular requirements above but also facilitates automation, reins in the need to create duplicate segment values, and ensures one segment does not override another, i.e., maintains the uniqueness of purpose mapped to each segment. Investing enough to lay down the foundation of the COA structure is the first step in locking down a successful ERP implementation and driving innovation throughout the life of the application. Note: A combination of segments (e.g., Company, Department/Cost Centre, Account) forms a chart of accounts.

There are numerous essential characteristics, including but not limited to the five below, that must be considered while designing a COA structure:

Selection of business modalities/dimensions as segments of COA:

The selection of modalities as segments is not an objective matter but a subjective one. While some segments are mandatory irrespective of circumstances, others vary with the type of industry, the organization and the products or services it offers, the geographies where the business operates, internal and external reporting needs, future considerations, and the volume of inter- or intra-company transactions. Each of these is a key driver in designing an idealistic, futuristic, and holistic chart of accounts. For example, manufacturing organizations may want a cost-type segment to represent, say, fixed and variable cost, in order to better assess contribution margin at the product level. They may look at a segment exposing the sales destination of a product to clearly articulate a strategy for multi-fold growth in targeted geographies. In the banking industry, companies may choose to introduce a reference to a relationship manager/cost centre in order to measure performance at the product portfolio level. In retail, looking at product categories instead of individual products can be the favourable option.

One segment should not override or make other ones redundant:

This is one of the vital discussion points while designing a COA structure in any ERP system. Thought leadership here can offer long-term benefits to organizations in the form of easier maintenance, a minimal master value pool for each segment, no duplication, etc.; immature decisions, on the other hand, may eventually erode those benefits. A COA structure and the value set for each segment should be intelligently designed so that one segment does not make another redundant, does not force the introduction of similar values for a segment, and, most importantly, so that segments are structured "relative" to each other. To understand this better, take a COA structure with four segments: Company, Cost Centre/Department, Natural Account, and Sub-Account. There are three companies, COMP1, COMP2, and COMP3, and each operates with the same four departments: Sales, IT, Purchase, and Inventory. As a strategic and sustainable approach, (a) one would recommend only four cost centre values, one for each department, which can be associated with any of the three companies as actual transactions take place. As a poor design, (b) the organization could instead be forced to introduce 12 different cost centre codes representing the four departments across the three companies. It is self-evident that option (a) cascades the behaviour of relativity, where a cost centre is relative to a company, and thereby avoids both redundancy and the creation of duplicate codes for similar departments. This can be understood through the postal code numbering system, which navigates through state, district, and finally city: a city is relative to a district, and a district is itself relative to a state for a given country.
As regards option (b), the shortcomings are easily counted: duplicate codes for departments of a similar nature in each company, segment values that cannot be shared, and a cost centre value pool certain to grow huge over time.
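Option (a) versus option (b) can be made concrete with a short sketch, using the company and department values from the example (the separator characters are arbitrary):

```python
# Sketch of the two cost-centre designs from the example above.
# Option (a): four shared cost-centre values; company.cost-centre
# combinations are formed only when transactions are posted.
# Option (b): 12 duplicated company-specific cost-centre codes up front.
from itertools import product

companies    = ["COMP1", "COMP2", "COMP3"]
cost_centres = ["SALES", "IT", "PURCHASE", "INVENTORY"]

# Option (a): maintain 4 values; 12 combinations exist only transactionally.
combinations = [f"{co}.{cc}" for co, cc in product(companies, cost_centres)]
print(len(cost_centres), len(combinations))  # -> 4 12

# Option (b): 12 distinct codes must be created and maintained as masters.
duplicated = [f"{cc}_{co}" for co in companies for cc in cost_centres]
print(len(duplicated))  # -> 12
```

The maintenance burden differs by a factor of three here, and the gap widens as companies and departments are added.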

Automation for Intra/Inter Company Transactions:

Organizations like GE, which have a leading business presence almost all over the world, deal with a huge volume of transactions between two or more internal business units. Transactions between two business units lead to inter/intra-company transactions, and that is where it is essential to consider a placeholder inter/intra-company segment in the COA, in order to efficiently track the counterparty reference and enable opportunities for automation. ERPs like Oracle Applications R12 and Fusion Cloud offer automation to create inter/intra-company accounting entries through pre-configured rules. For example, Oracle Fusion Financials automatically creates the intercompany payable accounting entry corresponding to the intercompany receivable entry by applying those rules. Such entries carry a counterparty reference in the COA code combination, in the company (balancing) segment and the designated inter/intra-company segment.

Give meaning to each digit/character within a segment rather than just treat as code:

While a business meaning is tagged to each segment, a COA design can be further advanced by injecting an appropriate meaning into the digits or characters within a segment. For example, instead of coding a company as COMP1, with no meaning attached to individual characters, one can advocate for "013060", where the first two digits represent the country, the next two the region, and the last two the state. Such a logical combination may remove the need for a separate location segment in the COA and is additionally very helpful for easy reference.
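Decoding such a smart company code can be sketched in a few lines, using the hypothetical "013060" example from the text:

```python
# Sketch of reading meaning out of segment digits: in the hypothetical
# company code "013060", fixed positions encode country, region and state.

def decode_company(code: str) -> dict:
    """Split a 6-digit smart company code into its embedded dimensions."""
    assert len(code) == 6, "expected a 6-digit company code"
    return {
        "country": code[0:2],  # first two digits
        "region":  code[2:4],  # next two digits
        "state":   code[4:6],  # last two digits
    }

print(decode_company("013060"))
# -> {'country': '01', 'region': '30', 'state': '60'}
```

Reporting tools can then roll up by country or region without a dedicated location segment, at the cost of locking the code length and positions into the design.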

Business Rules With Valid COA Code Combinations:

In regular business practice, allowing only valid COA code combinations when creating transactions is usually a core business requirement. For example, a COA code combination with a cash account does not require any specific product code, but one would be needed while booking revenue. Identifying such scenarios and implementing rules accordingly in the system is the key to reining in undesired code combinations.

Oracle Financial Accounting Hub (FAH) - A True Value Enabler

For any business organization, recording accounting for the transactions it conducts with internal or external entities is an obvious objective. It is essential for measuring the overall performance of the organization, gaining insight to penetrate new markets and control costs, fulfilling statutory and regulatory reporting requirements, and so on. To support all this efficiently, a modern organization needs a solution that is reliable, scalable, centralized enough to fulfil global and local accounting requirements, quick to absorb change, and, importantly, economical. The answer is Financial Accounting Hub (FAH), and embarking on it is the first step in laying a foundation for innovation. FAH is an intuitive accounting engine that consumes business transaction attributes interfaced from legacy systems and applies accounting rules and mappings to create accounting entries. For reference, it is similar to Subledger Accounting (SLA): while SLA is the accounting processor for business transactions originating from subledgers like AR, FA, and AP within Oracle ERP, FAH deals with transactions originating from legacy systems and interfaced to Oracle ERP. Here are the five key value enablers that drive organizations to inject FAH into their accounting solution footprint:

 

Centralized Accounting Solution:

In a traditional approach, consider a scenario where accounting entries are created for 10 different types of business transactions in 10 different front-office systems and finally interfaced to Oracle, where the general ledger operation is supervised. This approach carries several inefficiencies:

a) Maintaining business accounting rules in 10 different systems.

b) Requiring multiple resources with different product-specific skills to implement, change, and support the accounting solution.

c) Lack of governance and control over accounting.

d) Lost opportunity to reuse components, e.g., mappings and common accounting rules.

e) Having to invest in front-office applications for something they are not primarily meant to do.

To overcome all of this, FAH is one of the best options: it offers a centralized accounting engine that empowers organizations to cultivate a strategic roadmap for consolidating accounting solutions scattered across different places into one at the enterprise level.

 

Quicker and easier implementation:

Unlike Oracle EBS 11i and earlier versions, both Oracle EBS R12 and Oracle ERP Cloud offer front-end configurable capabilities to mimic business accounting rules in FAH setup components and eventually derive accounting entries for interfaced business transactions. Configurations are divided into logical groups such as account derivation rules, journal line type rules (Dr and Cr), and optionally line/header descriptions, rolling up from transaction to application, accounting method, and finally ledger. All of these are configured against the relevant entity, event type, and event class model. An accounting solution for an interface can be ready in a month or so.
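A rough sketch of what such a derivation does conceptually: take an interfaced transaction's attributes, apply journal line type rules, and emit Dr/Cr entries. The rule structure, event class, and account names below are invented for illustration and are not the actual FAH setup model:

```python
# Hypothetical sketch of an FAH-style derivation: journal line type rules
# matched by event class produce balanced Dr/Cr entries for a transaction.

def derive_entries(txn, rules):
    """Apply every rule matching the transaction's event class."""
    entries = []
    for rule in rules:
        if rule["event_class"] == txn["event_class"]:
            entries.append({
                "side": rule["side"],
                "account": rule["account"](txn),  # account derivation
                "amount": txn["amount"],
            })
    return entries

rules = [
    {"event_class": "INVOICE", "side": "DR",
     "account": lambda t: f"{t['company']}.EXPENSE"},
    {"event_class": "INVOICE", "side": "CR",
     "account": lambda t: f"{t['company']}.LIABILITY"},
]

txn = {"event_class": "INVOICE", "company": "013060", "amount": 250.0}
for entry in derive_entries(txn, rules):
    print(entry["side"], entry["account"], entry["amount"])
# DR 013060.EXPENSE 250.0
# CR 013060.LIABILITY 250.0
```

The value of this shape is that the rules, not the source system, own the accounting logic, which is the centralization argument made above.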

 

Minimize dependencies on IT teams for maintenance:

Unlike a custom accounting solution, most ongoing maintenance requests, such as capturing additional details in the journal line description, can be achieved without involving a developer or a code change. Consider another scenario, where there is a regulatory requirement to book asset expenditure to an expense account instead of an asset account for certain asset categories. Unlike traditional back-end accounting engines, where a medium-sized IT project might be required, FAH can deliver this to the business as part of BAU processes, without involving IT teams, and notably in a quicker, easier, and cheaper manner. In this particular case, the account derivation rule would be changed to return the expense account for those asset categories.

 

Capability to handle exceptions and complex mapping/rules:

While FAH is capable of handling most accounting requirements with out-of-the-box configurable features, it also provides a powerful custom source concept, where you can code your own accounting logic and link it to a custom source available for use in FAH. Consider a scenario where you want to derive the BSV (balancing segment value) of the COA based on complex mappings and exceptions: a custom source can be defined and linked to a custom program. FAH invokes the custom source at run time during interface processing to derive the BSV based on the coded logic.

 

Cost avoidance:

With FAH in place for interface processing, organizations can avoid multiple licensing costs by eliminating the need for licenses for front-office applications that would otherwise each carry their own accounting engine. It also avoids the salary costs of product SMEs with different skill sets tied to the core legacy systems.

Thus, FAH is categorically a strategic accounting hub, whether on Oracle EBS or Oracle ERP Cloud, offering agility that enables modern organizations to gain the benefits of faster responsiveness to regulatory and statutory accounting requirements, cost effectiveness, and, importantly, consolidation of accounting solutions on a single platform.

February 10, 2017

Centralized Vs Decentralized VCP Architecture

 

One of the critical decisions that businesses considering a VCP implementation have to make is choosing between the centralized and decentralized VCP architectures. This decision is crucial not just from the operational perspective once implemented, but also because the cost of the overall project depends on it. For a decentralized environment, businesses have to invest in the new infrastructure and hardware required for the new VCP instance. For smaller businesses, these costs could be higher than the overall implementation cost itself. In this article, I discuss the pros and cons of each approach and throw light on the aspects businesses need to consider to make an informed decision.

A centralized architecture is one where both EBS and VCP reside on the same server. In a decentralized architecture, EBS and VCP reside on two different servers connected through database links for the exchange of data.

Before talking about the pros and cons of these architectures, let us understand the need for a decentralized architecture when by default we have the centralized architecture enabled.

Unlike most transactional systems, where transactions are processed at the database level, planning in the VCP modules happens in the application server's memory. The planning engine processes are highly memory intensive, and plans require a great amount of memory while the planning engines are running.

One of the most common issues encountered in ASCP (one of the most important VCP modules) is plan failure related to application memory, where plans fail after exhausting all dedicated/available memory. These errors can be caused by an inappropriate setup in EBS or even by manual errors as simple as creating an internal requisition with an inappropriate source. Such transactional errors take a lot of process maturity and user knowledge/training to control, and are still very difficult to avoid. This means that if the application sizing wasn't done scientifically, or in the case of the above errors, the planning engine run impacts the performance of all applications residing on that server.

Most businesses going with a centralized architecture face challenges during month-end/period closure activities, where the finance processes (which handle a huge volume of data) overlap with the planning processes.

The amount of memory consumed also depends on multiple factors such as the volume of finished goods, the depth of the BOM, the volume of transactional data, the type of plans being run, the number of constraints, and so on. In our experience, we have seen businesses where plans have a peak application memory consumption of over 64 GB. This means that unscientific application sizing would impact not just planning but also the activities in transactional modules in a centralized environment.

For businesses with operations spread geographically across the globe and multiple plans catering to different geographies, it is imperative to run those plans at different times of day, meaning server resources need to be available at all times so the plans complete smoothly.

Having said that, below are the pros and cons of the available architectures:

Centralized

Pros:

·     Lesser investment in infrastructure and its maintenance.

·     Simple architecture.

Cons:

·     Risk of facing issues related to memory.

·     Does not support different versions for planning and EBS.

·     Difficult to patch and upgrade. Upgrading VCP is possible only when the entire instance is upgraded.

·     Limited scalability of the solution.

·     Not ideal when multiple VCP modules have to be implemented.

·     Need to maintain huge and powerful servers.

Decentralized

Pros:

·     Issues related to the planning engine have the least impact on the transactional systems.

·     Supports different versions of EBS and VCP; EBS can be at the same or a lower version than VCP.

·     VCP can be easily upgraded without any changes to EBS, and patched with minimal impact on EBS.

·     Ideal for implementations of multiple VCP modules.

·     Ideal for businesses with multiple plans running at different times.

·     Scaling up the solution (such as adding new organizations or businesses) on the existing VCP instance is easy.

·     Ideal for businesses with multiple EBS instances, which can all be connected to a single VCP instance.

·     Can run on multiple, less powerful servers.

Cons:

·     Higher investment in infrastructure and maintenance.

To conclude, a decentralized architecture is the preferred and recommended architecture. Small organizations that cannot afford multiple servers, and businesses with very limited planning requirements, can choose (or start with) a centralized implementation and gradually move to a decentralized architecture.

 

For any inputs, clarifications and feedback, please feel free to write to me at mohan.chimakurthy@infosys.com. Our specialist team of VCP consultants will assist you in taking the right decisions at the right time.

December 17, 2016

Contract manufacturing and subcontracting practices: A propellant for tomorrow's world-class organizations

The dynamics of the retail and consumer packaged goods (CPG) industries have touched many aspects of the supply chain, and contract manufacturing is no exception. Industry players the world over have leveraged this arena to its full potential, as each player focuses on its core competencies.
Contract manufacturing can be defined as 'outsourcing of a requirement to manufacture a particular product or component to a third party.' It enables organizations to reduce investments in their own manufacturing capabilities and helps them focus on their core competencies, while retaining a high-quality product at a reasonable price, delivered on a flexible schedule.

Continue reading " Contract manufacturing and subcontracting practices: A propellant for tomorrow's world-class organizations " »

November 4, 2016

Have you planned the testing of custom reports in your Upgrade Project?

In most organizations, there are custom reports designed in the ERP to satisfy the reporting needs of the business. Often, these custom reports are very similar to the standard out-of-the-box reports, with minor tweaks as per business needs. These tweaks can be the addition of a few extra columns, formatting changes, sorting changes, removal of unnecessary columns, and additional calculations required for analysis and decision making. Sometimes, these tweaks are performed while retaining the standard report design/logic.

When an upgrade is performed, custom reports need to be carefully tested for remediation and changes. For custom report testing, the below six-point strategy should be followed:

  1. Testing of reports for data before upgrade - Build a test instance of the existing production application from the same data backup used for the upgraded test instance. Then execute the reports in both the upgraded test instance (with remediated code) and the existing-version test instance (with the current production code), and match the outputs for the same parameters. If the outputs do not match, there is an issue with the extraction of pre-upgrade data; if they match, it signals that the remediated code can produce reports as before. Ensure this testing is performed before any data entry in either instance, so that the data in both instances is identical and any mismatch in report output is a genuine issue to be fixed.

  2. Testing of reports for data after upgrade - To test reports for data created after the upgrade, enter new transactions in the upgraded test instance, then execute the report and thoroughly review its output. Ensure the report output is tested for all possible test cases, not just basic data entry; for example, a custom report on payables invoices should be tested not only for a standard invoice but for various invoice types and statuses. Post-upgrade data testing should be planned along with transaction entry testing, so that data created during transaction entry testing can be re-used for report testing.

  3. Report format - Post-upgrade report format should be verified with pre-upgrade report format. This verification will ensure that there is no formatting issue due to upgrades / changes / code remediation. This testing should be performed along with the testing of data before upgrade. With this approach, efforts for format testing will be reduced.

  4. Testing of data which was partially processed before upgrade - Report output should be tested with the processing of transactions which were initiated before upgrade. For example, in a custom report of payable trial balance, report output should be tested with those invoices which were created before the upgrade but are paid after the upgrade. This testing will ensure that partially processed data is shown correctly in reports.

  5. Report parameter - Report parameters should be tested thoroughly. Many times, standard value sets are used for the lists of values in report parameters, and these values may be modified in the new release. Also, when a custom value set is table-based and the underlying table is obsolete in the upgraded version, that value set needs to be remediated. For any such items, the parameters of the report should be thoroughly tested.

  6. Report execution after go-live - A lot of reports are part of the standard period close package. These reports are executed only during and after the period close. After the upgrade, these reports should be executed before approaching period end. Alternatively, a mock period close is suggested to identify issues before approaching period end. This will ensure that if there is any issue in reports, then it is identified a lot early before approaching period end and can be fixed. With this, the period close timelines will not be affected adversely.
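As an illustration of point 1, here is a minimal sketch (in Python, with hypothetical field names and report rows) of matching pre- and post-upgrade report outputs run with identical parameters; it is a generic diff, not any specific Oracle tooling:

```python
import csv

def compare_rows(pre_rows, post_rows, key_fields):
    """Match rows from pre- and post-upgrade report runs on `key_fields`
    and report anything missing, newly appearing, or changed."""
    def index(rows):
        return {tuple(r[k] for k in key_fields): r for r in rows}
    pre, post = index(pre_rows), index(post_rows)
    return {
        "missing_after_upgrade": sorted(set(pre) - set(post)),
        "new_after_upgrade": sorted(set(post) - set(pre)),
        "changed": sorted(k for k in set(pre) & set(post) if pre[k] != post[k]),
    }

def load_report(path):
    # Each report run exported as CSV with identical parameters.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))
```

Any entry in `missing_after_upgrade` or `changed` points at a remediation defect, since both instances started from the same data backup.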

    In summary, a good test strategy should contain a detailed section/plan for report testing to make the upgrade project successful.



Continue reading " Have you planned the testing of custom reports in your Upgrade Project? " »

September 17, 2016

Decoding GST for Oracle Customers

 

As India gets ready to implement the new GST law, the question on top of most IT and finance leaders' minds is the readiness of their IT systems to meet the new requirements.

Are the changes to the system simple or complex? How long will it take to solution and implement the change? Has this been done before? Can you speed up the process? When do we start the work to be ready on time? These are absolutely valid concerns and need immediate attention.

Infosys has been working on building a solution to help enterprises move to the new GST law smoothly and quickly. The Infosys solution:

-          Uses the Infosys 'Tax Assessment' framework to understand the likely impact areas.

-          Uses the Infosys pre-built solutions, templates to solution the requirements

-          Uses the Infosys Intrak implementation approach to implement the solutions

Note, the solution has not considered Localization patches which Oracle may come up with. As of now, Oracle is still working on the localization patches. 

Assessment - Infosys Tax Assessment framework:

The Infosys Tax Assessment framework has been built based on our experience implementing tax solutions across the world for VAT and sales tax regimes in the Americas, EMEA, and APAC. The framework ensures the tax impacts are assessed easily and in a structured way, without any misses.

The Infosys Tax Assessment framework discusses the impacts within the below five boundaries.

1.       Master Data

2.       Tax rules ( defaulting tax on business transactions)

3.       Cutover Impacts

4.       Business documents

5.       Reporting and Accounting

The provision of 'Credits and Refunds' has also created a lot of confusion and anxiety. The Infosys framework has been tailored to assess the likely systemic requirements around credits and refunds.

1.       Master Data -

Master data like the supplier master, customer master, legal entity setups, General Ledger accounts, and part master needs to be enriched with the new tax registration and exemption details to meet the GST law requirements.

GST Requirements on Registration:

As per the Model bill, the existing dealers would be automatically migrated. The new GSTIN will be a 15-digit GSTIN based on IT PAN.

Liability to get registered: Every supplier should be registered if aggregate turnover in a financial year exceeds 0.9 million / 9 lakhs INR (0.4 million / 4 lakhs INR if the business is registered in the North Eastern states or Sikkim).

 

Liability to pay tax: arises after crossing the threshold of 0.5 million / 5 lakhs INR for NE states and Sikkim, and 1 million / 10 lakhs INR for the rest of India. Small dealers with sales below 5 million INR can also adopt the composition scheme and pay a flat tax of about 1 to 4% on turnover.
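The thresholds above can be captured in a small illustrative function. The figures are the Model-law numbers quoted here; the composition rate parameter is an assumption, since the law only says 'about 1 to 4%':

```python
LAKH = 100_000  # 1 lakh INR

def gst_position(turnover, in_ne_or_sikkim=False, composition_rate=0.01):
    """Classify a dealer under the Model GST Law thresholds quoted above.

    `composition_rate` is illustrative only (the law says 'about 1 to 4%').
    """
    reg_limit = 4 * LAKH if in_ne_or_sikkim else 9 * LAKH     # registration
    pay_limit = 5 * LAKH if in_ne_or_sikkim else 10 * LAKH    # liability to pay
    position = {
        "must_register": turnover > reg_limit,
        "liable_to_pay": turnover > pay_limit,
        "composition_eligible": turnover < 50 * LAKH,          # below 5 million
    }
    if position["liable_to_pay"] and position["composition_eligible"]:
        position["composition_tax"] = turnover * composition_rate
    return position
```

For example, a dealer with 9.5 lakhs turnover must register but is not yet liable to pay, which is exactly the gap between the two thresholds.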

The tax is also determined based on the type of item; hence, the parts should also be categorized using HSN codes.

2.       Tax Rules (defaulting tax on business transactions)

The tax rules default the tax rates on different transactions - P2P transactions and O2C transactions. The Infosys 'Tax Assessment' framework helps build a tax matrix capturing all the tax rules in a single matrix, considering all the tax determining factors: party, place, product, and process. The tax matrix ensures all tax requirements are correctly captured and easily understood. Based on the tax matrix, the tax rules are configured, covering branch transfers and job work (OSP) transactions.

GST Requirements impacting tax rules:

·         GST is based on the supply of goods or services, against the present concept of tax on the manufacture of goods, the sale of goods, or the provision of services.

·         GST will be a destination-based tax, against the present concept of an origin-based tax.

·         Local transactions will attract dual GST, with the Centre and the States simultaneously levying it on a common base.

·         Interstate transactions will attract Integrated GST (IGST), levied on inter-State supply (including stock transfers).

·         Import transactions will be treated as inter-State supplies and will attract IGST.

There are also likely to be multiple rates based on the type of item:

·         Merit Rate

·         Standard Rate

·         De-Merit Rate

·         Zero rate taxes for certain items
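A minimal sketch of how such rules might default dual GST or IGST on a transaction follows. The rate values are purely illustrative placeholders (the actual merit/standard/de-merit rates were not yet notified), and the even CGST/SGST split is an assumption:

```python
# Illustrative rates only; actual merit/standard/de-merit rates were not final.
RATE_BY_CATEGORY = {"merit": 0.05, "standard": 0.18, "de-merit": 0.28, "zero": 0.0}

def determine_gst(ship_from_state, ship_to_state, is_import, category):
    """Split tax by supply type: local supply = dual GST (CGST + SGST),
    interstate supply and imports = IGST. Destination-based throughout."""
    rate = RATE_BY_CATEGORY[category]
    if is_import or ship_from_state != ship_to_state:
        return {"IGST": rate}
    return {"CGST": rate / 2, "SGST": rate / 2}
```

In a real tax matrix, the lookup would also consider party, product (HSN code), and process, per the four determining factors discussed above.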

 

3.       Cutover

The cutover from an old solution to a new solution is likely to impact transactions that are mid-way in the end-to-end process. For example, a PO created under the old tax regime might carry old tax-related data. When an invoice is created by matching it to that PO, it might result in multiple taxes - one with the old tax rates and statuses, the other with the new ones.

The Infosys solution is able to identify the potential areas of impacts and leverage pre-built solutions to quickly identify and resolve such issues.

  

4.       Business Document -

Tax-related information, e.g. tax registration details, is usually printed on business documents like shipping documents, bills of lading, AR invoices, and purchase orders. Considering the refund/credit balance, the GSTIN of both buyer and seller should be printed on AR invoices. The Infosys 'tax assessment framework' specifically poses questions around business documents and invoice numbering. This is critical and often missed, leading to penalties and non-compliance issues.

 

5.       Reporting and Accounting

The Infosys 'Tax Assessment Framework' finally looks at the reporting and accounting requirements. The monthly, quarterly, yearly, and ad-hoc reporting requirements are captured as part of this step, along with the reports used for reconciliation with the general ledger and the number of GL accounts needed for reconciliation and reporting. Companies may want separate accounts for Input IGST, Input SGST, Input CGST, Output IGST, Output SGST, and Output CGST for easy reconciliation and credit tracking.

GST Requirements on reporting:

The Model GST Law proposes the following returns:

Monthly:

·         GSTR 1 - Outward supplies

·         GSTR 2 - Inward supplies received

·         GSTR 3 - Monthly return

·         GSTR 6 - Return for Input Service Distributor (ISD)

·         GSTR 7 - Return for Tax Deducted at Source

Quarterly:

·         GSTR 4 - Quarterly return for compounding taxpayer

Annual:

·         GSTR 8 - Annual return

Others:

·         GSTR 5 - Periodic return by non-resident foreign taxpayer (last day of registration)

·         ITC ledger of taxpayer (continuous)

·         Cash ledger of taxpayer (continuous)

·         Tax ledger of taxpayer (continuous)

 

 

 

 

GST Requirements - Credits and Refunds

This is probably the most controversial change suggested by the Model GST Law. The credit claim process has been a topic of hot discussion, as it could have a big impact on the cash flow and even the margins of enterprises.

Below are the details of the credit and refund process.

Tax credits are to be utilized as below:

·         Input CGST to be utilized against output CGST and IGST

·         Input SGST to be utilized against output SGST and IGST

·         Input IGST to be utilized against output IGST, CGST and SGST, in the order of IGST, CGST and SGST

Conditions to claim credit:

·         Possession of tax invoice

·         Receipt of the goods/service

·         Payment of the tax charged on the invoice by the supplier

·         Filing of GST return

·         Matching of the claimed credits with the vendor's tax liability. In case of a difference/discrepancy, excess credits will be disallowed to the recipient.

Timelines:

·         One year from the invoice date

·         Credit pertaining to a financial year cannot be claimed after filing the return (for September) of the next financial year or the filing of the annual return for the year to which the credit pertains - whichever is earlier
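The utilization order above can be sketched as a simple set-off routine. This is an illustration only; the amounts and the data structure are hypothetical, not an Oracle or GSTN implementation:

```python
# Prescribed order: IGST credit against IGST, then CGST, then SGST;
# CGST credit against CGST, then IGST; SGST credit against SGST, then IGST.
UTILIZATION_ORDER = {
    "IGST": ["IGST", "CGST", "SGST"],
    "CGST": ["CGST", "IGST"],
    "SGST": ["SGST", "IGST"],
}

def apply_credits(input_credits, output_liability):
    """Set off input credits against output tax in the prescribed order.
    Returns the remaining liability and the unutilized credits."""
    liability = dict(output_liability)
    credits = dict(input_credits)
    for credit_type, targets in UTILIZATION_ORDER.items():
        for target in targets:
            used = min(credits.get(credit_type, 0), liability.get(target, 0))
            credits[credit_type] = credits.get(credit_type, 0) - used
            liability[target] = liability.get(target, 0) - used
    return liability, credits
```

Note that CGST credit can never offset SGST liability (and vice versa), which is why enterprises may want the six separate GL accounts mentioned earlier.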

 

The above requirements are likely to lead to the following systemic requirements:

·         A systematic way to automatically calculate the credits

·         A systematic way to do a vendor account reconciliation

·         The need to do vendor reconciliation will require an ability to upload vendor data from GSTN into Oracle.

·         A form to view the GST balance and ability to write-off credits which cannot be claimed
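A rough sketch of what the vendor-reconciliation requirement implies, matching claimed credits against vendor liability uploaded from GSTN; the keying by (GSTIN, invoice number) and the handling of unmatched items are assumptions for illustration:

```python
def reconcile_credits(claimed, vendor_filed):
    """Match credits claimed in the ERP against vendor liability uploaded
    from GSTN, keyed by (vendor GSTIN, invoice number). Excess claims are
    disallowed to the recipient; unmatched invoices are flagged for
    follow-up (e.g. goods in transit)."""
    allowed, disallowed, unmatched = {}, {}, []
    for key, amount in claimed.items():
        filed = vendor_filed.get(key)
        if filed is None:
            unmatched.append(key)            # vendor has not reported the invoice
        elif amount > filed:
            allowed[key] = filed             # credit capped at vendor's liability
            disallowed[key] = amount - filed
        else:
            allowed[key] = amount
    return allowed, disallowed, unmatched
```

The `disallowed` and `unmatched` outputs correspond to the write-off and reason-code tracking described in the Refunds and Credits solution below.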

 

Solutioning using Infosys Accelerators -

We have a pre-built repository of ready-to-deploy solutions, which helps enterprises shorten the time to define and then develop the solutions. The solutions cover all the areas mentioned below:

Item - Infosys Accelerator:

1. Master Data - Re-usable solution to enrich master data

2. Tax rules (defaulting tax on business transactions) - GST tax matrix, pre-built GST configuration templates

3. Cutover impacts - Pre-identified components and pre-built solutions to correct cutover impacts

4. Business documents - Re-usable solution to fix business documents

5. Reporting and accounting - Pre-built reports and solutions to meet the reporting and accounting needs

 

Refunds and Credits:

The Infosys solution for claiming refunds and credits will require developing the following programs and solutions to track credits, perform vendor reconciliation, claim credits and write-off credits.

·         Tracking Credits - A custom form will be developed to track the GST credits.

·         Vendor Reconciliation - Two custom programs will be built:

1.       A custom program, using the API provided by GSTN, to upload supplier data.

2.       A custom program to automatically list the unreconciled items with a reason code, e.g. goods in transit.

·         Claim Credits - A custom program will be built to automatically claim credits as per the GST rules

·         Write-off credits - The custom form to track credits, will include the ability to write-off credits.

 

Timelines:

The Infosys solution will enable enterprises to freeze the GST solution in 5-8 weeks, leveraging the Infosys 'Tax assessment framework' and pre-built solution repository. The likely timeline for the solution is as below.

 

 

 

 

(Image: India GST plan timeline)

In Conclusion:

Considering a time frame of 1-2 months for solution finalization plus an implementation effort of 2-4 months, it is prudent for organizations to start work on GST immediately, to be ready for the 01-Apr-2017 launch.

 

 

 

 

 

September 12, 2016

Customer Delivery in Semiconductor Industry

Introduction

Semiconductor products find application in various industry verticals that require digital technology embedded in their end products, enriching them with the ability to program, configure, connect to other devices, automate, or provide extra features to delight customers. Analog semiconductor applications are also diverse, such as in power electronics, servo control, and energy. Some examples are machine control units in automobiles, smart mobile phones, control units in industrial automation, robotics, solar energy, etc.
With such a wide range of applications, the semiconductor industry commands a business of a little more than $330 billion worldwide. The supply chain in the semiconductor industry is characterized by price sensitivity, constant innovation, product complexity, collaboration with manufacturing partners, and globalization of its value chain. A semiconductor product is not an end product by itself and is usually used to build one; hence its supply has a significant impact on the business downstream. Low margins in this business require higher volume turnover to stay in the green. Competition and the above challenges force the industry to emphasize customer delivery performance and service levels to clients, who come from diverse market segments. They have to add value not only to the technology through innovation but also to the supply chain process, through reduced cost and reduced reaction time to demand.


  The customer delivery function is identified as a priority-one process to sustain the semiconductor business. It involves order capturing, calculating demand, promising delivery, tracking logistics until ownership transfers to the customer, and keeping the customer informed. It is supported by the planning function to ensure service levels that reduce or prevent line-down situations for clients while maintaining optimum inventory. The key performance elements of customer delivery that depend on the supply chain are:
a. Reaction time to demand: the ability to respond with promises or exceptions to the client.
b. Ability to meet customer demand as per schedule, with negligible or no slippage.
c. Real-time status information availability: the ease of accessing the status of inventory in terms of location, reservation, and quantity.





Continue reading " Customer Delivery in Semiconductor Industry " »

September 8, 2016

Internet of Things (IoT) in field service management (FSM)

In today's competitive world, real-time data and innovative service methods are vital for field service enterprises to ensure customer delight, increase revenues, and expand profit margins.

The IoT explained

The Internet of Things (IoT) allows machines to communicate with each other (M2M communication). It is built using a combination of networks comprising data-gathering sensors, devices, big data, analytics, and cloud computing, which communicate via secured and encrypted channels. Connected devices enable efficient predictive maintenance by constantly providing information on a machine's performance, environmental conditions, and the possibility of failures. IoT can connect machines in the field in order to record incidents in real time into a semi-intelligent 'Gen-X' FSM system.

Integrating IoT with FSM software applications

Field service organizations always strive to consistently provide the best service experience to their customers, by ensuring immediate repair and maintenance of their equipment and machinery. By collecting data about the machine's health and performance from IoT sensors, organizations can leverage predictive and preventive field service to minimize device downtime.


Three primary traditional FSM challenges

Here are three primary issues that challenge the current reactive scenarios:

    Field technicians execute the job and fix the equipment after the issue is reported. However, the delay can impact business continuity, which in turn affects the operating profit margins


    Adding more field technicians and service trucks to the field comes at a cost and sometimes the increased capacity remains under-used


    Assigning more work to existing field teams can have a negative impact on SLAs and first-time fix rates. Even worse, it can increase the cost of travel and overtime

Essentials of a new-age FSM solution

A field service management system that integrates device sensor data, technicians, customers, and technology is the key to address these issues. It should function in a predictive and preventive mode with the following features:

    The FSM process, which includes issue identification, communication, incident creation, scheduling, and assignment can be automated, thereby ensuring zero disruption in machinery operations and no or negligible downtime. This not only increases productivity, but also expands operating profit margins

 

    Most FSM products can also automate incident creation, scheduling, assignment, and invoicing processes. Using IoT, we can predict upcoming issues based on sensor data analysis and auto-create incidents based on preset threshold rules

The workflow of a FSM system with IoT integration

Here is an outline of the flow of incidents in a typical IoT-enabled FSM system:

1.   Data from the equipment's sensors is collected and transmitted, using secured and encrypted channels, to a big data storage


2.   Big data management and analytics is used to parse and analyze for refined sensors data


3.   The IoT command console is configured with predefined threshold rules to identify errors and monitor the device's health and performance


4.   Incidents are auto-created in the FSM system whenever errors are detected


5.   Auto-scheduling, routing, and dispatching of field service technicians against the incidents is done based on customer entitlements, location, product, skills required for the job, technician's availability, parts availability, etc. via the FSM system


6.   A field technician performs the job at the customer's site; records the effort, parts used, travel time, and any expenses incurred; and then bills the customer
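Steps 3 and 4 of the workflow, applying preset threshold rules to refined sensor data and auto-creating incidents, could be sketched as follows. The metric names and limit values are invented purely for illustration:

```python
# Hypothetical preset threshold rules on refined sensor data (step 3).
THRESHOLD_RULES = {
    "temperature_c": lambda v: v > 90,
    "vibration_mm_s": lambda v: v > 7.1,
}

def evaluate_readings(device_id, readings):
    """Apply the preset threshold rules and auto-create an incident
    record for every breach (step 4)."""
    incidents = []
    for metric, value in readings.items():
        rule = THRESHOLD_RULES.get(metric)
        if rule and rule(value):
            incidents.append({
                "device": device_id,
                "metric": metric,
                "value": value,
                "status": "OPEN",  # picked up by auto-scheduling (step 5)
            })
    return incidents
```

In a real FSM system, the open incidents would then flow into the scheduling and dispatching engine described in step 5.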


Workflow of Field Service Management application using IoT.

Six Solution benefits



Wind turbines: A case in point of how IoT integrates with FS systems

Failures in wind turbines interrupt power generation leading to lower productivity and higher system downtime, which result in varying energy production and higher operating costs. To maintain profit margins, higher efficiency and uptime are required.

Near-real-time analytics provides data so that FS teams can react faster and address the issues before they become mission critical, thus reducing impact and avoiding downtime.

The wind turbine's sensors collect real-time data that is analyzed and through which, auto incidents are created, service scheduled, and an agent assigned to fix the issues. Wind turbine sensors are also used to continuously collect operating temperature, rotor acceleration, wind speed and direction, and blade vibrations - all of which can be used to optimize the turbine's performance, increase its productivity, and execute predictive maintenance to ensure reduced downtime.


*** Authors: Haresh Sreenivasa and R. N. Sarath Babu ***


Continue reading " Internet of Things (IoT) in field service management (FSM) " »

August 26, 2016

Optimizing Supply Chain Inventory Using Oracle Inventory Optimization

 

In an increasingly competitive and globalized world, every organization has to attempt novel methods to stay ahead of its competitors. Enterprises constantly strive to improve their revenue, profitability, and operating margins. It is no longer possible for enterprises to record positive year-on-year (YoY) growth just by increasing sales volume and thereby revenue and profit. Most successful enterprises today have started looking within, rather than outward, to achieve their growth targets. The focus is on reducing inventory (safety stocks) and carrying and operating costs to improve profitability without impacting productivity. The key to success is optimizing the overall supply chain inventory, which reduces the cost of inventory and carrying costs, eventually reducing overall operating costs and contributing to improved margins.

The biggest challenge that looms over inventory managers in large enterprises is how much inventory to carry without compromising on the customer service level. In a global enterprise spanning multiple geographies with multi-level, multi-layer supply chains, it is not easy to decide upon the ideal stocking locations and stocking strategies. With an increasing number of competitors, retaining customer loyalty is increasingly difficult, which leads to high demand variability and forecast inaccuracies. The variability in the lead times committed by suppliers, transportation contractors, and our own production engineers, due to unforeseen events, adds fuel to the fire. Given these circumstances and complexities, the use of an IT tool is inevitable. Oracle Inventory Optimization (IO) is one of the tools that can assist enterprise managers in formulating and executing their inventory stocking strategies.

Oracle Inventory Optimization is part of the comprehensive Value Chain Planning suite of applications from Oracle. The module provides seamless integration with the Oracle E-Business Suite transaction modules to get a snapshot of the supply chain and master data setups. It also integrates with the other supply and demand planning modules in VCP for further planning. IO provides businesses with time-phased safety stock figures for complex supply chain networks.

The key advantage of IO is that it performs multi-echelon inventory planning, thereby optimizing the inventory across the entire supply chain network as a whole, in contrast to conventional inventory planning techniques/tools, which perform a local optimization of the inventory. Businesses can now plan their entire supply chain network in a single plan. Along with flexibility in fulfillment lead times and in-transit lead times between the various levels of the supply chain network, IO recommends the ideal stocking location for inventory through postponement. Based on the supply chain network, it attempts to pool the risk of variability at higher levels of the supply chain rather than at lower levels, which can considerably lower inventory levels and costs without affecting the service level targets.

IO takes into account not just demand variability and forecast inaccuracies, but also the variability of your manufacturing, in-transit, and supplier lead times. It provides insight into the contribution of each of these variabilities to the overall proposed safety inventory levels.
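A common way to combine both sources of variability is the textbook combined-variability safety stock formula, SS = z * sqrt(LT * sd_d^2 + (d * sd_LT)^2). The sketch below is a generic illustration of that formula and of how each variability's contribution can be reported; it is not IO's internal algorithm:

```python
import math

def safety_stock(avg_demand, sd_demand, avg_lead_time, sd_lead_time, z=1.65):
    """Combined-variability safety stock:
    SS = z * sqrt(LT * sd_demand^2 + (avg_demand * sd_lead_time)^2).
    z = 1.65 targets roughly a 95% cycle service level."""
    demand_part = avg_lead_time * sd_demand ** 2       # demand variability term
    lead_part = (avg_demand * sd_lead_time) ** 2       # lead-time variability term
    total = demand_part + lead_part
    ss = z * math.sqrt(total)
    contributions = {
        "demand_contribution": demand_part / total,
        "lead_time_contribution": lead_part / total,
    }
    return ss, contributions
```

For example, with average demand 100/day (sd 20), lead time 4 days (sd 0.5), the lead-time term dominates, which mirrors IO's insight into which variability drives the proposed buffer.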

Illustration 1: Time-phased safety stock analysis in analysis workbench

IO allows users to perform different inventory simulations with different business objectives, such as target service levels and budgets, for different categories/classes of items and different customers/geographies. Inventory planners can run different what-if scenarios and compare the outcomes related to target safety stock levels, budgets, inventory costs, etc. in the Analysis Workbench. The workbench provides the comparisons in both tabular and graphical formats, with different types of graphs that are easy to interpret. Users can perform budget, cost breakdown, and profitability analyses, along with safety stock and postponement analysis, using the Analysis Workbench.

Illustration 2: Bar chart comparing safety stock recommendations for different IO Plans

 

Illustration 3: Line chart comparing safety stock recommendations for different IO Plans

 

Once the planners have arrived at ideal safety stocks in line with business objective, this information can be input to Advanced Supply Chain Planning (ASCP) for supply planning.

To conclude, Oracle Inventory Optimization is a very handy tool for enterprise managers, acting both as a strategic tool to decide upon inventory stocking locations and as a tactical/operational tool once the strategy is formed. Its seamless integration with the other Oracle demand and supply planning tools makes it easy to implement and use.


January 21, 2016

Supply Constraint Impacting Customer Order Promising in Outsourced Manufacturing

Continuing from my previous blog, upon delving deeper into the supply constraint aspect, we come upon certain aspects of the high-tech industry that prima facie point to concerns regarding lag or mismatch of information and genuine loss of data (systemic and manual), which I will touch upon briefly.

1) Due to the inherent structure of an outsourced high-tech environment, there is no single source of truth (SSOT) for supply data. Each partner/vendor maintains its supply information in different ways. Horizons may vary. Some maintain gross values while others go with net. Some get hourly updates from suppliers while others may work on a daily model. Consumption patterns may be reflected differently. Given these disparities, it is evident that there will always be discrepancies between the actual supply and what is reflected in the OEM's system.

2) Down the supply chain, the allocation of these supplies to different classes of customers leads to allocation issues. Some customers cannot be made to wait. Systemic segregation rules may not always allocate enough for such contingencies. In other cases, we are left with excess supply for priority customers while lower-priority customers are made to wait. Maintaining a fine balance between customer satisfaction and idle inventory is more art than science, though science is applicable to an extent.

3) ECO transitions affect the supply picture across the supply chain. Variants of a component that can practically be used as alternates but are maintained differently lead to a skewed picture around the transition time frame. Systems may not be able to identify the interchangeable nature of the components, leading to a later promise date for the customer, while the technician on the shop floor knows he has enough to fulfill the demand on time.

4) When out-of-system corrections are made to get around systemic issues, they further aggravate the problem of supply accuracy. It therefore becomes imperative to build systemic checks to ensure the system reflects ground reality accurately.

Join our session "Reliable and Accurate Customer Promising" to understand how a major high-tech OEM deals with such issues through a combination of process constraints that rein in time-lag variations and keep data consistent for all partners (by means of EDI cut-off timings), and systemic interventions that combine supplies for different versions of components for planning purposes. In addition, consistent monitoring and manual corrections keep the supply picture as current as possible.


January 20, 2016

Inventory Accuracy - Control Negative Inventory

Negative inventory occurs when more is consumed or issued from a location than its actual on-hand quantity. This can happen for many reasons: errors in bills of material maintenance, over-reporting of scrap and production quantities, wrong UOM conversions, delayed transfers to the transacting location, wrong counting transactions, etc. Although it is a temporary phenomenon, its impact is significant, mainly in planning and in inventory valuation reporting. Negative inventory acts as demand for the planning engine and results in an inaccurate inventory value statement.


For a manufacturing organization, not allowing negative transactions can stall the production line even though material is physically available, thereby impacting finished goods reporting, schedule attainment and customer order fulfillment. On the other hand, if we allow negative inventory, we may hide pertinent problems that remain uncaught, passing the onus to the cycle counter/inventory controller to run around finding the reason and fixing it. It gets more complex when the organization has complex product structures and complex warehousing operations.


But yes - I intend to say that maintaining inventory accuracy is as important as ensuring uninterrupted production. Since Oracle does not offer a way to allow negative inventory for only some items (e.g., low-value items), we are left with the following options, of which the mixed-mode approach is one of the best suited for manufacturing organizations. We should weigh each option and assess the organization's readiness to adopt one of them.


Option 1: Do not allow negative

Pros: Root cause can be analyzed and fixed; inventory reporting and planning stay accurate and reliable.

Cons: Can potentially stop the production line and affect schedule attainment KPIs; root cause analysis may take a long time and cause significant disruption to the production line.

Option 2: Allow negative

Pros: No interruption of production, as back flushing can drive inventory negative indefinitely.

Cons: No detection of, or control over, erroneous transactions or data; the root cause becomes harder to detect as time progresses; data is not reliable for inventory reporting and planning.

Option 3: Mixed mode (allow negative during back flushing)

Pros: Production runs smoothly as long as inventory stays non-negative at the organization level, i.e., back flushing from the lineside subinventory is allowed until the total on-hand at the organization level would go negative; inventory reporting and planning at the organization level remain accurate and reliable.

Cons: When a back flush is blocked because the total inventory would go negative, root cause analysis must be done quickly - though this option does narrow down the root cause analysis.
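The mixed-mode rule can be sketched as a simple check: a backflush may drive an individual lineside subinventory negative, but it is blocked if it would drive the organization-level total on-hand below zero. This is an illustrative sketch only, not Oracle functionality; the function names and data structures are assumptions.

```python
# Mixed-mode negative inventory sketch (illustrative, not Oracle code).
# A subinventory may go negative, but the org-level total may not.

def can_backflush(on_hand_by_subinv, qty):
    """on_hand_by_subinv: dict mapping subinventory name -> on-hand qty.
    Returns True if issuing qty keeps the org-level total non-negative."""
    return sum(on_hand_by_subinv.values()) - qty >= 0

def backflush(on_hand_by_subinv, subinv, qty):
    """Issue qty from subinv; the subinventory itself is allowed to go
    negative, but the organization-level total is not."""
    if not can_backflush(on_hand_by_subinv, qty):
        raise ValueError("Blocked: org-level on-hand would go negative; "
                         "investigate root cause (BOM, scrap, UOM, counts).")
    on_hand_by_subinv[subinv] = on_hand_by_subinv.get(subinv, 0) - qty
    return on_hand_by_subinv

stock = {"LINESIDE": 5, "STORES": 20}
backflush(stock, "LINESIDE", 8)   # LINESIDE goes to -3; org total stays 17
```

A subsequent backflush of 30 units would be blocked, since the organization-level total of 17 would go negative - at which point root cause analysis should begin.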


Contact

In case you have additional questions or need more detailed discussions, please drop me a note @ Saroja_Nayak@infosys.com

Connect with Infosys
Infosys is a gold sponsor this year at the Modern Supply Chain Event. You can listen to Infosys clients discuss industry-leading best practices in the manufacturing track panel discussions. Talk with our thought leaders about how Infosys can help you realize measurable business value from your supply chain management investments.
Join us at booth #410 to learn about our cutting-edge offerings and supply chain management solutions.

January 14, 2016

Integrated Supply Chain with Contract Manufacturer

In today's competitive world, manufacturing organizations subcontract many of their key processes to gain competitive advantage in operational efficiency, operating cost, capital expenditure and several other areas. Accordingly, manufacturing organizations need a tighter bond with their outsourcing partner(s) at various stages of the manufacturing cycle. Today's supply chain processes demand an end-to-end integrated and interactive view to enhance collaboration with subcontracting business partners, and the current outside processing functionality is rather insufficient to close this gap.
Oracle's Outside Processing functionality is not sufficient for end-to-end system-driven transactions and visibility across varied business processes and a complex subcontracting supply chain. Enter Oracle Outsourced Manufacturing - a complete, integrated supply chain module. Integrated with the core manufacturing, planning and distribution modules, it provides the visibility, traceability and scalability needed to implement many variants of subcontracting processes.

Overview of the Outsourced Manufacturing Transactions:


Integrated Planning and Execution:



Importance of Accurate Customer Order Promising in Outsourced Manufacturing

Driven by strategic importance and the strength of internal capability, industry-leading Original Equipment Manufacturers (OEMs) often outsource sourcing, manufacturing and logistics functions to partners but largely retain supply chain planning and scheduling/ATP in-house. While outsourcing generally confers sharper focus and reduced costs, outsourced supply chain planning carries many high-impact risks for OEMs, such as:

•    Loss of competitive advantage
•    Reduced quality of planning output
•    Coordination and synchronization miss
•    Limited supply chain flexibility
•    Slower responsiveness to customers

Even with in-house planning, the Available to Promise (ATP) response cannot be instant; at best it comes with a lag, because there will always be a difference between the supply statement in the system and the one at partner sites. Companies, especially in high-tech industries with Build to Stock, Build to Order and Configure to Order products, often promise customers within a target lead-time goal across different product families. With lighter configurations, shorter lead times, high volume and perishable demand, planners and backlog management functions are under tremendous pressure to promise customers within the goal to meet revenue targets. When companies schedule orders too conservatively or incorrectly and later have to reschedule to ship within the goal, the following performance metrics suffer:

1.    Scheduling against target lead time: Ability to provide a promise date to the customer within the lead-time goal from the order entry date
2.    Scheduling touches: Backlog management activities to pull in or push out dates from the initial schedule
3.    Customer satisfaction: Customer dissatisfaction due to inability to provide an accurate initial promise date
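The first metric above can be made concrete with a small calculation - the share of orders promised within the lead-time goal, measured from the order entry date. This is an illustrative sketch; the function name, data shape and goal value are assumptions, not from the blog.

```python
# Illustrative sketch of metric 1: percentage of orders whose promise
# date falls within the lead-time goal, counted from order entry.
from datetime import date

def lead_time_attainment(orders, goal_days):
    """orders: list of (order_entry_date, promise_date) tuples.
    Returns the percentage promised within goal_days of entry."""
    within = sum(1 for entered, promised in orders
                 if (promised - entered).days <= goal_days)
    return 100.0 * within / len(orders)

orders = [
    (date(2016, 1, 4), date(2016, 1, 12)),  # 8 days: within a 10-day goal
    (date(2016, 1, 5), date(2016, 1, 20)),  # 15 days: missed
]
print(lead_time_attainment(orders, 10))  # prints 50.0
```

Tracking this alongside a count of reschedules ("touches") per order gives a simple view of the second metric as well.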



The impact on these performance metrics has a direct bearing on the business, including but not limited to:

1.    Customer dissatisfaction: Multiple touches and repeated date changes breed nervousness and erode the customer's trust
2.    Unpredictable sales revenue: As promise dates extend beyond the goal, revenue targets are impacted and backlog management activities increase
3.    Excess inventory in channels: As supply planning is based on revenue targets, excess inventory accumulates in the channels when the scheduling attainment rate is lower than anticipated
4.    Increased number of touches: As companies try to meet revenue targets by pulling in orders, the number of touches increases
5.    Increased support cost: OEMs have to pay additional charges to manufacturing partners and B2B partners for changes in assembly and shipping completion dates

There are many reasons why companies in outsourced manufacturing find it challenging to attain high promising accuracy; they can broadly be classified into three:

1.    Supply constraint: Inaccurate supply picture, B2B delays, incorrect allocations, supply lost between engineering transitions, etc.
2.    Lead-time constraint: Conservative transit lead times across geographies, incorrect application of holidays, extended lead times on components, etc.
3.    Configuration: Incorrect lead time, calendar, sourcing rule and other item-related attribute setups


 

In my next blog, I'll explain how these causes impact scheduling performance and what approach one should take to address them. Please join the "Reliable and Accurate Customer Promising" session at 4:30 PM on 1/26 at the Oracle Modern Supply Chain Experience conference in San Jose to learn how a leading high-tech complex OEM improved its scheduling performance from 55% to 80%.


http://www.oraclemsce.com/solution/supply-chain-planning  (Executive Ballroom 210 CG)

August 11, 2015

Product Data Security with Oracle Product Information Management (PIM)

In today's collaborative market, a lot of data needs to be shared across supply chain stakeholders, and new technologies provide platforms to share product data across the supply chain. Data quality and security features always play a vital role in the success of supply chain management. However, when we share data with external or internal stakeholders, data security is always a concern.

The PLM world offers some security features; in the Oracle ERP Inventory module, however, we lose them. Having worked in process/chemical manufacturing - specialty chemicals in particular - we did not want to share product information with other facilities, to avoid business risk and data security issues. It is always challenging to restrict the access of users who create master product data on the ERP side, so we had to resort to a custom approach or to different item codes, which adds complexity to the ERP solution.

Oracle's Product Information Management (PIM) module is mostly used as an MDM tool; in one of our implementations, however, we used it as an add-on E-Business Suite module. PIM has a lot of functionality, but in this blog we will talk only about its security features.

PIM provides data security features at various levels:

1.   Organization level - We can restrict the data access of groups/users based on organization.

2.   Item catalog category level - An item catalog category (ICC) is a level of grouping of items. A business can have different ICCs; in a computer business, for example, the ICCs might be Desktop, Laptop, Hard Disk, Keyboard and Mouse. Items relevant to an ICC are assigned under it, and an item can have only one ICC. PIM uses the ICC as a controlling element in most of its other functionality, so we can restrict user access by ICC - and therefore restrict access within a single organization.

3.   Item level - We can restrict access at the individual item level.

We can achieve role-based data security using the features below.

1.   Role & privilege - Privileges determine a user's data access, such as Create, Edit or View. A role is a collection of privileges. Business roles can be defined in PIM, such as Product Engineer, Design Engineer and Application Engineer, and privileges granted by role: a Product Engineer might have the item creation privilege, a Design Engineer item update privileges, and an Application Engineer only item view privileges. In this way we can restrict product access by role.

2.   Group - We can group multiple employees/persons under a PIM group, and roles and privileges can be assigned to the group. Groups can mirror business departments, e.g., Planning, Sales, Engineering. This setup is optional; a role can also be assigned directly to a person.

These security features can also be extended to external stakeholders such as suppliers and customers. By combining roles with groups or persons, we can restrict a user's access to specific functionality, such as item creation only or view access only.


In our example, a chemical manufacturing company has two product groups: 1) paint-based products and 2) oil-based products. The company manufactures in three plants, so if we have to restrict user access based on product type and location/plant, we can combine organization-level and ICC-level security to do so.
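The example can be sketched in pseudo-configuration form: privileges roll up into roles, roles are granted to groups or persons, and each grant is scoped to an organization (plant) and an ICC (product group). This is an illustrative model only, not PIM's actual data structures; all names (plants, ICCs, roles, users) are made up.

```python
# Illustrative PIM-style role-based access check (not actual PIM APIs).
# A grant ties a grantee (group or person) to a role, scoped by
# organization (plant) and item catalog category (product group).

ROLE_PRIVILEGES = {
    "product_engineer": {"create", "edit", "view"},
    "design_engineer":  {"edit", "view"},
    "viewer":           {"view"},
}

GROUP_MEMBERS = {"engineering": {"alice"}, "sales": {"bob"}}

GRANTS = [
    # (grantee, role, organization, icc)
    ("engineering", "product_engineer", "PLANT1", "PAINT"),
    ("sales",       "viewer",           "PLANT2", "OIL"),
]

def has_privilege(user, privilege, organization, icc):
    """True if any grant to the user (directly or via a group) carries a
    role with the requested privilege for this organization and ICC."""
    for grantee, role, org, cat in GRANTS:
        members = GROUP_MEMBERS.get(grantee, {grantee})
        if user in members and org == organization and cat == icc:
            if privilege in ROLE_PRIVILEGES.get(role, set()):
                return True
    return False
```

With this model, a user in the engineering group can create paint-based items in PLANT1 but has no access to oil-based items or to other plants, which is the kind of restriction described above.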


By leveraging PIM role-based data security, we can overcome supply chain data security concerns. The role and group configuration can also be used for other processes and workflows, such as change orders and change management.

 

PIM is a best-in-class master data management tool. We can integrate various systems with it and publish product data to them using its many features. Existing or prospective customers can therefore implement PIM as an additional module in E-Business Suite or as a standalone MDM tool. I think the process manufacturing industry should also consider PIM for its data management requirements. We will discuss other valuable features of PIM in the next blog.

January 14, 2015

Hyperion Data Integration Tools - A Comparison

Continue reading " Hyperion Data Integration Tools - A Comparison " »

February 5, 2014

Oracle Project Manufacturing - A complete solution for Project Manufacturing based industries (Project Procurement)

Guest Post by 
Srushti Gogate, Senior Associate Consultant, Infosys Limited

Now that we have had an introduction to Oracle Project Manufacturing (PJM) and its salient features in the earlier blog, we will look at each feature in detail in the forthcoming blogs. In this one, we shall throw some light on project-based procurement using Oracle Purchasing and iProcurement.
At a very basic level, procurement can be of two types: purchase of raw material or of services. In Oracle terminology, the destination type is either 'inventory' or 'expense', and the behavior in the two cases is completely different. Whenever an expense purchase order (referred to as PO further) is raised against a project, the Oracle Project Costing module comes into the picture. However, this module does not allow processing of inventory POs against a project as standard functionality. This is where PJM has a critical role to play.

Continue reading " Oracle Project Manufacturing - A complete solution for Project Manufacturing based industries (Project Procurement) " »

November 11, 2013

Oracle Projects and Labor costing

Guest post by Somnath Pansare, Principal Consultant, Infosys Limited

Compared to the Oracle E-Business Suite financial family of applications, the Oracle Projects family of modules is not as extensively used. Recently, many customers have realized the potential of the Oracle Projects suite and started using it for their business. A lot of previously unused functionality is now being explored, and additional requirements or tuning requests are emerging from these areas; labor costing is one of them.

Continue reading " Oracle Projects and Labor costing " »

October 25, 2013

Face of the CRM Apps (Part-2)

Guest post by
Richa Thakur, Senior Associate Consultant, Infosys

 

The CRM landscape is an uneven terrain, with various players battling it out in the global markets. It is challenging for organizations to differentiate their offerings from competitors'. Likewise, customers feel lost in the wide array of options offered to them. We present the next two trends which we feel could be the differentiators in organizational offerings.

Continue reading " Face of the CRM Apps (Part-2) " »

October 23, 2013

Leading the Customer Service Commitments (Part-1)

Guest post by
Saurabh Dwivedi, Consultant, Infosys

 

One fine morning, I got a call from the car service center reminding me that my car service was due. I was aware of it but appreciated the initiative. I asked the service agent about the availability of a dashboard glove box, which my 3-year-old champ had accidentally pulled off. The line went quiet for some time, and then the agent responded that the part was available and that I could visit for the service and get it replaced at the same time.

Expecting to get the part fixed along with the due service, I got an appointment for the weekend, travelled to another corner of the city and finally landed at the center. I was received well, but to my shock and surprise, when I asked about the part, the assigned technician told me it was out of stock. The whole customer experience came crashing down on hearing this. Since I had travelled across the city, I didn't have many options; I got the service done and returned with a bad experience.

Continue reading " Leading the Customer Service Commitments (Part-1) " »

October 22, 2013

Face of the CRM Apps (Part-1)

Guest post by
Poonam Gujral Nagra, Consultant, Infosys

 

Surpassing cut-throat competition and keeping pace with innovative CRM trends both lead to a common point: bonding with customers like never before by gathering information about their expectations for service reliability, usage, enhancement and scalability. The next generation of CRM applications will change the way enterprises do business, and organizations are trying to decode the future of these next-gen applications.

We have identified five trends in CRM apps space through which the enterprises can win more customers and effectively manage the relationship with the existing customers.

Continue reading " Face of the CRM Apps (Part-1) " »

October 17, 2013

A smart solution to safe travel: Is connected vehicle an answer? (Part-2)

Guest post by
Ashish Verma, Senior Consultant, Infosys

 

Picking up from my previous blog on "Safe Travel", where we discussed possible ways to enable safe travel and a possible solution, we now know our expectations from an intelligent mode of travel. In a nutshell, four types of data are important for safe travel.

  • Driver's Data
  • Infrastructure data
  • Vehicle data
  • Historical data

Continue reading " A smart solution to safe travel: Is connected vehicle an answer? (Part-2) " »

October 16, 2013

A smart solution to safe travel: Is connected vehicle an answer? (Part-1)

Guest post by
Ashish Verma, Senior Consultant, Infosys

 

I heard the news a few days ago about a train accident in Spain caused by high speed: the train was travelling at 190 km/h instead of 80 km/h. The first thought that hit my mind was whether there was a way to avoid it. Was there a way to signal or alert the driver that the train had exceeded the permissible speed limit? Could we have reduced the speed of the train before that deadly turn derailed it? This is just one example; every day we hear about such accidents around us. And what happens after them? An enquiry commission is appointed, which ultimately links the accident to one of the usual causes: over-speeding, signal violation, driving under intoxication, negligence, etc. But do such commissions work with the authorities to ensure that such mishaps are not repeated, or to make travel safer for all?

Continue reading " A smart solution to safe travel: Is connected vehicle an answer? (Part-1) " »

October 15, 2013

Managed Cloud Services for Siebel CRM

Guest post by
Manmeet Mehta, Technology Lead, Infosys

 

Change is the law of nature. During hot summers, when temperatures break all records, people look up at the sky hoping to see clouds that would bring relief from the heat. Organizations, likewise, are looking for respite from the heat of the servers they have to maintain - and cloud computing has come as that respite. The trend of moving from on-premise systems to the cloud is a great example of Green IT: it uses computing resources in an efficient, effective and economical way. Cloud computing usage in the enterprise is growing rapidly, all of it contributing to winning business for organizations.

Continue reading " Managed Cloud Services for Siebel CRM " »

October 10, 2013

How best to model profitability for financial institutions? (Part-3)

Guest post by
Vandana Vasudev Nayak, Consultant, Infosys

 

Financial institutions today look for an enterprise "tool" that captures cost study results and manages cost drivers and analysis without manual intervention, and that can be leveraged for advanced analytics, portfolio management and strategic planning.

In the previous part, two Oracle applications in the profitability management space - Hyperion Profitability and Cost Management (HPCM) and Oracle Financial Services Profitability Management (OFSPM) - were evaluated against business criteria. In this part, the focus is on evaluating the products against key usage criteria.

Continue reading " How best to model profitability for financial institutions? (Part-3) " »

October 7, 2013

Using File Based Loader to Extract Maximum Value

Guest post by
Prashant Sharma, Project Manager, Infosys

 

With the cloud buzz picking up pace, a lot of organizations are moving to Software-as-a-Service (SaaS) applications. ERP implementations that would on average take close to a year are now being executed in a few weeks. Data conversion, as always, is one of the key activities in these implementation projects. Earlier, technical and functional teams would spend many person-months developing data conversion programs and utilities as part of the project life cycle. This is no longer the case with Oracle Fusion SaaS implementations: File-Based Loader (FBL) is a very effective data upload tool that guides customers in their implementation journey to Oracle Fusion applications.

Continue reading " Using File Based Loader to Extract Maximum Value " »

How best to model profitability for financial institutions? (Part-2)

Guest post by
Vandana Vasudev Nayak, Consultant, Infosys

 

In continuation of our view that profitability measurement is one of the critical activities for a financial institution to survive in today's volatile environment, let's focus on the technology that enables better profitability management.

Continue reading " How best to model profitability for financial institutions? (Part-2) " »

September 25, 2013

First Financial Period Closure Post Go-Live for a large transformation ERP Program (Part-2)

Guest post by
Hemantkumar Nathu Lothe, Senior Consultant, Infosys

 

In the last blog, we saw two important points to be considered for a successful first period closure:

1. Period End Closure Mock
2. Period Closure with User-Entered Data in UAT

In this blog, we'll take a look at the other three recommendations.

Continue reading " First Financial Period Closure Post Go-Live for a large transformation ERP Program (Part-2) " »

September 13, 2013

Embedding Risk factors to build Strategies for Higher Performance of Financial Services

Guest post by
Vandana Vasudev Nayak, Consultant, Infosys

 

In today's global financial markets, financial institutions are hard-pressed to monitor and analyze performance data from multiple sources. More essentially, the challenge lies in generating a combined view of risk and performance. The outlook of financial institutions on performance evaluation is changing: both financial and non-financial aspects, such as risk, are in focus. Financial institutions are looking for a strong basis and approach that can drive performance smoothly with due consideration of the associated risk factors.

Continue reading " Embedding Risk factors to build Strategies for Higher Performance of Financial Services " »

September 12, 2013

External Transactions and Automatic Bank Reconciliation

Guest post by
Kiran A. Mathew, Senior Associate Consultant, Infosys

 

Lack of internal controls over reconciliation and cash estimation in your organization can lead to various issues, such as:

  • Lack of a holistic view of the cash position, which impairs management decisions
  • Misappropriation of cash and other fraudulent activities

Organizations resort to bringing internal controls into the cash position analysis and reconciliation process to address these issues.

Continue reading " External Transactions and Automatic Bank Reconciliation " »

September 11, 2013

Solutions to curb market volatility and support the falling currency

Guest post by
Surabhi Shah, Consultant, Infosys

 

Fluctuating market trends and the growing need for foreign borrowings have changed the face of the Indian economy significantly. Tracing the genesis of the rupee-dollar relationship, the rupee's journey has taken several turns since 2012. The year 2012-13 was a roller-coaster ride for the Indian rupee, which depreciated to an all-time low of 68.80. As a stepping stone, the RBI came up with continuous measures to tighten liquidity in the economy and support the depreciating currency. With India being a developing economy with sky-high inflation, the depreciation of the currency was quite evident. However, to curb the scenario, improvement in local macroeconomic factors is the most fundamental variable for sustaining appreciation of the Indian currency and economic growth in the medium term.

Continue reading " Solutions to curb market volatility and support the falling currency " »