Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.


August 30, 2017

Instance Planning/deployment options for migrating Oracle E-Business Suite and other packaged applications on to Oracle Cloud (IaaS and PaaS)


The advent of cloud has thrown open a wide array of options for deploying applications and databases. Oracle Cloud in particular offers so many additional options, permutations and combinations that it is easy to choose a sub-optimal one and end up with increased cost, higher maintenance and a lack of control. This blog tries to clear the air around the various options available in Oracle Cloud (IaaS and PaaS) for deploying the packaged Oracle E-Business Suite (EBS) application and database. A similar yardstick can be applied to deployment models for other Oracle packaged applications, such as JD Edwards, PeopleSoft and Siebel, which need to be deployed using IaaS/PaaS.

 

The Basics:

The basic building blocks of any packaged application (or any application, for that matter) are the application tier, the database tier (utilizing CPU, memory and storage on top of an OS) and the network connectivity within and between the tiers. The architecture below depicts a very simple deployment of Oracle EBS with a single node (server) each for the application and the database, making it a three-tier deployment. Technically it is possible to deploy the application and database tiers on a single node as well, but for representation purposes they have been kept separate. Please note that the individual tiers (DB and application) can also be spread across multiple nodes.

Fig 1

Available Options in Cloud:

To summarize, Oracle IaaS offers the options below (from https://cloud.oracle.com):

Compute Shapes (Oracle Public Cloud - OPC)

RAC for HA is not available, so these shapes are more suitable for the application tier and for DR.

- General Purpose: 1-32 OCPUs, 7.5-240 GB memory
- High Memory: 1-32 OCPUs, 15-480 GB memory
- High I/O: 1-16 OCPUs, 15-240 GB memory
- GPU: used for specialized scenarios, hence not discussed here

Dedicated Compute

- Model 500, 1000, 1500, 2000: 500-2,000 OCPUs (assume 7.5-15 GB memory available per OCPU)
- SPARC Model 300: 300 OCPUs (SPARC)

Bare Metal

More robust and performance-intensive, with the flexibility of bringing your own VM, but with a 12-month usage commitment for a pre-determined committed monthly usage amount.

- Compute - Bare Metal: 36 OCPUs, 256-512 GB memory (with High and Dense I/O storage options)
- Compute - Virtual Machine: 1-16 OCPUs, 7-240 GB memory (with Block and Dense I/O storage options)
- Database Cloud Service (DBCS, with High or Dense I/O capacity): 2-36 OCPUs, 512 GB memory (with High and Dense I/O storage options)
- DBCS with 2-node RAC: 4-72 OCPUs, two DB nodes with 512 GB memory each (with raw SSD storage)

DBaaS

Comes with and without RAC for HA. The DB software license needs to be purchased (BYOL cannot be used).

- High Performance Package (HPP): 1-32 OCPUs, 7.5-240 GB memory (General Purpose) or 15-480 GB (High Memory)
- Extreme Performance Package: same as HPP, except that it also includes the RAC, Active Data Guard and In-Memory options

Exadata on Cloud

Has all the latest Oracle DB features along with highly performant storage (BYOL cannot be used).

- Oracle Database Exadata Cloud Machine (ExaCM): 16-248 OCPUs and 720 GB memory per DB node; can be scaled up to 8 DB nodes with 336 TB of usable storage; hosted in the customer's data center
- Oracle Database Exadata Cloud Service (ExaCS): same as ExaCM, except that it is hosted in an Oracle data center

Please note that there are other options as well, such as Standard Edition, Enterprise Edition, Schema Service and the Standard Package, but these have not been covered since they are not suitable for packaged applications like EBS.

Typical requirements:

Customers typically seek the options below during the various phases/cycles of an EBS implementation, viz. DEV, PATCH, STAGE, GOLD, UAT, SIT, CRP, Pre-PROD, PROD, DR, etc.

Fig 2

How it all fits:

As can be seen from the above matrix (Fig 2), DB HA (i.e., RAC) is typically not needed until the SIT instance. This means DBCS is not needed until SIT, which in turn means that all instances prior to SIT could potentially be built "on demand" on Oracle Public Cloud (using general-purpose shapes such as the high-memory and high-I/O shapes).
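The phase-to-option logic just described can be sketched as a simple lookup. The phase names follow this post, but the exact mapping is an illustrative assumption, not a reproduction of Fig 2:

```python
# Phases that typically need DB HA (RAC); everything else (DEV, PATCH, STAGE,
# GOLD, UAT, CRP) can be built on demand on OPC compute shapes. DR is also
# kept on compute shapes, per the "Compute Shapes" note above.
HA_PHASES = {"SIT", "Pre-PROD", "PROD"}

def db_tier_option(phase: str) -> str:
    """Suggest a DB-tier deployment option for an EBS instance phase."""
    if phase in HA_PHASES:
        return "DBCS with 2-node RAC"
    return "OPC compute shape (on demand, no RAC)"

print(db_tier_option("DEV"))   # OPC compute shape (on demand, no RAC)
print(db_tier_option("PROD"))  # DBCS with 2-node RAC
```

In practice each customer would extend this with sizing and licensing inputs, but the core decision point (RAC only from SIT onwards) is what drives the matrix.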

Building on the above data and the 'typical requirements', the matrix below shows which EBS implementation phase can typically utilize which type of cloud infrastructure deployment option. For example, if an existing EBS infrastructure needs to be migrated to Oracle Cloud (IaaS + PaaS), assuming the current DB and application licenses will be moved to the cloud, the deployment matrix below can be used.

Typical deployment options for moving Oracle EBS to Oracle cloud (with already existing implementations)

In case it is a brand-new deployment (where no previous EBS implementation exists), the above matrix changes as below. Note the usage of DBCS and ExaCS, which include the DB software license as well.

Typical deployment options for implementing Oracle EBS on Oracle cloud (green field implementations)

*If HA is not a requirement (which should rarely be the case, except for very small enterprise architectures), DBCS/ExaCS may be replaced with OPC/Bare Metal as needed.

There will be cases where customers with multiple instances of Oracle EBS or other packaged applications look to migrate their entire set of instances to the cloud (the number of instances might range from 20 to a few hundred, utilizing more than 100 CPUs and more than 500 GB of memory). In such cases, a more optimal solution might be to go with options like Dedicated Compute (DC), Bare Metal (BM) and/or ExaCS, which offer homogeneity and simplicity of architecture and maintenance. Another case of large deployments is when a customer is moving their entire data center infrastructure to Oracle Cloud, with multiple instances of Oracle EBS and other applications. Such deployments would also benefit from the DC, BM or ExaCS options. The matrix below demonstrates this.

Typical deployment options for large Oracle EBS implementations on Oracle cloud

Final Words:

The deployment options above are 'typical' ones that one would encounter and can be used as a baseline from which tailored/custom deployment options can be picked. For example, Oracle also offers a private cloud at the customer's data center, which might be useful where regulations prevent customers from hosting outside their own premises/region/country.

From the above description, it is evident that unless an enterprise is completely moving its data center to the cloud (and/or has a usage requirement of more than 100 CPUs and 500 GB of memory), it makes sense to leverage features and options such as OPC/Bare Metal/DBCS/ExaCS to optimize cost and maintenance, rather than going for a big-bang approach like Dedicated Compute, Bare Metal or a cloud-enabled on-premise ExaCM.
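This sizing rule of thumb can be expressed as a small helper. The 100-CPU/500-GB cut-off is the heuristic from this post, not an Oracle-published limit:

```python
def prefer_dedicated_options(total_ocpus: int, total_memory_gb: int,
                             full_dc_migration: bool) -> bool:
    """Return True when Dedicated Compute / Bare Metal / ExaCS options are
    likely more optimal than on-demand OPC/DBCS, per the rule of thumb above."""
    # Full data-center moves benefit from homogeneous dedicated infrastructure;
    # otherwise only very large footprints justify the commitment.
    return full_dc_migration or (total_ocpus > 100 and total_memory_gb > 500)

print(prefer_dedicated_options(40, 300, False))   # False: stay with OPC/DBCS
print(prefer_dedicated_options(150, 600, False))  # True: large footprint
print(prefer_dedicated_options(10, 100, True))    # True: full DC move
```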

One caveat, though: the world of cloud is changing very fast, so it makes sense to check the Oracle website frequently (I would recommend every two weeks) for announcements of any new options from the vendor, so that the above models can be optimized further. Now, that didn't look too complex - did it?

Projects Planning in EPBCS - What you need to know before you kick start.


It's not uncommon for SMEs to drastically underestimate the Projects Planning module and start playing without the full pack! This struck me, and as work demanded, I thought I should write a diary entry on the Projects Planning module, hot from the oven as it is in EPBCS (Enterprise Planning and Budgeting Cloud Service). It may sound like I am jumping on the bandwagon, but it is no less an exercise for that. Project-based planning is everywhere!

Well, to benefit first-time readers I am bound to give a flashback of the basics, so bear with me while I swim through it - I guess I will make it before your first yawn ;)

EPBCS classifies projects into the types below:

Internal Projects: As the name suggests, these relate to a company's internal projects such as research and development, IT projects and marketing projects. This type also covers prioritization, the approval process, and stage-gating or phase-gating (a decision point on whether or not to promote the project to the next stage).

Capital Projects: The company's prime investments - heavy investments like machinery - covering build-up cost, capitalization and expensed cost, and the creation of single or multiple assets. It also envisions Opex vs. Capex derivations.

Contract Projects: For customers, typically in IT services, who bill for their services and earn revenue out of projects. If they want to plan such contractual projects, this is the place.

Below are the typical project planning process groups. They might not be fully future-ready, but they come with the promise of customization, based on customer needs, in one or all of these areas.

So the grouping goes as follows:

1. Summary: Anything to do with the information you want to provide about the project, including details like the project's start and end dates. It can also carry additional levels, such as custom dimensions for vendor and stage.

2. Expense: This piece is quite similar to the Financial Planning module. We can do driver-based modeling, high-level planning or even very detailed project planning down to labor cost, and it also supports simple direct-input planning. A cloud-only feature for internal projects lets us capture the benefits (financial and non-financial) of a particular project. I am sure this eases the whole song and dance of justification, which planners in the on-premise world might keep at bay given the manual documentation and work involved.

3. Revenue: Three basic revenue models are available - Time & Material, Cost Plus and Direct Revenue - along with an enhanced revenue recognition parameter to meet IFRS standards. Interestingly, recognized revenue can be shown based on performance obligation and completion. A whole lot of KPIs and measures come out of the box (OOTB).
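To make the three revenue models concrete, here is a minimal arithmetic sketch. The formulas are simplified assumptions for illustration, not EPBCS's exact out-of-the-box calculations:

```python
def project_revenue(model: str, *, hours: float = 0.0, bill_rate: float = 0.0,
                    cost: float = 0.0, markup_pct: float = 0.0,
                    direct_amount: float = 0.0) -> float:
    """Simplified project revenue per model (illustrative only)."""
    if model == "Time & Material":
        return hours * bill_rate              # billable hours at a rate
    if model == "Cost Plus":
        return cost * (1 + markup_pct / 100)  # planned cost plus agreed markup
    if model == "Direct Revenue":
        return direct_amount                  # directly entered revenue
    raise ValueError(f"unknown revenue model: {model}")

print(project_revenue("Time & Material", hours=160, bill_rate=50))  # 8000.0
print(project_revenue("Cost Plus", cost=10000, markup_pct=25))      # 12500.0
```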

If freshers approach the Projects Planning module in EPBCS with this background information, they will be lucky enough to find no strange language to deal with.

Now, for the on-premise Planning experts, below is a bird's-eye view of the comparison between the on-premise and cloud versions of the Projects Planning module.

Topping the list is the Benefits section - financial, non-financial and qualitative. This new feature, as it is upgraded along the roadmap, might turn out to be a real eye-catcher for the module as a whole. "Project Stages" now becomes a new dimension in the cloud, and I believe this makes stage-gating a cakewalk altogether. Next, I should mention "Incremental Enablement" in the EPBCS framework, which flows to the Projects module as well as the others. For projects especially, it should be on the must-use list, as it comes bundled not only with a list of uses and flexibility but also ease of use.

Yet another capability in the cloud is the ability to plan multiple assets for a single project. While on-premise we can plan only one asset per project, EPBCS allows planning the way real-world projects actually operate. I guess this was on the wish list of on-premise Projects Planning users and has been granted in EPBCS.

Last but not least on my list is the built-in integration with the other modules - Workforce, Capital and Financials. I initially wondered why we would accept such tight integration, leaving little room for lifetime maintenance enhancements or upgrades (well, the IT brain thinking operationally), business process changes, complete revamps, or the need to use different codes, be it job or labor. But the answer is obvious given the benefit it brings to the business: a single source of truth and a full view of the company's performance management reporting. Imagine labor information from Workforce flowing into the Projects Planning module, where you can simply pick it and assign it to projects, and the numbers you work out of the projects talk back to Financials. And what if the capital projects from the Projects module can talk to the Capital module? Bingo - you are all set!

The final icing is the feature that also lets you run Projects Planning stand-alone. Even if you have enabled the Workforce and Capital modules and are building the Projects module, you can still keep this little brother separate and let him run in his own shoes! This feature, with this degree of flexibility, goes straight to my love list of best innovations (for today!) in this module of EPBCS.

Now, finally, here is what did not make it into the first version of the cloud offering, and why.

The answer to 'why' may be the challenges of adoption, or that the features were not justifiable enough (as I heard from the horse's mouth), and/or that real-world customers were not exploiting them or found them less useful on-premise. Why sell something that people don't buy? Letting go at this stage seems like a good idea. And if you are wondering whether Oracle has provided options to set these up on a customer-need basis: of course, the product team has proved that they not only do for you but also think for you.

Ranking and scoring is out with respect to the OOTB features, rules and logic, but there is a parameter set to play with. A few other out-goers are the intercompany project charging and funding request capabilities. Another surprising OMG moment is the removal of the Primavera integration. Why? Your guess is as good as mine, and I will leave it to the future upgrades planned in the roadmap for the Projects Planning module in EPBCS to answer: why no cloud integration to P6?

I am heading to hit the hay now, and I will return with another write-up detailing everything you need to get equipped for a Projects Planning implementation in EPBCS, as I will be getting my hands dirty in the coming weeks! Can't wait - lucky me!

Adios!

Emerging Trends in SSHR - Part 1

Self-service applications have been around for a long time in HCM products. The latest trend in the market is to maximize the number of transactions that can be enabled via self-service and mobile. Here, we aim to analyze:

1. Where Self-Service stands today

2. HR without Self-Service

3. Key Features of Self-Service

4. Latest trends in Self Service technology

5. Key Benefits with the latest edition tool

6. Key Players in Self-Service

Introduction (Self Service Today)

As consumers have expanded their use of desktops, laptops, smartphones and tablets at home, they have come to expect the same consistent self-service access to corporate services from wherever they are, on whatever device they are using. Be it a laptop, mobile, tablet or any other device, employees are looking for the flexibility to process their requests, view their information, and so on, in their own way.

There are a number of mobile, physical and virtual technologies required to support user productivity, and IT currently struggles to adopt, manage and secure these technologies without increasing operational costs or administrative complexity. To address this, IT has already started preparing for a fundamental shift from static, infrastructure-based IT to a user-centric model that orchestrates services around users and their requirements at any point in time. One key example is the web-based self-service application system, which gives employees the flexibility to access applications irrespective of where they are and what device they are using.

A self-service application integrated with the Oracle E-Business Suite HRMS system is a standardized and simplified platform that gives a uniform flow of information throughout the enterprise.

HR Without Self Service Application

In the absence of self-service, when the HR department in an organization is asked for some employee information or a report, it can only react: it has its own SLAs and delivers to those timelines. Also, only a few managers or executives will have access to the valuable information that may be required. However, this real challenge is addressed by the Oracle SSHR application. Information is actively pushed to the manager, with clear indications that immediate attention is required if thresholds are breached. This information is in turn available to the employees as needed.

Moreover, it has often been observed that employee information largely stays bottled up in the HRMS system and with HR executives. The Oracle Self-Service application has uncorked this HRMS bottle, giving employees and managers the flexibility to let information flow freely.

 

In the absence of a self-service application system, all professional HR users work mainly as keepers of data, manually entering and updating employees' data from forms and menus in the system. The emerging trend has prompted the development of self-service applications that help them become administrators ensuring data integrity. This development should happen across the organization, enabling employees within their individual realms of expertise with systems that are integrated across the firm. With managers and employees empowered to update and maintain their own information, HR professionals are now moving from being transaction processors to being consultative partners.

Key Features of HR Self Service tool


1. Data Security and Integrity

Making sure that sensitive employee data can be accessed only by those who are authorized is always a priority. However, regulating employee data access is a constantly changing and often complex procedure. In most cases, whom a user can see is based on the latest status of the employees: whom they report to, which organization they reside in, what type of job they hold, or other dynamic information.

 

Using the Oracle HRMS Self-Service application integrated with the core HRMS system has definitely eased this process: we have complete information about employees' basic details and assignment details (including organization, salary, payroll, job, grade, position, etc.), so the information flows more easily. In a standard system, we can control which functions employees can access through the Self-Service application by defining menus and functions. This can also be controlled at the level of whether employees should have view-only access, editable access or delete access. In a typical setup, an enterprise gives employees view-only access to certain details such as disciplinary details, editable access to the manager, and view, update and delete access to HR.

Within an enterprise, it is a very common situation to have a person in an administrator role who needs to access certain functions on behalf of employees, mostly in cases where the employee is unavailable for some reason. The administrator also needs access restricted to employees belonging to a particular subset of a large enterprise group; for example, an administrator should access only the employees of a particular division, business unit, store, etc. Using the Self-Service application integrated with the core Oracle HRMS system, we can define Security Profiles and easily meet these types of requirements, giving enterprises a secure way of controlling data access.
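To make the view/update/delete tiering concrete, here is a hypothetical sketch of such an access check. The role names, function name and the assumption that each level implies the ones below it are illustrative, not Oracle-seeded configuration:

```python
# Hypothetical access matrix: the highest level granted per role and function.
# (In Oracle HRMS this is driven by menus/functions and security profiles,
# which also restrict WHICH employees a user can see; omitted here.)
ACCESS = {
    "EMPLOYEE":     {"Disciplinary Details": "view"},
    "LINE_MANAGER": {"Disciplinary Details": "update"},
    "HR":           {"Disciplinary Details": "delete"},
}
LEVELS = ["view", "update", "delete"]  # each level implies the earlier ones

def can(role: str, function: str, action: str) -> bool:
    """True when the role's granted level covers the requested action."""
    granted = ACCESS.get(role, {}).get(function)
    if granted is None:
        return False
    return LEVELS.index(action) <= LEVELS.index(granted)

print(can("EMPLOYEE", "Disciplinary Details", "update"))  # False
print(can("HR", "Disciplinary Details", "update"))        # True
```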

 

2. Ease in Information Access

When using the traditional Oracle Apps HRMS system, only HR professionals were able to access all information, whereas employees and other business users struggled to view any required information. For example, if a female employee has recently married and would like to update her marital status and last name in the system, she has to approach the HR professionals. Updating marital status matters, as it impacts the benefits the employee is currently enrolled in. A self-service application gives employees enough flexibility to update their personal information: viewing and updating basic details (first name, middle name, last name, date of birth, marital status, nationality, etc.), updating addresses (local address, main address, etc.), updating phone numbers (office, home, mobile, pager, etc.), maintaining dependents' details along with relationships, and updating emergency contacts. Employees thus control their own data, which is in turn reflected in the Core HR system used by HR users. From a security perspective, access is always controlled so that employees can see only their own data through the Self-Service application, while still having the flexibility to keep their information up to date without HR intervention.


3. Managing Multiple Requests

Multiple types of request access are given to employees and managers through the Oracle Self-Service application. Employees can easily initiate absence requests, training requests, etc., without any HR involvement; these can be configured for any kind of approval (such as line manager approval) and are then automatically updated in the HRMS system. On the other side, managers can easily initiate the transfer or separation of an employee, irrespective of where they are and what device they are using.

 

As per the latest market trend, some domain experts prefer employee-related expenses to be paid through Payroll rather than through the Finance system. The reason: to pay an employee from the Finance system in Oracle EBS, we need to define all employees as suppliers; however, retail-domain industries have a different perception of suppliers and want employees to be distinguished completely from them. For example, employee expenses like business expenses, company mobile/BlackBerry reimbursements, HRA loans and salary advances are components related to employees and should be paid via Payroll. To accomplish this requirement, we can configure these components in Self-Service using the SIT/EIT feature of HRMS, with access provided to restricted users who can fill in the information, which is then processed in Payroll and finally paid to the employee.

 

In the next blog, we will discuss in detail the latest trends in self service technology and how it will be beneficial to the business.

August 29, 2017

Move beyond Relativity: This Einstein is here for Data Analytics

The important thing is not to stop questioning. Curiosity has its own reason for existing.

-       Albert Einstein

If you started reading this blog thinking it is another take on relativity, you might be disappointed. This space is not for physicists but for all those analytics consultants who wished intelligence could be added to the business intelligence tools already available. This blog is for all those report consumers and architects who imagined powering enterprise-wide reporting with predictive analytics and artificial intelligence. This read is for all those developers who wished they could create reports supported on multiple devices without needing to write code for each of them. This study is for those evangelists who visualized data generation, data consumption, insights and actions, all through the same interface.


It has been almost a decade since my marriage with business intelligence reporting. Clicking and dropping dimensions and facts into a report was jazzy and cool. That was a honeymoon period, when click-and-drop was an eye-popping change compared to writing SQL for each data extraction use case. The clock kept ticking; companies kept introducing newer reporting tools with 'incremental' changes; flexibility and scalability kept pouring in; performance improved - but there was a limit to it. Soon I started finding my partner a little predictable. We all loved what current-age reporting applications provided, but there was still a wide-open gap in the reporting application landscape: rich visualizations, the lowest TTD (time to deploy), communicating to collaborate, multiple hardware support, cloud integration, all boosted by predictive algorithms and a robust security framework.

 

But recently, Salesforce made a 'transformational' change when it introduced Einstein Analytics (EA), which has ALL of these features. I have worked on other tools that had one or a few of the above features, but having them all in one place, coupled with intelligence and ease of use, is a dream come true for any data scientist - mine included. And thus, in this blog, we will explore why Einstein Analytics holds so much promise for the future; I will keep this space updated with further findings as I explore my new partner more.

 

As I write this, EA is constantly evolving and adding new features with every release (three releases a year). Let me cover some of the most prominent features in this first blog; my subsequent blogs will explore EA in greater detail.

Bundling Technology and Features: Sales Cloud Einstein uses predictive lead scoring, which can help a sales rep target the leads with the maximum probability of closing. Using opportunity insights, the rep can explore customer sentiment, competitor involvement and overall prospect engagement, and thus gains the ability to pick the biggest winners from the opportunity list.

Did you spot a pattern here? Analytics here is not just offering you a biggest-opportunity report, a lead generation report or total pipeline value; all these metrics converge into a single ecosystem where meaningful actions can be taken, based on rapid, easy-to-make inferences from huge datasets.
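As a toy illustration of what predictive lead scoring means mechanically, here is a weighted score squashed through a logistic function. The feature names and weights are invented for this example; Einstein's actual models are learned from CRM data and are not exposed in this form:

```python
import math

# Invented feature weights for illustration only (not Einstein's model).
WEIGHTS = {"email_opened": 1.2, "website_visits": 0.4,
           "title_is_decision_maker": 1.5}
BIAS = -2.0

def lead_score(features: dict) -> float:
    """Return a 0-100 score: higher means more likely to convert."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return round(100 / (1 + math.exp(-z)), 1)  # logistic squashing

hot = lead_score({"email_opened": 1, "website_visits": 5,
                  "title_is_decision_maker": 1})
cold = lead_score({"email_opened": 0, "website_visits": 0,
                   "title_is_decision_maker": 0})
print(hot, cold)  # the engaged decision-maker scores higher
```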

 

Prebuilt Out-of-the-Box Analytics Apps: Salesforce also offers prebuilt analytics apps, either as part of the platform or under separate licenses. Prominent among these are Einstein Sales Analytics, Einstein Service Analytics and Einstein B2B Marketing Analytics. Service Analytics helps service agents improve customer satisfaction further. Einstein Sales Analytics helps sales teams manage pipelines, forecasts and their team's performance. Einstein B2B Marketing Analytics enables marketers to engage on the basis of insights and boost their campaigns with better suggested actions. Many users might use the same application, but their needs differ: a CEO might want to see the overall profitability of the company, whereas a regional sales rep might want to see the sales figures of a local territory. EA has therefore designed role-specific KPIs that give users a better way to analyze the data, derive insights and take immediate action accordingly.

 

Awaken the Data Scientist in You - Actionable AI and Einstein Discovery: As I mentioned before, we don't need a reporting tool that spews out only numbers after digesting numbers. Today's analytics demand that the application guide users on which 'insights' to see, which 'predictions' to make and which 'actions' to take. Using multiple statistical analyses, Einstein Discovery infers trends from huge datasets, establishes patterns and, most importantly, presents them in easy-to-understand language, not just with numbers in equally confusing tables. Taking a step further, it also recommends actions and generates a slide presentation with rich visualizations and key talking points. (Feels too good to be true, doesn't it?) What if you don't trust Einstein's recommendations? Yes, you can be among those select few. Well, EA doesn't stop here: it allows developers to see the algorithms and models for a deeper understanding of how the predictions and recommendations were derived.

 

Stop Reinventing the Wheel: This is one more thing that is going to entice developers, consultants and clients alike. I have been associated with multiple financial services clients in my career, and while designing reports for many of them, I realized that 40-50% of the reports were the same. But every time, I had to create the same thing from ground zero. With Salesforce AppExchange, consultants can now publish their new Einstein Analytics apps, and a new developer can simply download an app, customize it and deploy it. Clients will also prefer this when they see that the total time to deploy (TTD) has reduced significantly, thereby saving $$.

 

Data Integrations: If you have managed to reach this point in the blog and are still wondering - the features are good, but what about source data? Einstein Analytics allows reporting on multiple data sources in addition to Salesforce data. Using ETL connectors such as Informatica, MuleSoft, Jitterbit, Boomi and SnapLogic, one can import data from various sources, and EA can then create reports on the multiple datasets built from them.

 

Albert Einstein might have changed the world at the beginning of the 20th century, but it will be Einstein Analytics that redefines the analytics world. And the best part: we had only one Einstein for the whole world then, but now each of us can have our own Einstein Analytics for less than $75.

All in all, the analytics world looks more promising than ever, and we at Infosys will take this analytics revolution higher. As our consultants delve deeper into Einstein Analytics, creating apps and accelerators and writing success stories, we will keep our partners updated on how everyone can become a data scientist. Do comment with your analytics stories and queries, and watch this space for further insights, discoveries, implementation roadmaps and trends. Until my next blog - happy exploring!

August 27, 2017

Modeling Centralized Procurement in Oracle Cloud Applications


The benefits of centralizing procurement activity in a large or diverse organization are well known: it helps achieve significant cost reductions and policy compliance. In this article, we will focus on understanding the key concepts for modeling centralized procurement in Oracle Cloud applications.

In Oracle Cloud, a business unit does not need to perform all business functions. The business unit model allows you to assign only the relevant business functions to a business unit, while other functions can be outsourced to a different business unit. Examples of business functions are Materials Management, Requisitioning, Procurement, Payables Invoicing, etc. Note that not all functions can be outsourced; Procurement is one that can be. A business unit that has the Procurement business function assigned is called a procurement business unit.

Many of us have the impression that procurement business units create purchase orders in the same way that requisitioning business units create requisitions. Well, this is only partly true and does not reflect the true essence of this model. In Cloud Procurement, purchase orders are created in the context of legal entities. It is true that a purchase order is managed by the procurement business unit, but it is issued on behalf of a particular legal entity (typically derived from the requisitioning business unit). Suppliers see the name of the legal entity as the buying organization, not the name of the procurement business unit; in fact, the procurement business unit's name is not even included in the printed purchase order PDF. Secondly, all of a purchase order's accounting impact is absorbed by the ledger of the requisitioning business unit. The procurement business unit does not even need a ledger setup.

Confused? You might be thinking: what is a procurement business unit, and what does it do? Well, we need to understand the idea of the Procurement business function the way Oracle has conceptualized it. Let me explain. In Oracle Cloud, the Procurement business function encompasses the following activities:

Supplier relationship management: A supplier is a global entity, but supplier sites are created in the context of procurement business units. A single supplier site can be leveraged by different business units through site assignments. Also note that Approved Supplier List (ASL) entries are created and maintained at the procurement business unit level.

Supplier qualification management: Initiatives, assessments, and questionnaires are created at the procurement business unit level.

Sourcing: All negotiations are created in the context of a procurement business unit, and a single negotiation can carry item requirements from multiple requisitioning business units.

Pricing and policy determination: Blanket and contract purchase agreements, which define your purchase prices and terms and conditions, are created in the context of procurement business units. These agreements are rolled out to requisitioning business units through business unit access control.

Procurement catalog management: Catalogs, smart forms, and punch-outs are all set up at the procurement business unit level and rolled out to requisitioning business units or to specific users through content zones.

Purchase order management: Buyers that manage purchase orders are set up as procurement agents at the procurement business unit level. The purchase order header carries an attribute for the procurement business unit; however, ownership and liability lie with the legal entity.

None of these activities has any accounting impact, and all of them can be centralized, meaning that a central department can offer them as services to the different business units of the organization. Moreover, these are the strategic activities on which organizations want their buyers to focus, rather than routine activities such as following up with suppliers on delayed shipments and invoice mismatches.

For centralizing procurement, it is advisable to set up a separate (or dedicated) business unit and assign it only the procurement business function, and then create service provider relationships with the requisitioning business units that will utilize the services of the centralized procurement business unit. It is a plug-and-play setup: when a new business unit gets configured (for requisitioning), we just need to set up the service provider relationship and then roll out specific agreements or other catalog content as needed.

Before closing this article, I would like to emphasize that a business unit which has only the procurement business function does not need any accounting setup (chart of accounts, ledger, etc.). This means that customers who are implementing only Sourcing or Supplier Qualification do not need to do any accounting setup at all.

I hope this article proves useful in understanding the Oracle Cloud concept of centralized procurement. Thanks!


August 24, 2017

Compacting ASO outline using ESSCMDQ utility

Purpose

The purpose of this blog is to explain how to improve the performance of an Essbase ASO cube using the compaction technique.

Compacting the outline is essentially defragmenting it. If you experience performance slow-downs in an Essbase database, the cause may be excessive fragmentation, so defragmentation is required for an ASO cube even though it does not contain any blocks. ASO fragmentation occurs when members or dimensions are added or deleted; any metadata change fragments the cube, and the fragmentation rate depends on how frequently the dimensions are rebuilt.

The next question is: how large does the outline grow, and what happens if we leave it as it is?

The outline typically grows to around 2 GB or more, depending on how frequently metadata loads are performed. If left unattended, the application becomes slower, the outline takes much longer to open, and eventually the outline may not open at all.

This blog provides step by step details to automate the de-fragmentation technique.

How to do the Compaction using ESSCMDQ utility?

Oracle provides a free utility, ESSCMDQ (ESSCMDQ.cmd on Windows), to compact ASO outlines and return them to their original size.

The utility is similar to ESSCMD but additionally contains ASO outline compaction functions.

Steps involved

Below are details of the scripts and files used in the compaction process.


Batch Script:

The compaction process is driven by a batch script. The batch script references the compact-outline script file and the compact-outline log file.

Below is the code written in batch script:

esscmdq "location of compact outline script file"  "location of log file created"


  i. Essq file:

The Essq file contains the server details and login credentials, as well as the application and database to be compacted. The following commands are written in the Essq file to perform the compaction:

Login "servername" "username" "password" ;
Select "appname" "dbname" ;
Openotl "2" 1 "appname" "dbname" "dbname" "y" "y" 1 ;
Writeotl 1 "2" 1 "appname" "dbname" "dbname" ;
Restructotl 1 ;
CloseOtl 1 ;
Unlockobj 1 "appname" "dbname" "dbname" ;
LogOut ;

Descriptions of the commands used above are available at:

http://www.oracle.com/technetwork/middleware/bi-foundation/aso-compact-outline-133544.pdf
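When several cubes need to be compacted, the Essq script and batch invocation above can be generated programmatically. Below is a hypothetical Python sketch of such a wrapper; the server name, credentials, application/database names and paths are placeholders, not values from this post.

```python
import subprocess
from pathlib import Path

def build_essq_script(server, user, password, app, db):
    """Build the ESSCMDQ compaction script (same command sequence as above)."""
    return "\n".join([
        f'Login "{server}" "{user}" "{password}" ;',
        f'Select "{app}" "{db}" ;',
        f'Openotl "2" 1 "{app}" "{db}" "{db}" "y" "y" 1 ;',
        f'Writeotl 1 "2" 1 "{app}" "{db}" "{db}" ;',
        "Restructotl 1 ;",
        "CloseOtl 1 ;",
        f'Unlockobj 1 "{app}" "{db}" "{db}" ;',
        "LogOut ;",
    ])

def run_compaction(esscmdq_path, script_file, log_file, **conn):
    """Write the Essq script to disk and invoke ESSCMDQ, mirroring the
    one-line batch wrapper: esscmdq <script file> <log file>."""
    Path(script_file).write_text(build_essq_script(**conn))
    return subprocess.run([esscmdq_path, script_file, log_file]).returncode
```

The compaction itself still runs entirely inside ESSCMDQ; the Python wrapper only templates the script for each application/database pair.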


ii. Log File:

A log file is created containing the completion/failure status of the executed compaction commands.

Steps to execute the compaction script:

Step 1: The Essbase agent must be started before you run the batch script.

Step 2: Run the batch script to call the compaction utility and the related scripts.

      

Step 3: After successful completion of the script, the size of the ASO outline is reduced, in our case from more than 2 GB to 144 MB.

Before Compaction

After Compaction


Pros and Cons

EAS and MaxL approach:

  1. Does not do a full defragmentation of the outline.
  2. Compacts the outline for updated metadata, but the outline size grows significantly again after some time.

ESSCMDQ approach:

  1. Fully defragments the outline.
  2. Compacts the outline for updated metadata, and it takes significantly longer for the outline size to grow again.

Note that you shouldn't defragment the outline too often, as doing so may corrupt the outline.

Conclusion:

There are various methods to compact/defragment ASO cubes, but most do not reduce the size significantly, and the cube grows again quickly. The ESSCMDQ utility, by contrast, fully defragments the outline: in the example above, a 2 GB outline was reduced to just 144 MB.


EPM system and role of EPM consultant

ERP and EPM are buzzwords in the IT industry, and many people in IT who do not work on product suites like Oracle or SAP are keen to hear more about them whenever they get a chance.

As part of the Oracle EPM community, when we are asked which technology (Java, mainframe, .NET, etc.) we are from and we reply that we are Oracle EPM consultants, the next question is: what is EPM, and what type of consultant are you? Are you involved in coding, are you part of a strategy/consulting team, or is it purely IT work?

To answer these queries, we first need to understand EPM and EPM systems, and then the roles and responsibilities of EPM consultants.

EPM stands for Enterprise Performance Management. Companies implement it as part of their IT information systems to help the organization perform better. It also helps management define and achieve organizational goals such as mission, vision, and execution of defined objectives, and supports their quantitative analysis. The main benefits of EPM tools are:

  1. Improved financial planning, budgeting and forecasting, with less manual Excel work and hence reduced risk from manual intervention.

  2. Reduced time and effort, and hence better resource utilization and profitability.

  3. Faster and more reliable deliverables.

An EPM system is the part of an organization's information technology landscape that helps manage business cycles by aligning business goals and long-term objectives with detailed, up-to-date planning, budgeting and forecasting data. It also improves reporting and analysis processes to track the performance of various departments, and shortens business close cycles by enhancing methodologies, standardizing processes and related artifacts, and automating some close-cycle activities.

An ERP (Enterprise Resource Planning) system helps an organization streamline, standardize and automate high-volume transactional business data; it is not, by itself, geared towards managing business performance. An EPM system uses the ERP system as its base to present the data relevant to higher management, so we can say that an EPM system is implemented on top of an ERP system. Examples of ERP systems are PeopleSoft, Oracle Financials, and SAP ERP; examples of EPM systems are the products of the Oracle Hyperion suite, IBM Cognos, etc.

Role of EPM Consultant:

An EPM consultant's work starts with studying the existing financial planning and consolidation systems and gathering their pain points. The consultant then analyses the as-is systems, performs a gap analysis, and suggests the EPM system best suited for the organization. Consultants also customize the EPM product based on the requirements and on the organization's existing systems.

During the requirement gathering stage, they consult the relevant department heads and business users to understand the hierarchical structures (metadata) and the formats needed to convert transactional data into input for the EPM system. After requirement gathering, they design the to-be system in consultation with the architect. They also stay involved during the coding/development stage so that developers understand the requirements correctly. Once the system is developed, it is deployed into the development/quality environment. Since the consultant has the best understanding of the actual requirements, it is preferable that the consultant performs the functional testing.


The consultant's responsibilities across these stages can be summarized as:

  • Analyse the as-is systems, do the gap analysis, and suggest the EPM system best suited for the organization.
  • Gather functional requirements with respect to the artifacts of the Hyperion system.
  • Produce the functional design of all artifacts needed for the Hyperion system, such as mapping and integration between the various Hyperion systems, and design of metadata, data, rules, web forms, etc.
  • Work with the architect to design the system.
  • Stay involved in the coding/development stage so that developers understand the requirements correctly.
  • Perform functional testing, since the consultant has the best understanding of the requirements gathered.

Once all the above stages are complete, we move to go-live, migrating the application to the production environment with the support of the development and testing teams. Post go-live, the EPM consultant's role ends as the technical team takes the lead and supports the system through the warranty period and beyond.


Concept of Shared Members in HFM

We all know that members in HFM can be shared across different hierarchies, but they can also be shared amongst parents within the same hierarchy. This is used to accommodate organizational changes across different periods/months. Here is an example where an entity has to be shared amongst parents sitting under the same hierarchy.

A member EN_Z is originally present under EN_A in one hierarchy. It needs to be made present under two parents, i.e., EN_A and EN_B.

Application Settings

  • DefaultValueForActive: Set it to 1. This is used for activating the ICP entities to set % consolidation towards the parent.
  • OrgByPeriodApplication: Set it to Y. This specifies whether new consolidation structures can co-exist with past consolidation structures.
  • ConsolidationRule: Set it to Y.
CSM-1.1.PNG

Consolidation Methods: Consolidation methods must be assigned to the ICP entities in order to set a value in the % consolidation parameter (PCON) under ownership management. The following parameters were set.

CSM-2.2.PNG

Changes in Entity Attribute Section: The IsICP attribute flag is to be set to "N" for the entity chosen to be shared amongst parents under the same hierarchy. An entity flagged IsICP appears as a child under [ICP Entities].

CSM-3.2.PNG

Changes in Entity Hierarchy Section: Make changes in the entity hierarchy section of the metadata file so that the child entity appears under both parents.

CSM-4.1.PNG

Load Metadata: Load the metadata changes with the "Merge" option.

CSM-5.png

Rules: Load blank rules into the application. This is required because rules in the application do not allow data input at the POV for system-defined members (the POV used in the coming sections of this document). With blank rules loaded, you can then:

  • Assign a method to the ICP entity and make it active.
  • Assign a consolidation method to the entity.

Assign Method to shared entity: Select the following POV in the grid, with ConsolMethod under the Custom1 dimension. C1 should be placed on the column and the shared entity under [ICP Entities] on the row. Enter 1 to assign a method.

POV :

  • Scenario: Actual,
  • Year : 2017,
  • Period: P01,
  • View: YTD,
  • Entity: Select the parent (either EN_A or EN_B, whichever is to be made active),
  • Value : [None],
  • Account : System defined account [Method],
  • Product : [None],
  • Interco : [None],
  • DataType : [None]
CSM-6.png
CSM-7.png
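For teams that script their HFM data loads, the POV above can also be captured as a data structure. The following is a purely hypothetical Python illustration: the member names are taken from the example, but the `METHOD_ASSIGNMENT_POV` structure, the `to_pov_string` helper and the semicolon rendering are my own assumptions, not an HFM API.

```python
# Hypothetical representation of the POV used to assign a consolidation
# method to the shared entity (member names from the example above).
METHOD_ASSIGNMENT_POV = {
    "Scenario": "Actual",
    "Year": "2017",
    "Period": "P01",
    "View": "YTD",
    "Entity": "EN_B",          # parent to be made active
    "Value": "[None]",
    "Account": "[Method]",     # system-defined account
    "Product": "[None]",
    "Interco": "[None]",
    "DataType": "[None]",
}

def to_pov_string(pov):
    """Render the POV as a semicolon-separated string in a fixed
    dimension order (illustrative format only)."""
    order = ["Scenario", "Year", "Period", "View", "Entity", "Value",
             "Account", "Product", "Interco", "DataType"]
    return ";".join(pov[d] for d in order)
```

Holding the intersection in one place like this makes it easy to repeat the assignment for each period or parent without re-keying the grid.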

Making entity Active for a particular period: Select the POV below in the grid, with [None] under the Custom1 dimension. C1 should be placed on the column and the shared entity under [ICP Entities] on the row. Enter 1 to activate the entity.

 POV :

  • Scenario: Actual,
  • Year : 2017,
  • Period: P01,
  • View: YTD,
  • Entity: Select the parent (either EN_A or EN_B, whichever is to be made active),
  • Value : [None],
  • Account : System defined account [Active],
  • Product : [None],
  • Interco : [None],
  • DataType : [None]
CSM-8.png
CSM-9.png

Ownership Management: Select the POV as year, period, Actual, and the parent entity of the shared member (here, EN_B). The settings made above automatically appear for the shared entity (here, Active as Yes and Method as "Proportional").

 

CSM-10.png

Set PCON to 50. PMIN is automatically populated/calculated. Click Save.


CSM-10.1.png

Load Rules: Load the original rules back into the application.

Load Data: Load data to the entity EN_Z under EN_B.

CSM-11.2.PNG

Run Consolidations: Run consolidations and refresh to see the data at the parent (EN_B).

CSM-12.3.PNG







August 23, 2017

BCMS PHASE 1

Background:

Businesses evolve in rapidly changing environments, often driven by the pace of technological advancement, new regulations, increased competition and demanding customers. These drivers have fundamentally shaped organizations' emphasis on objectives based on time, quality and compliance. Some of these could present opportunities to organizations, whilst others could seriously damage their performance if they are inadequately managed.

Business continuity management system (BCMS) is a proactive approach that can maximize business opportunities. It enables organizations to optimize the continuity of operations, thereby safeguarding their corporate performance. It is a versatile discipline that encapsulates the multidisciplinary characteristics of management and technical subjects. The discipline is about the management of threats and their impacts on critical operations. Predominantly, it improves the organization's capacity to withstand the impact of an incident that may otherwise jeopardize its ability to achieve its objectives.

 

What is business continuity management?

The most widely accepted definition of BCM (business continuity management) is: a holistic management process that identifies potential threats to an organization and the impacts to daily business operations those threats, if realized, might cause, and which provides a framework for building organizational resilience with the capability of an effective response that safeguards the interests of its key stakeholders, reputation and brand.

 

This forms the official definition used by ISO 22301, the Disaster Recovery Institute (DRI) International and the Business Continuity Institute (BCI). It was developed by leading experts in the BCM industry and reflects how the discipline actually works in organizations.

One distinctive characteristic of BCM is that it adopts a wide range of methodologies from other branches of management, notably risk, strategy, finance and project management. This makes it an all-encompassing management approach for establishing a corporate capability to safeguard the organization's high-value assets.

This management discipline is broadly made up of two interrelating activities: analytical and planning. The analytical activity is an in-depth examination into the corporate functions, operations and business drivers that contribute to the organization's business performance. It is supported by a series of methodologies that assess threats and their impacts to critical operations. On the other hand, the planning activity develops the organization's business continuity capability in response to an incident. It comprises key processes with defined outputs that address the business continuity requirements identified in the analytical activity.

 

Why BCMS:

In response to expanding threats and international terrorism against critical national and international infrastructure, the UK Government introduced the Civil Contingencies Act 2004, superseding the Civil Defence Act 1948, which had been the legislation covering civil protection in the UK. Scotland introduced the Contingency Planning Regulations 2005 and published "Preparing Scotland", a guide to preparedness for public and private sector organizations.

It is now a legal requirement for officials to recommend BCM and for organizations to implement a business continuity management system at the right time and in the right place.

Creating a BCMS plan has been made mandatory by the Business Continuity Governance Board, chaired by the Secretary of the University Court. The purpose is to address business continuity, establish a business continuity management system (BCMS) in the organization, and engage a dedicated BCMS officer to design, plan, develop, implement, test and manage the entire system. The aim is to implement the BCMS as a discipline within the University of Glasgow, combining the risk registers maintained by research institutes, the University, colleges and schools.

The BCMS is implemented across the University estate and provides a hands-on framework that permits delivery of the University's important activities, even in times of chaos.

The Business Continuity Management Plan (BCMP) is a comprehensive set of steps to be taken before, during, and after a disaster. The plan outlines a set of guidelines to ensure the continuity of business operations and the availability of critical resources in the event of a disaster. The development-centre-specific BCMS plan document outlines the risks, mitigation plans and recovery strategies applicable to all projects executed from those centres.

Scope:

Locations

Location-1

Location-2

Location-3

Location-4

Location-5

 

Objectives:

* To be aware of possible situations/risks that could endanger the business.

* To suggest ways to prevent destruction/damage and protect the business.

* To ensure uninterrupted services to the clients.

* In case of disaster, to ensure resumption of critical service delivery within the recovery timelines.

* In case of disaster, to limit the extent and impact of damage as much as possible.

* To ensure currency, correctness and records of the BCM plan/events for projects.

* To suggest roles and responsibilities for various stakeholders in the disaster planning and recovery process.

 

Overall Infosys BCMS team structure

BCMS Head
  BCMS Anchor
    BCMS Backup Anchor
      BCMS Team Anchor (Location)
        BCMS Team Member

Escalation chain (roles): DM << SPM/PM << PM/TL

Location team anchors:

Location-1 (EX: HYDSEZ): Name << Name1 << Name2
Location-2 (EX: CHENNAI): Name << Name3
Location-3 (EX: BANGALORE): Name << Name4, Name5
Location-4 (EX: PUNE): Name << Name6
Location-5, onsite (EX: USA): Name << Name7, Name8

Assumptions, the responsibility matrix, business impact analysis and risk assessment are prepared according to the SOW (Statement of Work) and MSA (Master Services Agreement) of the project.

 

Continued in Phase 2

 

 

 

 

 

 

 

August 22, 2017

HFM solution design: What can be an ideal design and what needs to be considered

 

A lot is spoken about cloud solutions, and there are numerous debates about the pros and cons of cloud vs. on-premises implementations. Much of the focus is on security, maintenance cost, performance and many more issues. Irrespective of which way the decision goes, the functional solution, or, if I may say, the problem addressed, remains the same: efficient handling of the consolidation and reporting cycle. So in this discussion let us focus on the functional solution design.

What makes an ideal solution design depends on the client's outlook and requirements. For convenience of discussion, I will divide solution designs into three categories:

  • Light N Easy,

  • Comprehensive but Complex N Heavy,

  • Something in between.

Light N Easy: A Light N Easy design may have all or most of the characteristics below:

  1. Translation is done in the source system, so HFM hosts a single-currency application.

  2. Translation, if done in HFM at all, is restricted to spot and average rate translation.

  3. Eliminations are done externally or using the journals (JV) functionality available in HFM, instead of the HFM elimination engine.

  4. Simple data aggregation and reporting using the HFM consolidation engine, with no other consolidation requirements, standard or non-standard.

  5. Simple business rules, such as posting of profit to the balance sheet, roll-forward of balances, simple rounding, etc.

  6. Simple TB upload using FDMEE

  7. Input additional data using data forms/Smart view

  8. Reporting using FR reports/Smart view.

A light and easy design will be more appropriate for clients:

  • Wanting a quick rollout and an early payback period.

  • Having a very complex current system and needing to sum up the position before committing to a bigger solution.

  • Wanting to test the waters before diving deep and getting the organization ready for the big change.

  • Having an effective transaction system that can handle some functionalities outside the consolidation process. Having said that:

    • This may not be the case every time for this decision.

    • A question may arise: if the existing system is efficient, why not handle consolidation in that system instead of having a separate one? (The argument in favor of HFM is that it is a much more robust and specialized consolidation tool, but that is not the topic under discussion here.)

Comprehensive but Complex N Heavy: A comprehensive design will use most of the functionalities HFM is capable of handling (and my experience says that, with smart design, HFM can handle a lot of consolidation and reporting functionalities, standard or non-standard).

So a comprehensive solution design can handle all or most of the following functionalities:

  1. Translation & eliminations.

  2. Ownership data management and use of relationship between holding and subsidiaries to have multiple methods of consolidation.

  3. Multiple reporting hierarchies, e.g. reporting, management, multiple accounting standards.

  4. Consolidation requirements like Minority Interest, UPOS, Goodwill calculations.

  5. Foreign currency translation reserve (FCTR) calculations.

  6. Progressive rounding.

  7. Local and Global COA.

  8. Complex reports like Cash flow statement (Indirect/Direct method).

  9. FDMEE for TB and additional data load

  10. Use of Financial Close Management (FCM) for the close process.

  11. Use of Disclosure Management (DM) for reporting and printing.

A comprehensive design will be more suitable for clients:

  • Having very stringent reporting/compliance requirements and who publish consolidated results.

  • Having/aiming a short closing cycle.

  • Having very matured and set reporting processes.

  • Having the time window to implement comprehensive solution.

Somewhere in between:

This kind of design will combine both of the above approaches. There is always a trade-off between functionality and the time available for implementation. For this approach I advocate dividing the requirements into:

  1. Must have

  2. Good to have

  3. Can live without it.

This gives a platform to plan a robust, future-proof design, and the implementation can easily be divided into phases with clearly identified outcomes.

Once the deliverables are chalked out and signed off, the design process can start. I will mention here that everybody, and I mean everybody, is tempted to start the design before the requirements are frozen and signed off; the time allocated to design also adds pressure to start early.

You may have noticed that at the beginning I specified efficient handling of the consolidation and reporting cycle, not best handling. A design should be optimal and based on the needs of the client, not on what is available and/or what can be achieved using HFM. An optimal design will:

  1. Take into consideration the client's needs and the complexity of the data.

  2. Time window for implementation.

  3. Client and end user preparedness and training requirement.

In the interest of keeping this discussion relatively short, I will not go into the details of HFM solution design, but I would like to mention some important inputs/constraints to be considered before starting. (To clarify, this is not an exhaustive list; it will change based on the client requirements and the design chosen.)

  1. YTD data or MTD data.

  2. Based on the above, additional data, movements and roll-forward of balances can be planned. If the frequency of data load is not monthly, then consider the impact of ZeroViewForAdj/ZeroViewForNonadj.

  3. What type of consolidation is to be done: 100% consolidation with MI calculation, or the proportionate method? The equity method if required, and the logic and accounts related to it. If MI is required, at which node of the Value dimension should MI be calculated, Parent or Contribution level, i.e. pre- or post-eliminations? Capture of pre-acquisition and post-acquisition profit and reserves to calculate goodwill/capital reserves.

  4. COA, entity hierarchies and custom dimensions. Caution should be exercised while deciding the custom dimensions, as data may not be available at the granularity at which the client desires the reports. It is good practice to club all mutually exclusive custom dimensions into one dimension.

  5. Rounding requirements.

  6. Adjustments requirements in case of multiple hierarchies and shared members.

  7. Standard translation, historical rate translations and calculation of FCTR.

  8. How to build a BAU process around the application.

     

Food for thought:

While the above will help towards a future-proof design, it is always good to think about certain issues well in advance. Though they may or may not be part of the solution design, let us put some thought into:

  • How to handle change in hierarchy.

  • Reconciliation post change in hierarchy.

    • What will happen if an entity is shifted from one hierarchy to another?

      • Whether missing data is MTD or YTD

      • Recalculation of reserves

      • Impact of recalculation of reserves and recalculation of FCTR

  • Rounding. Simple rounding or progressive rounding.

  • Cash flow: how to handle cash flow without hard-coding the rules.

As you think it over, I will share my thoughts on some of these topics in my next blogs. Till then, keep consolidating!

August 16, 2017

Innovation in Siebel

One of my teammates once asked me why the inbuilt features in Siebel are called 'out of the box'. That's when it struck me that the new generation really does think out of the box, not within it. Siebel too has moved ahead and put on the innovation hat: the roadmap shows machine learning and artificial intelligence in the next five-year plan. But should we be waiting that long? What are the use cases in these areas we can look out for?

With these thoughts I started searching the web and, to my surprise, found a number of initiatives and plugins already in use in this area, including some of the basic capabilities most of our clients would ask for. Here is a glimpse of what I found.

 

Search Using NLP (Natural Language Processing)

'To err is human.' But we have forgiving machines that help us go beyond. Gone are the days when a case mismatch or a spelling mistake typed into a Siebel field would refuse to return results.

Search has always been a basic and essential feature of any application, including Siebel. In the latest Siebel version, IP16, an out-of-the-box (genuinely inbuilt) feature lets the user enter a search value without selecting an input field in a list applet; the result fetches all records where a match was found in any of the exposed fields. This is a really cool feature. But what if the user could use natural English to fetch results, something like 'Give me opportunities that are due next week' or 'Get me all contacts of XYZ customer'? It looks far-fetched, but no, it is there and implemented.

'Source: https://medium.com/@SoftClouds/next-generation-search-in-siebel-e15dac3857f2'

The query, put in plain English by the user, is sent by the plugin to an NLP engine, which in turn frames a query that Siebel understands and fetches the data for us.
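To make the idea concrete, here is a toy sketch of the kind of mapping such an NLP layer performs. This is purely illustrative Python: the keyword table, the `to_search_spec` helper and the search-spec syntax are my own assumptions, not the plugin's actual implementation.

```python
import re

# A toy illustration (not the actual plugin) of mapping a natural-language
# request to a Siebel-style business object and search expression.
OBJECT_KEYWORDS = {
    "opportunities": "Opportunity",
    "opportunity": "Opportunity",
    "contacts": "Contact",
    "accounts": "Account",
}

def to_search_spec(utterance):
    """Map a plain-English request to (business object, search expression).

    Returns None when no known object keyword is found.
    """
    text = utterance.lower()
    for keyword, bus_obj in OBJECT_KEYWORDS.items():
        if keyword in text:
            # Pull out a trailing "of <name>" clause as the filter, if any.
            match = re.search(r"of ([\w ]+?)(?: customer)?$", text)
            filt = f"[Account] = '{match.group(1).title()}'" if match else ""
            return bus_obj, filt
    return None
```

A real NLP engine would of course use intent classification and entity extraction rather than keyword lookup, but the output, a structured query Siebel can execute, is the same in spirit.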

 

Performance Improvement using AI

Everyone who has worked in Siebel has at some point come across performance issues where the screen just does not move. Hours are spent breaking our heads over the complex SQL churned out by the Siebel engine: a condition added here, a configuration changed there, and a number of other settings tried to improve performance. If nothing works, we turn to the infrastructure. All of these are time-consuming and tedious processes. Well, lo and behold, there is a solution implemented with artificial intelligence. Beyond the traditional one-dimensional database index, it creates a multi-dimensional semantic index based on AI, using a self-learning process employing neural networks. The solution also claims to be minimally invasive, without touching the application.

'Source: http://www.dimensio-informatics.com/files/dimensio-dl/dimensio-siebel-flyer-EN.pdf'

A similar AI-based approach has been used for improving the performance of the Siebel product configurator, using the data that Siebel collects day in and day out.

'Source: http://www.crmantra.com/insight-article-1-12.html'

The possibilities and areas of improvement with the new buzzwords of AI, ML and NLP seem endless.


Siebel Server Management

All Siebel projects have a support team that needs to work 24x7, or similarly extended time frames, to monitor the system and make sure everything is up and running. What if we could feed in the huge amount of operational data and build an AI solution that learns to bring up servers or components, correct parameters, or do performance tuning? This could happen at runtime without human intervention, and each fix could be added to the data repertoire to handle a similar situation in future. What if such data and intelligence were available to a new client upfront, usable from day one?

Well, some of us may indeed end up losing our jobs. All the more reason to reskill ourselves.

A few other big-ticket items in this area include Virtual Customer Assistance, predictive analytics for sales and marketing, sentiment analysis, social monitoring and unassisted training solutions.

This whole stream looks to be just the tip of the iceberg. The coming years are going to be disruptive as these technologies evolve and strengthen CRM much beyond its current scope.

August 9, 2017

Siebel Open UI with Dragon Speech Recognition

One of the key aspects of any leading organisation today is creating an equal and socially inclusive environment for its employees. So when it comes to Assistive Technology (AT) users, especially in the public sector, providing the necessary support to carry out day-to-day duties is of vital importance. This is where voice recognition software such as Dragon NaturallySpeaking comes into the picture. It is developed by Nuance Communications and runs on Microsoft Windows and macOS.

The Siebel CRM application, after migrating to Open UI, has provided businesses and developers with a much wider scope for customization. Since it is primarily built on jQuery, integrating with external applications through open source code has become a possibility. In this blog post, I'll explain how Siebel CRM can be integrated with the Dragon speech recognition software so that the various processes and activities within CRM can be executed through voice commands, and how to create such custom commands both within Siebel CRM through jQuery and through Dragon macros.

The Requirement:

A client of ours who recently upgraded to Siebel IP15 required a voice-enabled Siebel CRM application. They have an Assistive Technology user group, a section of which uses voice recognition software to perform their daily official activities. Since Siebel CRM is a core application for the client, and given their long-standing commitment to equality and human rights, addressing this need was of vital importance. The client did try enabling voice commands for Siebel CRM through Windows Speech Recognition (WSR), but seamless speech detection proved a distant dream, and WSR could not reach the different processes within CRM because creating custom commands in WSR for Siebel CRM had its own limitations.

The Challenge:

Dragon does not work adequately with Siebel IP15 in its out-of-the-box setup. This is because:

·         Dragon is created to tag/access a limited set of HTML elements through a very specific set of attributes.

·         Dragon then makes use of those HTML attributes to flag the elements through commands created using macros.

·         The out-of-the-box Siebel application, however, has a structure built from a vast set of HTML elements, designed to be accessed through keyboard and mouse, not by voice commands.

·         So, by default, Siebel lacks most of the HTML attributes Dragon requires to flag elements, and many elements in Siebel Open UI are not supported by Dragon.

The Solution:

In order to make Dragon work, we have to customize Siebel Open UI through JavaScript (.js) files so that Dragon macros can flag the different elements of Siebel. This enables users to access the functionalities of Siebel which are not supported in IP15 out of the box.

Dragon provides guidelines for making an HTML page work well with its program. I will not go into the details of how and why an HTML element needs to be constructed in a specific fashion for Dragon, as detailed documentation is available on Dragon's website. The basic idea is that Dragon commands can associate with specific HTML element types, namely:

  • Anchor Elements
  • Image & Imagemap Links
  • Buttons
  • Edit/Input Controls
  • Text Areas
  • List Items or Select boxes
  • Label Elements
  • Frames

By default, the click/selection event on every control is triggered through a command preceded with the word 'Click', which can be disabled from the Commands tab under the Options dialog box of Dragon.

Some of the key points to keep in mind while working with Dragon are as follows:

1.     The first and most important point to remember is that to make an element accessible by speech, there needs to be some text associated with it, which is used as part of the voice command. For an element with readable text in the forefront, say a link or button, the voice command can be the intrinsic text itself. If no text for a control is displayed on the UI, the spoken text needs to be set in the ALT or TITLE attribute of the element. For input elements, the NAME attribute can also be used. If the same text is associated with multiple elements, all of them will be flagged, with numbers displayed on screen to differentiate the controls.

2.     Elements that do not belong to the stated list of permissible controls can be made accessible by modifying the role attribute. Say, for example, we have a div element and we want to trigger a click event on it. We can make Dragon identify it as a button/text/link by setting its role attribute to button/text/link.
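As a minimal sketch of this idea (the function and attribute values here are illustrative, not the actual project code), a div can be given a permitted role plus spoken text; a plain-object stand-in for the DOM element keeps the example self-contained:

```javascript
// Minimal stand-in for a DOM element so the sketch is self-contained;
// in the real application this would be a jQuery-wrapped element.
function makeElement(tag) {
  return { tag: tag, attrs: {}, setAttribute(k, v) { this.attrs[k] = v; } };
}

// Make a non-permissible element (e.g. a div) accessible to Dragon by
// giving it a permitted role and spoken text via the title attribute.
function makeDragonAccessible(el, spokenText) {
  el.setAttribute('role', 'button');    // Dragon now treats it as a button
  el.setAttribute('title', spokenText); // text used in the voice command
  return el;
}

const div = makeDragonAccessible(makeElement('div'), 'New Record');
console.log(div.attrs.role, div.attrs.title); // button New Record
```

In the real application the same two attribute writes would be applied with jQuery's `.attr()` against the actual div.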

3.     It is also worth avoiding special characters in the text associated with a control if that text is to form the voice command. For example, Dragon won't recognise text such as "Save & Close" on a button; it should instead be changed to "Save and Close".
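A small hypothetical helper (not from the actual implementation) can normalise a label so it doubles as a Dragon-friendly command, spelling out the ampersand and dropping other special characters:

```javascript
// Hypothetical helper: normalise a label so it can double as a Dragon
// voice command by replacing characters Dragon cannot recognise.
function toSpokenText(label) {
  return label
    .replace(/&/g, 'and')        // "Save & Close" -> "Save and Close"
    .replace(/[^\w\s]/g, ' ')    // drop other special characters
    .replace(/\s+/g, ' ')        // collapse the resulting whitespace
    .trim();
}

console.log(toSpokenText('Save & Close')); // Save and Close
```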

With these aspects in mind, we need to modify the Siebel Open UI application's HTML structure to make it accessible by voice. Note that the same cannot be applied to the HI application, as it does not allow UI modification to set up custom voice commands. Also, since the HI application is based on ActiveX, its ActiveX controls cannot be associated with voice commands, as they do not technically lend themselves to speech recognition.

Some of the more generic elements, such as Screen/View tabs, OK/Go buttons and hyperlinks, won't require any modification and can be accessed through their existing text. But these elements mostly cover the navigation and querying aspects of the application and will not let users carry out more complex functionality.


Figure 1: Architecture of Siebel Open UI Integration with Dragon

Siebel Open UI constructs the UI layer through the physical renderer files. The primary file controlling most of the HTML structure is "htmltmplmgr.js". So, to modify the controls, we need to add "alt" or "title" attributes to the elements we want to control. Since this is a core OOTB file that cannot be extended, we need to be careful while modifying it. An ideal approach is to separate out the custom code behind a flag, set based on responsibility or a click event, to avoid potential conflicts with existing functionality. Bear in mind that since we are modifying a vanilla file, any new patch or upgrade will probably replace it with a newer version, in which case the custom code must be merged into the new file.
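The flag-guarded approach can be sketched as below. The flag name and the element stub are hypothetical, not the actual htmltmplmgr.js code; in Siebel this logic would run against real DOM elements via jQuery:

```javascript
// Flag that would be set per responsibility or by a click event;
// the name is illustrative only.
let voiceEnabled = false;

// Stand-in for a rendered element so the sketch is self-contained.
function makeElement(text) {
  return { text: text, attrs: {}, setAttribute(k, v) { this.attrs[k] = v; } };
}

// Only decorate elements for Dragon when the flag is on, so vanilla
// behaviour is untouched for everyone else.
function decorateForDragon(el) {
  if (!voiceEnabled) return el;       // vanilla path: no change
  el.setAttribute('title', el.text);  // spoken text Dragon will use
  return el;
}

voiceEnabled = true;
const btn = decorateForDragon(makeElement('New Query'));
console.log(btn.attrs.title); // New Query
```

Keeping the custom branch behind one flag also makes the eventual merge into a patched htmltmplmgr.js easier to review.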

Similarly, we can customise different elements by modifying code from other vanilla files. Wherever possible this should be done by extending the vanilla file, such as "physicalrenderer.js", which lends itself to extension. Certain elements or features, such as the scroll bar, are problematic for speech recognition; in such cases we need to create a permitted custom element in the foreground and set up its click event to produce the desired effect programmatically.

In the case of list applets, <th> and <td> tags compose the headers and columns of the table respectively. For obvious reasons these elements cannot be replaced, so they need to be modified by changing the role attribute from "gridcell" to "button" (changing them to text or link would conflict with other elements displayed in the table). Similarly, there are <div> elements throughout the application whose role needs to be added or changed to let Dragon access them through voice.


Figure 2: Indicates all HTML elements which are links being flagged with Click Link command

Now, all elements permissible for Dragon association can also be accessed by element name: "Click Link", "Click Button", "Click Image", etc. will flag all such elements visible on the UI and number them, as shown in Figure 2, for the user to choose from. It is therefore recommended to set unique association text through alt/title/name for the distinct elements used to perform different functions, to reduce the time Dragon takes to process the HTML elements, which it does every time a user starts speaking.

We also have several keyboard shortcuts in the Siebel CRM application, and Siebel CRM lets us configure new shortcuts for custom or vanilla functionality through Accelerators of the Command object type. We can create custom voice commands for these shortcuts through the Dragon tool itself: the software allows macros to be created from the front end via the Command Browser under Tools. These macros can be exported to a data (.dat) file, which can then be deployed as a patch for Dragon or imported by the users themselves.

In Conclusion:

One of the primary features of Siebel Open UI is its flexibility towards customisation, which enables us to design a more interactive UI and to incorporate external applications such as Dragon to support the needs of Assistive Technology users. Since it allows implementation of open source code, the solution described above can be approached in more than one way. We have made multiple customisations to the CRM application since the initial development to help AT users access different features of Siebel and to modify certain processes to better serve their needs.

Although the above solution is for integrating Dragon with Siebel CRM, the same approach can be applied to any other HTML-based application that allows modification of the UI layer.

I hope to have covered most of the fundamental details in my explanation. However, if you would like to discuss further on the topic, feel free to comment below.


August 5, 2017

Living IT transformation

A couple of decades back, most organizations were building information systems and needed a massive workforce to construct IT infrastructure and surrounding systems. We have now moved to the next phase, in which almost all organizations have basic information technology systems in place to run daily business, and are constantly looking to digitally transform in order to gain a competitive edge. Infosys has emphasized its new-and-renew strategy to tackle this IT transformation journey, and I believe we need to align ourselves with the goals of the organization.

Change Mindset
Accept that change is constant in IT; there is no other choice. The IT profession differs from most industries, where you learn a process or a plant once and then follow it for years to run the business. IT is all about adopting change, keeping pace with technology and adding business value through innovation. The moment you choose the IT field, you need to adapt to continual technology upgrades.

Niche skills
Flash back to the mid-19th century, when gold was first discovered in California. As the news spread, people from all over the country rushed to reap the benefits. New settlements sprang up; colleges, railroads, streets and infrastructure were built. Yet the first person to profit from this event was not a miner, but a retailer who saw the great opportunity, and it wasn't in mining for gold: instead, he supplied the picks, shovels and related instruments that miners used to find the yellow metal. This retailer eventually became one of the biggest beneficiaries of the California gold rush. Another visionary merchant was Levi Strauss, who responded to the gold-rush need with suitable miners' clothing. Levi switched to tough denim cloth, had it dyed a uniform indigo, and used rivets to hold the pants together. The new pants were relatively cheap and durable, and by the 1870s Levi Strauss' blue jeans were daily wear for miners, farmers and construction workers throughout North America. Even if you don't know the history of the gold rush, we are all familiar with Levi Strauss' blue pants, aka "JEANS".
The bottom line is that you need not be in the mainstream area. For example, as the cloud market has gathered pace, a spin-off opportunity from cloud is the need for integration systems: SOA, and real-time integration in particular, has huge scope. With the growing number of cloud systems, customers are raising security concerns, so robust security is the need of the hour. And data growing day by day opens opportunities for analytics and data scientists.

Bring out the best in yourself
Aptitude tests help you get a better look at who you really are and how you can grow. We need to choose the domain and technology that suit us best; that is what will bring out the best in us. Do not attempt to learn a new technology just for the sake of market demand. I have seen that moving a functional domain expert into a technical area did not work at all, even with ample time and training. Experienced people can easily figure out their liking and areas of expertise; juniors need to introspect and talk to experienced people to find the way forward.

Shifting paradigm of Management
Even management work is not constant in IT, as it has shifted from traditional Waterfall to Agile and DevOps.
As per leading industry statistics, some 80% of projects will be executed in Agile mode, making it the primary mode of project execution, and with the adoption of cloud, Agile will become even more relevant.

Boiling Frog Syndrome
Small incremental changes are often neglected and go unnoticed. This extremely powerful phenomenon is known as the Boiling Frog Syndrome: throw a frog into a pan of boiling water and it will immediately jump out, but put it in a pan of room-temperature water and slowly raise the heat, and it will be cooked to death. The need of the hour, therefore, is to adapt ourselves to IT transformation.

August 3, 2017

Is SKYNET a reality in near future?

 

I was recently reading about an incident in which Facebook reported that it had to unplug its AI bots because they had created their own language to talk to each other. This reminded me of the "SKYNET" concept in the Terminator movie series.

For those of you who missed the Terminator movies, "SKYNET" is an AI system which controls all the robots plugged into its network. It decides that humans, imperfect because of their emotions, may destroy it, and so comes to the logical conclusion that it must destroy the entire human race. Scary? Yes, but I think we are not there yet.

Although the thought of SKYNET trying to destroy human life is scary, the fascinating thing was that it could think and come to conclusions. In my view this might become reality at some point in the future. Currently, we are still at a very nascent stage of deep learning, and the deep learning community faces several challenges, such as the availability of skilled people and access to expensive computing resources.

As we see a greater drive towards innovation and AI, I am hopeful of seeing some part of SKYNET (not the scary part) becoming a reality. One such innovation is driverless cars, which can understand traffic situations and prevent roads from clogging up.

Drones are another fascinating innovation which, when clubbed with AI, can be used for several purposes. One such use case is delivering packages, where the AI part comes from scanning a package and understanding its contents to distinguish between hazardous and non-hazardous goods.

Similarly, at some point businesses will be able to react to future situations by analyzing data. IBM Watson is a real-world example of how an AI system, when provided with data, can help solve problems so difficult that humans might not crack them in a lifetime. The medical field, in my view, has a particularly significant use for AI and deep learning.

Thoughts are welcome!


August 2, 2017

Force DataStage job warnings to become Information

 


 

Problem statement:

The Teradata connector stage throws the following warning when a Graphic data type is extracted:

Conversion from the UTF-16LE to UTF-8 character set to may affect performance.

 

Description:

Heterogeneous data extracted from a data warehouse often includes graphic-type data (CLOBs, extended properties, etc.), and most organizations dealing with data warehousing engage Teradata to store their analytical data for faster processing (selects and manipulation) and efficient output in a reasonable amount of time. In DataStage version 11.5, when we use a Teradata connector stage to extract data of type CLOB, we see the warning "Conversion from the UTF-16LE to UTF-8 character set to may affect performance".

According to IBM this behavior is expected: there is no loss or truncation of data during ETL and performance is not actually affected. But enterprise data warehouse clients will not accept warnings in DataStage jobs, as they treat a warning as a potential loss of data. They may agree to suppress this warning, but they will still want to know that the message was raised when the job runs.

 

Solution:

DataStage provides a tool, Message Handler Management, which lets us control what happens to a warning during a DataStage job's execution.

Message handlers can be set locally for a particular job, or in the Administrator client for the whole DataStage project.

  • Log in to the DataStage Director client and select the project in which you intend to set up the message handlers

  • Select Tools -> Message Handler Management

     

  • Every executed DataStage job produces job logs, and each log entry has a message ID associated with it; we use the message ID to identify the log entry. Select the message ID from the job log entry showing the warning "Conversion from the UTF-16LE to UTF-8 character set to may affect performance" and enter it in the Message ID text box

  • The Action field has three classifications, as below

     

  • For this warning, we still want the message to appear in the job log, but as an Info entry rather than a warning, so we select the Action 'Demote to Informational'

  • In the 'Example of message text' text box, paste the warning message text

  • We then need to save this message handler: click the Save button as below and select 'Save Message Handler As'

     

  • Provide a meaningful name for the handler and click OK

     

  • The message handler is saved in the directory /opt/IBM/InformationServer/Server/MsgHandlers with the extension .msh, and it is readable in text editors

 

When the job is executed next time, this message is logged as informational and the DataStage job finishes without the warning, which is what EDW customers expect.

DataStage jobs are usually scheduled in external schedulers such as IBM mainframe OPC, where the scheduler expects a return code of 00 (job finished without warnings) from the DataStage job to report successful completion; if the job finishes with warnings, the scheduler reports it as a failure. This applies to most enterprise data warehousing projects.
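A scheduler wrapper script along these lines can translate the DataStage job status into the return-code convention the scheduler expects. This is a sketch: the status-code mapping (1 = run OK, 2 = finished with warnings, 3 = aborted) reflects common `dsjob` usage but should be verified against your DataStage version, and the 00/04/08 return codes are an illustrative convention, not a standard:

```shell
#!/bin/sh
# Sketch of a scheduler wrapper: translate a DataStage job status into
# the 00/non-zero codes an external scheduler checks for.

translate_status() {
  case "$1" in
    1) echo "00" ;;   # finished cleanly - scheduler treats as success
    2) echo "04" ;;   # finished with warnings - flag for investigation
    *) echo "08" ;;   # aborted or unknown - report failure
  esac
}

# In a real wrapper the status would come from something like:
#   dsjob -run -jobstatus -wait MyProject MyJob; status=$?
status=1   # sample value for demonstration
translate_status "$status"
```

With the warning demoted to informational by the message handler, the job returns the clean-finish status and the wrapper reports 00 to the scheduler.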


