Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.


February 28, 2018

Managing big teams: 24x7 support

Dear Friends, 

There is a famous saying: 'Teams of ordinary people achieve extraordinary results.' It holds true time and again, especially when it comes to managing big teams. The first and foremost strength of a team is communication. If communication breaks down, things start falling apart, from disrupted client deliverables to bad rapport among team members. Even the biggest space missions would fail at the point when people stop talking.

A few key points for handling big teams

There can be any number of challenges in handling teams that operate in a 24x7 model, but everything is achievable if some thought is put into it. Here are a few corrective and constructive measures to apply.

Meetings: Daily meetings must be organized to sum up the day's activities as handled by the different shifts. What went well and what went wrong should be discussed in depth. Care should be taken, however, that meetings do not eat into the productive part of the day; people have their best energy during the morning hours or at the start of a shift.

Hand-Overs: Shift hand-overs should be done properly, and progress can be shared with the entire team via email. PMs and leads should define a process for the shift hand-over, with well-defined hand-over and progress emails sent by the shift leads to the whole team. A properly delegated task works wonders in achieving the desired results.

Technical Leadership: The team should be built strategically, with the right number of technical leaders to provide support in times of distress. Improper leadership can easily deface the name of an organization.

Resource Planning: Resources should be hand-picked so that head-count can be divided wisely across all the shifts without impacting resource availability and work. Resources with versatile skills should be distributed equally, and it's advisable to divide the team into small groups for better output.

Skill Planning/Productivity Check: It's advisable to run a monthly skill check for everyone working in the team. This ensures that the team stays well equipped for the need of the hour, changing technologies, and client commitments and expectations.

Project Management Activities: A PM carries the responsibility of building the team, maintaining the project's CMMI accreditation, and so on. Quality assurance, resource planning, and escalations are also handled by the PM.

Small Steps Towards the Milestone: Milestones should be realistic and achievable. Check-points for milestones play an important role in keeping things moving, and also ensure that the team's motivation stays up and doesn't fall apart. The client should be aware of the check-points, be in agreement with them, and be kept informed on a timely basis.

Communication and proper planning are the key factors in better team management. Even 'impossible' says 'I'm possible': most impossible tasks become possible if the proper steps are taken. We hope this article helps you plan your teams better. Let us know in the comments.

OBIEE Integration with Tableau

In this blog I will discuss and show how Tableau-OBIEE integration can be achieved, along with options to connect to several data sources from Tableau. It's a frequent ask from clients where both OBIEE and Tableau are present in the reporting landscape.

OBIEE and Tableau are both Business Intelligence reporting tools. From Tableau it is possible to connect to the OBIEE metadata, build reports, and provide advanced data visualizations.

What It Needs

OBIEE client tool installation on the Tableau Desktop system

OBIEE application details to connect from Tableau

How to Achieve It

  • Create an ODBC DSN on the Tableau Desktop system using the Oracle BI Server driver. Provide the OBIEE server name and port details and test the cluster connectivity.


  • Create a Tableau Data Source connector (.tdc) file with the specifics as provided below.
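The screenshot with the .tdc contents is not reproduced here, so below is a minimal sketch of what such a file can look like. The Tableau .tdc format itself is real, but the vendor/driver strings and capability flags shown are assumptions and must match your installed Oracle BI Server ODBC driver:

```xml
<connection-customization class='genericodbc' enabled='true' version='10.0'>
  <vendor name='Oracle BI Server' />
  <driver name='Oracle BI Server 1' />
  <customizations>
    <!-- Assumed flags: disable features the OBIEE ODBC driver does not support -->
    <customization name='CAP_CREATE_TEMP_TABLES' value='no' />
    <customization name='CAP_SELECT_TOP_INTO' value='no' />
    <customization name='CAP_SUPPRESS_DISCOVERY_QUERIES' value='yes' />
  </customizations>
</connection-customization>
```

Place the file in the Tableau Repository's Datasources folder so Tableau picks it up when the ODBC connection is created.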

Reporting from Tableau Using an ODBC Data Source

Tableau allows the use of multiple data sources for a single worksheet. Below are the steps for connecting to an ODBC data source from Tableau, to be followed after creating the ODBC DSN.

Step 1: Launch Tableau Desktop.

Step 2: In Tableau Desktop, click Other Databases (ODBC).

Step 3: In the DSN drop-down you will find the DSN you already created. Select it and click Connect.

Step 4: A popup screen appears. In the login window, provide the credentials to connect with, then click OK.

Step 5: On providing correct credentials, a window confirms that the connection to the ODBC data source has been established successfully.

Step 6: Create the data model in the Tableau workbook by selecting the required OBIEE presentation tables that appear in the left navigator, and continue to build the report.

Reporting from Tableau using Multiple Data sources

Tableau allows the use of multiple data sources for a single worksheet. One data source is primary; all others are secondary.

The key data source types are:

  • File-based data sources

  • Relational data sources

  • OLAP data sources

  • Cloud-based data sources

In this blog, I'm discussing how we can use file and relational data sources in Tableau.

  1. Creating a data source to connect to a file:

The steps below connect Tableau to file-based data and set up the data source, using a text file as the example.

  • Launch Tableau Desktop


  • As shown in the screenshot below, select Text File to connect to a text data file. To connect to an Excel file, select Microsoft Excel instead.


  • In the next screen, select the file path from the network.

  • In the next screen, select the connection type: Live or Extract.

  • When a Live connection is selected, a direct connection to the given file path is established. Then select Go to Worksheet to open the workbook and build the report by selecting fields from the connected file. Any update in the source file is reflected in the reports accessing this Live connection.

  • When an Extract connection is selected, it prompts you to save the file in a local location, so that the extracted data is stored in a .tde file. This .tde file can be used in any workbook to work on the data available in the local Tableau data extract. This step needs to be repeated whenever there is a data refresh in the source file.

2. Creating a data source to connect to a database table:

The steps below create a data source that connects to database tables. The example illustrates connecting to an Oracle Database.

  • Launch Tableau Desktop

  • Select Oracle to create Oracle Database connection


  • Provide the Connection details to connect to the Oracle database


  • Click Connect, then select the schema name and table names. When multiple tables are selected, provide the joins among the tables. Click Go to Worksheet, where the report can be built using the fields selected from the tables.


Data Blending using Multiple Data Sources

Tableau can connect to multiple data sources to enable data blending, combining data from multiple data source types in a single worksheet.

Once multiple data sources are connected in a worksheet, the relationship between them has to be defined on the joining columns, as shown below.


Once the relationship is defined between the data sources on the joining columns, the report can be created by dragging fields from the defined data sources, as shown in the screenshot below.

Fields highlighted with a blue border are dragged from the text file data source.

Fields highlighted with a green border are dragged from the ODBC data source created to fetch data from the OBIEE subject area.


Reuse of Data Source Creation:

Once a data source of any type (file, Excel, database table, or ODBC) is created in Tableau, it can be saved using the option Add to Saved Data Sources, as shown below. The saved data source can then be reused in other workbooks whenever the same file, table, or model is needed again.


Save the data source in a location; it can then be opened in other workbooks for reuse.



  • OBIEE's "nqquery.log" and the OBIEE cache do not capture the queries for reports created from Tableau

  • Caching (browser cache) is managed at the Tableau level, as all the reporting happens in Tableau


  • OBIEE and Tableau must share a common authentication mechanism to pass user credentials from OBIEE to Tableau (enabling Single Sign-On).

  • Authentication is done at OBIEE. Data Level Security is managed/controlled within Tableau.

Key Features:

  • Amazing Data Visualization

  • Direct Database connection and report creation

  • Easy Dashboard Publishing and sharing

  • All general configurations are supported, such as derived calculated fields, report drill capabilities, and hierarchies

  • Dashboard Export functionality for the selected worksheets

Key Considerations & Limitations:

  • OBIEE is not a trusted data source in Tableau Desktop. As a result, when using the generic ODBC data connection with an unsupported data source, success often varies and full compatibility with Tableau features cannot be guaranteed.

  • All tables across OBIEE subject areas are presented to the user on the Tableau data sources tab.

  • It's recommended to expose the foreign keys in the OBIEE RPD Presentation layer in order to frame the joins between the selected presentation tables in Tableau.

  • Select the subject area for which ODBC connection needs to be created.

Key attributes of all dimension tables need to be exposed in the fact table in the OBIEE RPD to configure joins in Tableau data sources. At the same time, exposing foreign keys in the Presentation layer is not an OBIEE best practice. It's suggested to restrict the foreign keys in the OBIEE RPD to the Administrator user, so that the keys are available for use in Tableau but are not exposed to OBIEE end/business users performing ad hoc reports.

  • Data sources from external sources (files or tables) can be saved in Tableau so they can be reused when the same data model is needed in another workbook. If a data model other than those present in the saved data sources is needed, a new connection has to be created each time.

February 26, 2018

Data Quality Overview

  • In this blog we are going to discuss data quality and how to overcome issues such as data redundancy, data duplication, inconsistent data, and junk data.

  • Every organisation strives for total data quality: to produce and maintain top-condition information that is fit for its intended business purpose.

  • The diagram below gives an idea of the data quality flow.


Principally, EDI comprises architectural components such as data sources, an Operational Data Store (ODS), a rejection area, and a subject-area database. The subject-area database should be compliant with the industry-standard SID model.

Data Profiling


Data profiling tells us about the condition of the data; based on that condition, we have to create rules and apply them to the data.

In the table below we can see some inconsistencies in the data; the examples highlighted in blue, red, and green portray standard types of data differences.

















[Table: sample records with the highlighted data differences, classified into these categories]

  • Lexical error

  • Domain format error

  • Integrity constraint violation

  • Missing tuple

The data profiler also supports detectors to discover facts about the data.


Data of poor quality, or data that is incomplete or erroneous, can be overcome by data auditing.

A thorough data audit detects key quality metrics, missing data, improper values, duplicate records, and contradictions. When used in combination with Oracle Enterprise Data Quality parsing and standardization, it can deliver exceptional understanding of your data.
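To make the audit concrete, here is a minimal, self-contained sketch (the field names and the phone format rule are assumptions, not from the original) of how such checks can be scripted:

```python
import re

def profile(records, key_field, phone_field):
    """Count missing keys, duplicate keys, and domain format errors."""
    issues = {"missing": 0, "duplicates": 0, "bad_format": 0}
    seen_keys = set()
    phone_pattern = re.compile(r"^\d{3}-\d{4}$")  # assumed domain format rule
    for rec in records:
        # Missing value: the key attribute is absent or empty.
        if not rec.get(key_field):
            issues["missing"] += 1
            continue
        # Duplicate record: the same key appears more than once.
        if rec[key_field] in seen_keys:
            issues["duplicates"] += 1
        seen_keys.add(rec[key_field])
        # Domain format error: value does not match the expected pattern.
        if not phone_pattern.match(rec.get(phone_field, "")):
            issues["bad_format"] += 1
    return issues
```

In practice a tool such as Oracle Enterprise Data Quality runs far richer versions of these checks, but the principle is the same: profile first, then derive cleansing rules from what the profile reveals.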


In this part we will discuss how to cleanse the data. The data cleansing task removes duplicates, alterations, and inaccurate data from the source.

Data cleansing is the scheme of pinpointing and fixing data irregularities by comparing the domain of values within the catalogue. Glitches are fixed routinely by ETL processing.


If the data quality engine finds that a data item "stands out" (holds a statistically significant deviation from the population mean), it flags it as an exception and stores it in the exception schema. Depending on the category, exceptions are communicated to:

1.  Data stewards, to fix data irregularities at the source database.

2.  Quality experts and business users, to set up new quality rules and remedial measures.

Quality Dimension

[Table: quality dimension summary. For the schema conformance dimension, the error types are vocabulary errors, format errors, and missing values. One column designates a direct fall in the quality dimension; the other designates that the occurrence of the variance hampers the detection of further anomalies, reducing the measured quality.]



Binding and its functions in Einstein Analytics

In this blog I will cover binding in Wave Analytics, with a brief introduction to Wave Analytics.

Wave (Einstein) Analytics is a cloud-based analytics tool developed for Salesforce that connects data from different sources. It creates interactive views and shares those views in apps. It is very useful for business users, who can check changing trends in the market and take the necessary actions.

The binding feature enables interaction between the reports in a dashboard, like a prompt/filter in OBIEE. There are two types of bindings in Wave Analytics: selection binding and result binding. With selection binding, the reports in the dashboard change as per the user's selection; with result binding, the result of one step can change by triggering another.

Before moving deeper into this, let's discuss faceting. We have seen that binding is used to create interactions between widgets, which are simply the reports in a Wave dashboard. Faceting is a simple and easy way to implement interaction between widgets, and it is an inbuilt feature of Wave Analytics.

There is an option in steps where we need to tick the checkbox that says "Apply filters from faceting". The reason we go with binding instead is a limitation of faceting: it cannot be used when the dashboard contains lenses (reports) from different datasets. In that case, we go for binding.

Below is a screenshot showing the faceting checkbox for a widget:






There are two types of binding: one is selection binding and the other is result binding:

1.      Selection Binding:

When the interaction in one step affects another step, it is known as selection binding.


In the screenshot below, the two bar charts are Sum of Sales by Owner and Sum of Targets by Owner.

After applying the selection binding, once we click on any bar in the dashboard, the other chart changes as per the owner selection. Here the selection binding is on the Owner column.



2.      Result Binding:

If a step changes based on the result of another step, then that type of binding is known as result binding.


The screen below shows result binding: the bar chart changes as per the column in the toggle. By clicking the Top 5 button in the toggle, we cause the bar graph to show the top 5 average of Amount by Account Industry.




Binding Functions:

We have three types of binding functions:

  • Data Selection Functions,

  • Data Manipulation Functions, and

  • Data Serialization Functions.

Using binding functions, we can get the data from a step, manipulate it, serialize it, and make it usable by the target step. These functions can be used on data such as scalars (0, 'This is Einstein analytics', or null) and one- and two-dimensional arrays.

Binding functions are nested functions and must contain one data selection function, one serialization function and any number of manipulation functions.

For Example:


Here, in the selection binding example, when we click on any owner in the first chart, the second chart changes. Since we send the Owner column as an object, when a particular owner is selected the dashboard changes accordingly.
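As a rough sketch of how this reads in the dashboard JSON, the step name (Chart_1), dataset (Targets), and field (Owner) below are illustrative, and the exact syntax should be checked against your Einstein Analytics release:

```json
{
  "query": "q = load \"Targets\"; q = filter q by 'Owner' in {{ column(Chart_1.selection, [\"Owner\"]).asObject() }}; q = group q by 'Owner'; q = foreach q generate 'Owner', sum('Target') as 'Sum_Target';"
}
```

Here column(...) is the data selection function and asObject() is the serialization function that injects the selected owners into the second step's filter.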


1.       Data Selection Function:


The source data is selected by a selection function. The source can be either the selection or the result of a step, which returns a table of data in which columns have names and rows have an index (starting at 0). We can select columns, rows, or a single cell for binding.

When one row and one column are selected, it returns a one-dimensional array; if multiple rows and columns are selected, it returns a two-dimensional array; and if a single cell is selected, it returns a scalar value.


We have three types of selection functions: the Cell, Row, and Column functions.

a.       Cell Function: Returns a single cell of scalar data. The row index should be an integer, the column name should be a string, and the cell must exist.

b.      Row Function: Returns one row of data as a one-dimensional array, or multiple rows as a two-dimensional array.

c.       Column Function: Returns one column of data as a one-dimensional array, or multiple columns as a two-dimensional array.


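As illustrative sketches (the step name Chart_1 and the field names are assumptions), the three selection functions look like this:

```
cell(Chart_1.selection, 0, "Owner")        -- a single scalar: Owner from the first selected row
row(Chart_1.selection, [0], ["Owner"])     -- one row returned as a one-dimensional array
column(Chart_1.selection, ["Owner"])       -- one column returned as a one-dimensional array
```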


2.       Data Manipulation Function:

This function manipulates the data as required by the data serialization function. It can be applied to the result of a selection function or even to the result of another manipulation function. If the input is null, it returns a null value.

If there is no requirement for the data manipulation function, we can directly add a serialization function to the results of data selection function.

The data manipulation functions are as follows:

1.       Coalesce function: Provides a default value when a null value is returned.

2.       Concat function: Concatenates data from multiple sources and returns it as a one- or two-dimensional array.

3.       Flatten function: Flattens a two-dimensional array into a one-dimensional array.

4.       Join function: Converts a one- or two-dimensional array into a string.

5.       Slice function: Given a start and optionally an end position, returns a portion of a one-dimensional array. It supports negative indices.

6.       toArray function: Converts the data to an array; for example, a scalar becomes a one-dimensional array, and a one-dimensional array becomes a two-dimensional array.

7.       valueAt function: As the name suggests, returns the value at the particular index requested.
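As illustrative sketches (step and field names assumed), manipulation functions are chained onto a selection before a serialization function is applied:

```
coalesce(cell(Chart_1.selection, 0, "Owner"), "All")      -- fall back to "All" when nothing is selected
flatten(column(Chart_1.selection, ["Owner", "Region"]))   -- collapse a two-dimensional array to one dimension
slice(column(Chart_1.selection, ["Owner"]), 0, 5)         -- keep at most the first five selected owners
```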


3.       Data Serialization Functions: This function converts the data as per the binding requirement.

The below are the different types of serialization functions:

1.       asDateRange(): this function returns a date range filter.

2.       asEquality(): this function returns 'equals to' or 'contains in' filter.

3.       asGrouping(): this function is used to return a single or multiple groups.

4.       asObject(): this function returns data as object.

5.       asOrder(): this function returns sorting order.

6.       asProjection(): this function is used to project a field in step.

7.       asRange(): This function returns a range filter.

8.       asString(): this function returns a string.

SAS Model Risk Management

In this blog, I will discuss model risk, why model risk management is important for business needs, and the benefits of implementing a solution using SAS.

Here we will focus mainly on the finance sector and refer to financial services business needs.

Decision making plays a very important role in any organization's business. For efficient decision making, building a model that captures the specifications of these decisions is crucial.

The loss that results from using an inappropriate or inefficient model is what we call model risk.

So nowadays model risk management has captured plenty of attention among the various risk management practices of many financial institutions.

In order to mitigate the model risk, majority of financial institutions are in the process of developing Model Risk Management frameworks.

Not only financial institutions but also the regulatory authorities are keeping a tab on these frameworks, enforcing certain standards to raise model risk awareness, to effectively identify and manage model risk, and to govern the entire model life cycle.

Challenges in Model Risk Management:

1) Governance: In order to comply with the standards enforced by the regulatory authorities, the financial institutions should maintain a centralized repository which contains all the required documentation, reports, checklists, dashboards, codes and regulatory feedback.

2) Data Management: Financial institutions are supposed to submit various documents, reports, and dashboards to the regulatory authorities at regular intervals as per policy. To submit those documents, institutions have to collect and store historical data related to customers and facilities. Beyond collecting and storing the data, they have to ensure a level of data quality that complies with the regulators' standards.

3) Monitoring and Validation: Last but not least, monitoring and validation are crucial in model risk management. Ongoing monitoring against the market at periodic intervals is very important for any business. To avoid letting small issues snowball into larger ones later, model validation at various stages, through model- and factor-level tests that capture the stability, trend, and robustness of the model, is essential to identify risk accurately.


Benefits of implementing solution using SAS to mitigate challenges of Model Risk Management:

1) Centralized model repository: Irrespective of the model type, technology, or platform where the data resides, SAS allows you to access the information, thereby providing end-to-end workflow management.

2) Data management: SAS allows you to collect and store the required historical data accurately, and makes sure it is aggregated by identifying the relevant fields for the creation of data marts or a repository that connects with the live systems offline and stores the data. This effectively governs the entire data management for model risk management.

3) Model monitoring and validation: SAS enforces appropriate testing on the platform at various model and factor levels, along with adequate reviews, in order to adapt to the regulatory guidance on monitoring and validation.



OBIEE vs Tableau: Competitors or Complementary?




It all started around five years back for me, when I began to explore OBIEE for reporting. All the expertise I had by then was in OBIEE, and I hadn't been introduced to any other BI reporting tool till then.

Ten months back, one of the clients I was working for decided to opt for Tableau 10.3, with the plan that all new development would start in Tableau. OBIEE would remain live as-is, but gradually all the existing reports would be migrated to Tableau (even though they would retain the data model already set up for the existing data warehouse). We delivered the pilot phase, and the migration is still in progress, but it made me interested in knowing exactly what this new reporting tool is (and ultimately in training myself in Tableau), so I started to explore it via various learning modules available online as well as on the client's COE.

Initial Thought

I was under the impression that it would be just like BI Publisher, and that the client wanted to implement it over OBIEE simply to make reporting easier (which is also one of the goals). That didn't turn out to be the case.

Pros and Cons

I won't go into technical details here, but I would like to do a pros-versus-cons analysis of OBIEE and Tableau from the point of view of the client, the developer, and the reporting user.



OBIEE: Approximately 27 connectors, including ODBC connectors.
Tableau: 70+ supported data sources via built-in connectors with Tableau support; even more using ODBC connectors.

OBIEE: Not easy for an organization or business users when even a minor change is needed immediately; they have to go via the IT team.
Tableau: Once the data model is ready, minor changes and even new derived columns can easily be published online within seconds.

OBIEE: Any new table or data source can be connected as and when needed.
Tableau: Not every data source can be brought into the data model easily. Custom SQL can be used, but it can't be tuned for performance improvement, as the Tableau server doesn't handle the SQL performance of custom SQLs.

OBIEE: Has a strong BMM layer component, a unique feature of this tool.
Tableau: All circular joins must be handled in the physical data model; there is no scope for joins at a later stage other than the data model (apart from data blending, at a cost).

OBIEE: Map viewer is difficult to implement and manage, and needs a separate setup.
Tableau: Has inbuilt map visualization with one-click implementation, and the ability to draw custom maps using geospatial data.

OBIEE: Has a few analytics functions (PERIODROLLING, AGO, etc.), but they come at the cost of performance degradation.
Tableau: Has very efficient window functions that can use derived columns as a data source at visualization run time.

OBIEE: Oracle is now ending support for non-Oracle ETL tools, which is a kind of restriction.
Tableau: Any ETL tool can be used.

OBIEE: Out-of-the-box data models are valuable if you want to use one of the provided models.
Tableau: No out-of-the-box data model is provided.

OBIEE: Total license cost for Enterprise Edition is $51,800, without any support or user access (based on the latest cost sheet from Oracle).
Tableau: Cost for the professional edition, including client and server, is $1,260 per user per year, including support (with yearly billing).

OBIEE: The product crashes with unclear error messages.
Tableau: Quite a stable tool with very rare issues; has very detailed error and exception handling.

OBIEE: Can handle and manage large business models very easily.
Tableau: Large data models must be kept offline manually unless published online.

OBIEE: No predictive analytics.
Tableau: Has inbuilt predictive analytics capability and integration with the R language.


As the advantages of Tableau over OBIEE are quite clear, we can see why Tableau has overtaken OBIEE as a leader for the last five consecutive years.





Even though the two tools are compared here, if a new organization is about to opt for a BI tool, it should start with Tableau and outline its BI needs from there. If it finds its business needs too large to be handled by Tableau, it can bring in OBIEE as well. The two tools can also co-exist, which would be a great advantage to the organization. Business users do find Tableau an easy tool to use, where ad hoc reporting can be done very easily, even using offline files.

However, Tableau has the added advantage of predictive analysis and can connect to the powerful R Studio, which isn't even a feature in OBIEE. Oracle has it, but as part of a separate predictive analysis tool, which is a costly affair.

We can't say the two tools are merely competitors; they complement each other in many ways. While we don't need much technical know-how about Tableau's architecture to work on it, OBIEE does need support from an IT team that knows the architecture well, which makes Tableau a better solution where most users are not too familiar with OBIEE.

February 19, 2018

Chatting with Bots - More a Necessity than Science Fiction

In an age where multiple applications are involved in the supply chain process, knowledge about customer orders is distributed. It has become a tight-rope walk to balance keeping the customer updated about the progress of their order against the cost of providing that information via a customer service team or a complex BI solution. This blog opens up the possibility of a cost-effective and lightweight solution by introducing the 'chatbot'.

The IT landscape involves multiple applications to fulfil every single order, due to the nature of the business, the way the organization has evolved, the number of business entities involved, or the speciality of the applications. Below is an example for a manufacturing and retail organization.

Pic 1 - Typical IT landscape

In this complex matrix, the traditional methods to keep the customer updated about the progress of their orders are as follows:

  • Send text message or email about the status
  • Set up a customer service team to handle customer requests via call, text, email or chat

The drawback of these conventional methods is that no single system holds the moment of truth about every order. To avoid the customer service team juggling between applications, complex BI reports are installed to oversee all applications, resulting in an even more complex IT landscape.

The alternative solution is the 'chatbot'. According to Wikipedia, a chatbot is a computer program which conducts a conversation via auditory or textual methods. Customers can chat with a chatbot to get information about their orders. Let's see why the chatbot solution is cool.

Implementing the Chatbot:


There are 2 main functionalities of Chatbots:

  • Receive and understand what the customer is saying, and
  • Retrieve the Customer information required

To receive and understand what the customer is saying via chat, the chatbot uses natural language processing systems. Via artificial intelligence and machine learning, the chatbot is trained to understand the customer's request better. There are numerous cloud-based chatbot development platforms that can be leveraged to design, build, and train chatbots; Oracle Cloud Platform and IBM Watson are examples of such Platform as a Service (PaaS) solutions.

Pic 3 - Example of a chat conversation in mobile

For retrieving the required information, the chatbot uses web services to connect with each application. For example, Order Management Cloud has an Order Import web service which can be invoked using the retail order number. Similar order information web services can be created for the other applications. The chatbot has to invoke each web service, work out the best status across all the applications, and publish it to the customer.

Via these NLP and web services, implementing a Chatbot solution is easier than ever.

Chatbots are not bulky and intrusive like traditional BI solutions. They occupy little space on the server and can easily be hosted in the cloud as well.

Customer Experience:

Customer experience, CX for short, is a major focus area for organizations. With referral customers giving more business than new customers, organizations want their customers handled with care. The chatbot gives customers an unparalleled experience, just like chatting with a human.

The chatbot can chat in different languages as preferred by the customer. In addition, it can be trained to reply to text or voice commands as well.

The chatbot can be used on a computer, tablet, or mobile, giving the customer excellent convenience.

Capex, What Capex?

Setting up a multi-language customer service team running 24x7, or implementing a complex BI solution, is far more costly for the organization. The cost and time to implement a chatbot are far less compared to the traditional methods. Ready-made chatbots are available which are already designed and built to a general extent; the implementation is limited to invoking the order information web services from the various applications and training the chatbot.


Chatbots can also be used to expedite an order if the customer requires it. The chatbot can email the production team in the manufacturing facility, with the chat history attached, to ensure the order is expedited.

With recent technical advancements, chatbots are even helping patients who suffer from Alzheimer's disease and insomnia.

To summarize, chatbots are easy, simple, and lightweight applications that solve the major problem of keeping the customer engaged. So if you are chatting on a website to check the status of your order, you may already be chatting with a robot!

February 18, 2018

Bring the power of Excel to Oracle EBS with Web ADI

Very often we come across business users keeping an Excel sheet beside them and keying the data into Oracle EBS. If you ask them about automating the process, most often the only alternative they are aware of is sharing the file in a predefined CSV format with IT support and having them upload it. What they don't like about this is that any errors have to be shared back with them by IT support, and the time and effort involved in the initial upload and the error correction is significant. They feel the effort is not really worth it, and that they are better off keying the data in manually! It is surprising that a technology such as Web ADI (Oracle Web Applications Desktop Integrator) is not more commonly used to automate such manual data entry when the data already exists in Excel or other documents. Much of the time, customers hire temps or interns to key this high-volume data into the application.

Without going deep into its architecture, I would describe Web ADI as an EBS capability that enables us to create Excel sheets that automatically connect to EBS and perform the function they are configured for. When Web ADI is implemented, the Excel workbook downloaded from EBS has a new menu for Oracle.


In my experience so far, whenever we have demonstrated the capabilities of Web ADI, the customer response has been enthusiastic. The features that excite them the most:

  • You don't have to log in to EBS or navigate/download the Excel every time. You download the Excel once; after that you can access it just by opening it from the desktop. If EBS SSO (Single Sign-On) is enabled, it automatically recognizes who you are; if not, a pop-up from Excel requests you to log in.
  • You have all the features of Excel at your disposal: you can drag a cell, copy-paste data, use formulas to derive data, filter for certain values, and have your own macros.
  • You have the option to selectively upload only certain rows from the Excel.
  • If the solution involves launching a concurrent program in the background, you can monitor its status from the Excel itself, without having to log in to EBS.
  • Business validations can be built into the Excel to show any issues to the user in the Excel itself. The user can choose to correct an error record, or skip uploading it and proceed with the others.
  • You can work with the Excel offline. Of course, the user needs to be connected to validate or upload the data.
  • The Excel can be configured for Lists of Values (LoVs): double-clicking on the Excel cell opens an HTML page with the capability to search for valid values and select one.
  • The Excel sheet can also be configured to have drop-down values.
  • When a user wants a report in Excel periodically, Web ADI can be configured for reporting only. The user just needs to open the Excel from the desktop and refresh.

Being a module within Oracle EBS, Web ADI inherits the entire EBS security framework. When users download or upload data from Excel, they do so as an EBS user under a valid responsibility. User- and responsibility-specific restrictions can be implemented; for example, data can be validated only by one user/responsibility and uploaded by a different one.

There are of course some areas where Web ADI is not a good option. Web ADI supports a user interface (UI) with only one master-child relationship: one header and, optionally, multiple child records. If we need a data upload whose UI has more than one master-child level, it may not be fit for purpose. For example, Web ADI is good for mass-updating the lines of a specific purchasing contract, but not for a requirement where we need to update the lines, shipments, and distributions of a purchase order. While such features can technically be implemented, the solution may not be particularly user friendly. Moreover, Web ADI should not be seen as an alternative UI or technology for a Forms or OAF based UI, since it is not possible to dynamically enable/disable columns, show additional info/warning/error messages on user keystrokes at field level, or move control dynamically to different fields in the Excel.

That said, here are the best opportunities to use Web ADI effectively:

  • Where data needs to be created or updated in bulk, and the data may already exist in Excel.
  • Where we need the capability to work with the data offline and sync it up with server when online.
  • Where we may have to review a set of data and correct it - Web ADI can download the data that qualifies the condition into Excel. The user can make the changes and upload back.
  • Existing concurrent program based solutions where a flat file is shared with the IT support team, who upload it and share the error records back for correction and re-upload.
  • Reports that you want to refresh frequently.

Excel being Excel, most users will be comfortable with it and will not need any special training. Web ADI, being a technology module within EBS, does not need any additional license: if you already have EBS, you can apply the required patches and start using it. Looked at the other way, it is a module the customer has already paid for and is not using! Identifying opportunities to implement Web ADI based Excel solutions is something everyone on Oracle EBS should consider exploring for a better return on investment.

February 9, 2018

Getting Started with PeopleSoft Campus Solutions

Overview on PeopleSoft Application

·         PeopleSoft is a comprehensive, integrated ERP system

·         Various domains e.g. FSCM, HRMS, Campus solutions are integrated in the product

·         Decision-making flexibility is enhanced by the integration of data between the different domains in PeopleSoft

·         Reporting can be summarized or detailed, depending on the requirement, from a single domain or a consolidation of two or more domains

·         A robust audit trail enables management to track user login transactions, creating accountability across the organization

·         Reports and data can be synchronized for different accounting periods

Campus Solutions (CS 9.0, 9.1, 9.2) Business Processes:

·         Student Management:


o   Recruit Prospects

o   Admit Students

o   Enroll in Classes

o   Advise & Transition Students

o   Manage Financial Services

o   Manage campus services

·         Academic Management

o   Schedule courses & Resources

o   Plan Academic Programs

o   Teach courses

·         Institutional Advancement

·         Campus Community

·         Personal Information Management

·         Maintaining Biography/Demographic/ Health Data

·         Maintaining Health Data

·         Maintaining Identification Data

·         Maintaining Participation Data

·         Organization Data Management

·         Maintaining Organization Data

·         Maintaining Event Data

·         Maintaining Committees

·         The 3Cs

o   Communications

o   Checklists

o   Comments

The aspects listed above are the main modules in PeopleSoft Campus Solutions. These modules are further divided into various subject areas when creating analytical reports in OBIA. This getting-started guide provides the basic approach and functional ideas needed to begin with Campus Solutions and to build new subject areas and data models by customizing the existing out-of-the-box (OOTB) OBIA application in both ODI and OBIEE.


PeopleSoft Campus Community: Stores data about all the people who are connected with, or interested in, the university, serving as the basic information store for those individuals. It includes:

·         Personal Information Management

·         Organization Data Management

·         Service Indicators: These can be used as flags to grant or withhold certain services for individuals or organizations, by assigning positive or negative service indicators (e.g., a student with attendance below 75% will not be able to see his/her grades).
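As an illustrative sketch of the attendance example above (the threshold mirrors the text, but the indicator name and field names are hypothetical, not the actual PeopleSoft implementation):

```python
# Sketch of a negative service indicator: a hold that blocks a service
# (here, viewing grades) until the underlying condition is cleared.
# "NEG_GRADE_HOLD" and "attendance_pct" are made-up names.
def assign_service_indicators(student: dict) -> set:
    indicators = set()
    if student["attendance_pct"] < 75:
        indicators.add("NEG_GRADE_HOLD")  # negative indicator
    return indicators

def can_view_grades(student: dict) -> bool:
    return "NEG_GRADE_HOLD" not in assign_service_indicators(student)

print(can_view_grades({"attendance_pct": 80}))  # True
print(can_view_grades({"attendance_pct": 60}))  # False
```

The key design point is that the indicator, not the raw attendance figure, is what downstream services check, so one rule can gate many services consistently.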

Recruiting & Admission:

It administers the admission process by managing recruits and tracking prospects and applicants. (A prospect is a person who approaches the university for admission and may become a student.) It includes the following:

·         Capture information about the prospective student

·         Maintain multiple applications per individual; PeopleSoft allows admission to both UG and PG programs

·         Tailor admission requirements and processing for each academic program.

·         Update/modify prospect status without manual intervention, based on a program's individually defined criteria

·         Automate evaluations and updates of admission decisions

·         Eligibility Check, Biographical Details, Regional, Prospect Career Data (Program - UG/PG), Prospect Schooling, Test Details, Query from Prospects

Student Records & Enrollment

Student Records enables you to enter, track, and process all academic information for students. The major features of Student Records are:

    •         Enroll students
    •         Student term information (1st term, 2nd term)
    •         Career and program information (UG/PG)
    •         Transcripts (mark sheets)
    •         Student background information
    •         Advise notes (teachers can provide advisory notes)
    •         Extracurricular activities

Below are details about some important aspects of Student Records & Enrollment. We discuss these components in detail because Student Records is one of the most important subject areas used in OBIA.

    •         Repeat Checking: Allows the teacher or administrator to manage students' repeated coursework.

·         Course Catalog: Allows the supervisor/advisor to arrange courses for prospects.

·         Enrollment Requisites: Allows the supervisor to arrange the necessary requirement groups & course lists.

·         Schedule of Classes: Allows the supervisor to set up classes, search for amenities/facilities, and roll the schedule forward from semester to semester.

·         Instructor Workload: Allows supervisor to modify, track, and report workload hours for individuals.

·         Program Activation and Management: Allows the supervisor to activate students into academic programs and manage their academic program, plan, and sub-plan data.

·         Batch Term Activation: Enables you to activate groups of students into terms.

·         Quick Activation: Enables you to activate students into academic programs, bypassing the Activate Applications matriculation process (ABPCPPRC) in PeopleSoft Recruiting and Admissions.

·         Academic Statistics Consolidation and Reporting: Allows you to prepare the system to consolidate academic statistics for students, run the consolidation processes, and make use of the consolidated statistics after processing them.

(CAN) Canadian Government Reporting: Enables users with an installation country of Canada to generate reports for federal and provincial agencies.

Curriculum management:

·         Scheduling courses and classes: class dates, timings, discussion components (course creation)

·         View and update class sections

·         Attendance roster by class: present/absent/late

·         Attendance roster by student

·         Defining course assignments (e.g., surprise tests)

·         Instructor/advisor information

Student Financial Module:

The Student Financials module in PeopleSoft Campus Solutions deals with the education fee structure of students in an institution. It holds account information for every student according to their course, term, and year. There can be exceptions: students taking courses that belong to a free-course category may not have records in this module. The module consists of several sub-modules:


    •         Account Type (S/F account type)
    •         Item Type (charge/payment/deposit/concession)
    •         Criteria (UG/PG)
    •         Term Fee (75000)
    •         Tuition Group (B. Tech 2018)
    •         Target (the account into which the money goes)
    •         Tender (cash, demand draft)
    •         Student Financial Aid

The Financial Aid business process helps automate federal and institutional financial aid processing with greater efficiency. It also allows administrators to manage all aspects of financial aid activity for applicants and prospects with much flexibility.

With this application, you can:

    •         Receive and track financial aid applications in a more intuitive way
    •         Design cost-of-attendance assessments by defining budget categories, items, and formulas
    •         Manage loan processing, packaging, and distribution based on need analysis
    •         Process and calculate different grants and awards, e.g., Pell awards
    •         Provide financial aid to needy and eligible students
    •         Modify/update financial aid letters
    •         Maintain federal compliance regarding direct loans and other aspects
    •         Enable students to accept, decline, or view awards granted to them
    •         Enable administrators to offer work-study program opportunities to students

Gradebook Business Processes

Below are the main aspects of Gradebook processing:

    •         Entry of class assignments for every class in a particular department
    •         Entering a grade for every assignment for a prospect
    •         Allowing the supervisor to export the grades for a particular class and assignment
    •         Enabling the supervisor/advisor to review any assignment
    •         The ability to review the grades assigned to any prospect

February 7, 2018

Data Visualization in OBIEE 12c


Data visualization is the presentation of data in a pictorial or graphical format; it can surface a complex problem that might easily be overlooked, and it makes the data easier to grasp through graphs, patterns, and design. It is useful for converting real-time data into rich reports and visualizations. Effective data visualization should be informative, efficient, appealing, and, in some cases, interactive and predictive.

It lets us see analytics presented visually, which helps us grasp difficult concepts and identify new patterns easily. Interactive visualization adds further possibilities: drilling down into charts and diagrams for more detail, and interactively altering the data and how it is processed.

Data visualization also has benefits:

1. Absorb information in new and more constructive ways.

2. Visualize relationships and patterns between operational and business activities

3. Identify and act on emerging trends faster

4. Manipulate and interact directly with data

5. Foster a new business language

In the sections below, we explain how it adds value to an organization.

How can it be used?

Advanced data visualizations support more in-depth and complex analytics. A visual tier that sits on top of the analytics program allows users to view the results of complex algorithmic processing. Not only can they get insight into what's happened, they can forecast what might happen, using rich graphics to quickly derive business actions. The fact is, when people transition from spreadsheets to data visuals, they are able to register the values they are seeing as a whole.

Consider a traditional tabular report. Now consider the ability to visualize every line of it as a chart: a simple visualization can make the dry data come alive.


One puts the emphasis on the raw numbers; the other reveals the shape and patterns of the data through our sense of sight. We need both for better decision making.
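To make the contrast concrete, here is a minimal sketch (with made-up sales figures) that renders a small table as a text bar chart. A BI tool such as OBIEE produces far richer visuals, but the principle, turning numbers into shapes the eye can compare, is the same.

```python
# Render a tiny table as a horizontal text bar chart, so the relative
# shape of the data is visible at a glance. Figures are hypothetical.
sales = {"North": 120, "South": 45, "East": 90, "West": 60}

def bar_chart(data: dict, width: int = 30) -> str:
    peak = max(data.values())
    lines = []
    for label, value in data.items():
        bar = "#" * round(value / peak * width)  # scale to the widest bar
        lines.append(f"{label:<6} {bar} {value}")
    return "\n".join(lines)

print(bar_chart(sales))
```

Even in this crude form, the outlier and the ranking jump out immediately, which is exactly what the table alone does not give you.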

What is the Scope of Data Visualization in OBIEE?

OBIEE has underpinned the BI Cloud Service for a long time, and with the arrival of data visualization in OBIEE 12c it is on a par with larger tools such as Tableau and QlikView. These tools support various cutting-edge techniques, for example overlaying, brushing, and outlier selection.



1. Easy to Use - OBIEE 12c is reasonably easy to use; users can examine data by overlaying it on a geographical map or by decoding the dashboard visuals by selecting outliers.

2. Uploading Mash-Ups - OBIEE 12c comes with a data mash-up capability that makes uploading data online and creating reports easier. Users can perform tasks such as uploading files and data sets or building ETL processes themselves, without support from IT professionals.

3. Simple Export - In OBIEE 12c the RPD can be exported without changing the target system's connection pool details, so exporting and migrating data sources does not require reworking the connection pool configuration.

4. Additional Features

·         Data visualization allows users to reuse results from existing OBIEE reports.

·         Data can be sourced from external databases.

·         Users can edit columns, create new calculations, etc. on the uploaded data.

·         The upload limit for data files has been increased to 50 MB.

·         Color schemes can be modified, with the choice of changing the entire dashboard or only individual visualizations.

·         Users can add text boxes to their data and comments for other users.

·         Presentation Mode permits embedding a Data Visualization project into an OBIEE dashboard and hides the editor components; since projects can be made read-only, a report can function almost like a "classic" OBIEE dashboard report.


Sathya Bhama & Smriti

February 1, 2018

Introduction to Oracle Analytics Cloud - OAC

In this blog I will cover what OAC is and why you would use it. In my next blog, I will deep-dive into its features.

OAC is Oracle Analytics Cloud, a PaaS service offered by Oracle. It was launched in 2014, and around 1,400+ customers were already using it as of May 2017.

As we all know, BICS is already in the market and is offered as SaaS by Oracle. So what is new, and why we need OAC, is what we look at next in this blog.

BICS is Oracle-managed while OAC is customer/partner-managed - a major change in Oracle's cloud offerings.

Below are the three OAC editions offered by Oracle:

1)       Standard Edition (Data Visualization + Essbase)

2)       Data Lake Edition (Standard + Big Data Source)

3)       Enterprise Edition (Data Lake + Essbase Enterprise)



Across these editions, the components include: Data Lake, Data Visualization Cloud Service (DVCS), Oracle Smart View, Oracle Essbase Standard, Big Data sources, Data Modeling, Oracle Essbase Enterprise, and Oracle Day by Day.
Now that we know the various offerings in OAC, let us understand the difference between BICS and OAC.

Difference between BICS and OAC

The comparison between BICS (and DVCS today) and OAC spans automatic backup, patching and upgrades, CPU and memory, server access and configuration, network and security configuration, database dependency, and IaaS dependency:

·         CPU & Memory: in OAC, the client/partner can decide and control these.

·         Server Access & Config: in OAC, the client/partner can access the server through SSH.

·         Network & Security Config: limited support in BICS; in OAC, the client/partner has complete control over network access.

·         DB Dependency: in BICS, you have to buy Schema as a Service (BIS50); in OAC, the customer needs to provide at least a 10-CPU DBaaS.

·         IaaS Dependency: in OAC, the client is required to buy some IaaS storage.

Various sources supported by OAC include SaaS/PaaS applications, databases, big data sources, and generic sources.

To get started with OAC instance setup, we need the below prerequisites:

OSCS - Oracle Storage Cloud Service: backup, restore, and archive

Block Storage - Oracle IaaS offering for flexible and scalable compute and block storage

DBCS - Database Cloud Service: the DBaaS service offered by Oracle

Why OAC? What's new in OAC?    


I will deep-dive into the features and advanced features in my next blogs on OAC. Thanks.
