Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.


September 30, 2017

Digital HR - Redefining traditional HR for the future!

 

'Digital Organization', 'Going Digital', 'Digital Transformation' - these emerging terms are still being defined in our times. We are also seeing roles like Chief Digital Officer created in enterprises to drive the Digital Transformation!

To most enterprises, 'Digital' could simply mean the changes and disruptions brought by IoT, Social, Mobile, Analytics, Cloud, or other associated technology innovations, depending on what they want to achieve with Digital.

The key question remains the same: do you call an enterprise a 'Digital Organization' just because it rolled out these technology changes, or is there a bigger story - a bigger impact - that goes well beyond these seemingly technological advances?

No one seems to have leveraged these disruptive ideas fully. No one seems to have understood them completely, either.

Peeling back the layers of technology associated with a 'being Digital' campaign, enterprises clearly find that Digital Transformation reaches far beyond mere technology advances or changes to the user and employee experience. The impact of Digital starts at the level of organization strategy, flows through operating processes, and ultimately trickles down to the core of any enterprise - its people.

Why Digital HR?

We believe that Digital is rapidly changing the definitions of work breakdown, workforce and work location for an Enterprise.

Here are some instances which corroborate this hypothesis:

The HR function in an enterprise is impacted by the 'Social' paradigm of Digital: building social platforms for collaborative work, sourcing suitable talent, learning through collaborative channels, encouraging knowledge sharing via blogs, and building a social persona.

'Enterprise Mobility' allows the workforce to connect to work from anywhere, anytime, improving the employee experience. It is also redefining the term 'work location', which is now completely virtual compared with the 'nn-seater office' of old.

The Internet of Things creates a Connected Enterprise, bringing in efficiencies through automation, while Artificial Intelligence supports a workforce that can focus on more value-added work.

Modern SaaS / Cloud applications bring user-friendly business processes and improve the productivity of the workforce. They are changing the technology landscape of the enterprise for the future. These platforms are typically offered on OpEx license models and bring a lower TCO.

Enterprises in the Digital era thrive on data-driven decisions and should no longer settle for static 'Reports' or 'Dashboards'. Data will have a broader meaning in future, and enterprises will need predictive and prescriptive HR analytics to handle the workforce of tomorrow.

All these paradigms impact the basic business of any enterprise - 'work'. Increasingly, enterprises have to focus on acquiring skills for the future and refactoring existing talent as part of workforce planning to handle the new Digital workplace.

Hence, there is an inseparable relation between these Digital advances and the HR function in an enterprise, as these changes transform the work, workforce, workplace, and finally the very operating model of the HR function itself, as shown in the picture below:

Consequently, it is imperative that the HR function in any enterprise embarks on the 'Digital' journey first, ahead of any other group, to steer the enterprise into the Digital era.

Our take on building a 'Digital HR' Organization / Enterprise

We believe that enterprises should focus on modernizing their HR function through a set of initiatives that align with the enterprise-level Digital strategy.

Here are the key HR initiatives which enterprises need to undertake as part of HR Digital Transformation to truly become a 'Digital HR':

  • HR Process Digitization - removing paper-based HR processes in every corner of the organization and replacing them with systemic HR processes / transactions.

  • Integration of Information Islands - ensuring that no part of the HR landscape is isolated or holds redundant information. This also ensures a single source of truth for HR data.

  • Globalization and Internationalization - to enable global deployments and compliance

  • Information Security - ensuring compliance to data privacy laws and PII data protection across the HR application

  • Data-Driven Decision Making - building HR analytics for the future to provide prescriptive and predictive insights, enable data-driven decisions, and capture HR KPIs.

  • Enhance Employee Experience - by building the employer brand and introducing mobility, a collaborative work culture, differentiated career planning, innovative CnB packages, and learning beyond the current offerings.

  • Onboarding to SaaS / Cloud platforms - to leverage HCM Cloud solutions with a best-practice-based approach and flexibility in the total cost framework.

At this year's Oracle OpenWorld 2017, join our Infosys session for firsthand knowledge and advice about Digital Organization & Digital HR - covering our point of view and offerings in all these areas - that practitioners can take back and implement within their organizations.

September 29, 2017

Emerging Trends in SSHR - Part 2

Latest Trend in Self-Service Technology

1.      HR in Cloud

When everything in the world is going the Cloud way, self-service applications are not far behind. They are accessible anywhere a network is available. With one click, employees have the flexibility to access their data and own it.

2.      HR in Mobile/Tablet

Personal use of smartphones is growing exponentially and there is an ever-greater use of tablets by consumers. Consumer-driven use of these tech devices is shaping up as a benefit for businesses, including those in charge of human resource management. More companies are seeing value in allowing their employees to access HR & payroll data from their smartphones, tablets, etc.

3.    App driven technology

With an ever-expanding mobile workforce, there is business value in giving mobile access, with an app-driven preference, for major HR functions like absence, talent, training, and separation management. Many market players deliver mobile-based HR apps (Applaud, Kronos, Sage, Workday, etc.).

Key Benefits with Latest HR Edition

1.      New User Interface

The new HRMS self-service feature set should give a cleaner look and feel, with responsive behavior that gives more space for controlling the application with your fingers on a tablet while saving space on a desktop. Below are some of the new features in the latest edition:

  • New skin
  • Rich table interactions
  • Simplified home page
  • Support for gestures on tablets

2.      Managing multiple user-types

The system should give enough flexibility for different types of users to use the application depending on their role, the data they need access to, their data access set, and so on. Broadly, we can classify users into the following categories:

1.       Business Users

2.       Administrative users (HR Users)

3.       Designers /Consultants (Engineers)


How Business users are benefitted

Business users mainly consist of employees who need access to their own data - to view or update information or to raise requests. End users should have access to their data anytime and anywhere, and should be kept up to date with HR policies and procedures. When users get instant answers to questions about their pay or benefits, they feel more empowered, more in control - and more content at work.

 

How Administrative users are benefitted

Administrative users may include line managers / HR users / HR administrators who need access to their subordinates' data. A manager may have many subordinates whose complete details he or she would like to view, or for whom he or she may initiate requests such as a transfer. In some cases, a manager can also raise requests, such as an absence request, on behalf of subordinates in their absence. Self-service use focuses on maintaining a healthy and happy workforce: it increases efficiency, empowers managers and encourages autonomy.

Another set of administrative users are HR administrators who are dedicated to a particular unit or division, and who require access to the employees under that unit to submit transactions on their behalf. One of the major problems faced by HR departments is that end-of-month and end-of-year processes can create a huge admin bottleneck, which in turn requires additional HR staff. By balancing administrative tasks across your organization, you can maintain a more manageable head count in your HR department.

This administrator basically has the ability to perform selected operations for a subset of authorized users within the domain of a particular unit. Approvers who approve particular transactions are also administrative users. They are notified of transactions as per the configured requirements, and only upon successful approval does the data get updated in the core HRMS system. Data management is definitely simplified, secured and made more efficient using a self-service application.

 

How IT Consultants are benefitted

The third set of users are the consultants / designers who actually design the application. They need access to the complete configuration and a system-administrator type of role to control all the configured actions.

Many market players, including the latest edition of Oracle HCM Cloud self-service, provide all these benefits to end users.


Multi-Language Implementation in Oracle BICS

To enhance the user experience for a global customer, it is a common requirement in Oracle Business Intelligence Cloud Service (BICS) reporting to display metadata or data in multiple languages. The implementation differs between on-premise BI and BICS. The next few steps explain how to achieve this in BICS.

BICS is similar to on-premise OBIEE but is a thin-client interface and lacks some of the more advanced features. BICS only provides the BI components in multiple languages and has not yet released support for multi-lingual behavior of metadata and presentation catalog components.

 

The approach below helps when the implementation follows the pure data-modeler approach, not a lift-and-shift.

 

Step#1: Out of the box UI display

This is out-of-the-box functionality available in the basic BICS UI. It can be enabled by simply changing the Language setting in My Account to the user's preferred language.

[Image: Language setting in My Account]


This results in the OBIEE components and navigational tag names being translated, as shown below (the language has been changed to French in this example):


[Image: BICS UI displayed in French]

Step#2: Data in dimensional columns

The language selected in the My Account tab for a user is captured in the session variable 'WEBLANGUAGE', which stores the language code of the language selected by the user.

Below are the values of this variable for a few languages.

For English, WEBLANGUAGE='en'

For French, WEBLANGUAGE='fr'

For Spanish, WEBLANGUAGE='es'

This session variable can further be used to identify the user's selected language preference and accordingly modify the user's display dynamically.

We should have all dimensional column data available in each language we would like to display.

For example, in the sample table below, ColumnA_EN holds values in English while ColumnA_FR and ColumnA_ES hold the French and Spanish translations respectively:

 

ColumnA_EN           | ColumnA_FR              | ColumnA_ES
-------------------- | ----------------------- | -----------------------
Technology Services  | Services technologiques | Servicios de Tecnología
Corporate Solutions  | Solutions d'entreprise  | Soluciones Corporativas

 

Once we have data in the above format, create dynamic columns in the BICS model which will pick the appropriate column from the table based on the user's language selection in the My Account tab.

Use the column formula below to dynamically pick up the translated data column based on the value of the session variable.

INDEXCOL(CASE VALUEOF(NQ_SESSION.WEBLANGUAGE) WHEN 'en' THEN 0 WHEN 'fr' THEN 1 ELSE 2 END, ColumnA_EN, ColumnA_FR, ColumnA_ES)
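To make the selection logic concrete, here is a small Python analogue of what INDEXCOL does (purely illustrative, not BICS code; the data and column names mirror the sample table above):

```python
# Map the WEBLANGUAGE code to an index, exactly as the CASE expression does.
# Any unknown language falls through to index 2, like the ELSE branch.
LANG_INDEX = {"en": 0, "fr": 1, "es": 2}

def pick_translation(row, weblanguage):
    """Return the value of the column selected by the user's language code."""
    candidates = [row["ColumnA_EN"], row["ColumnA_FR"], row["ColumnA_ES"]]
    return candidates[LANG_INDEX.get(weblanguage, 2)]

row = {
    "ColumnA_EN": "Technology Services",
    "ColumnA_FR": "Services technologiques",
    "ColumnA_ES": "Servicios de Tecnología",
}

print(pick_translation(row, "fr"))  # Services technologiques
```

The key design point is that the translation choice happens per session, not per table: one dynamic column serves every language, so reports never need to be duplicated.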

 


 

Step#3: Metadata

Metadata display in multi-language means displaying the table and column headings in the Presentation layer in multiple languages.

 

Create a table structure as below, with names (columns, prompts, titles, etc.) in English and their respective translations:

 

MULTI_LANG_TABLE

Base_C | Translated_Name | Lang_Code
------ | --------------- | ---------
City   | City            | En
City   | Ville           | Fr
City   | ciudad          | Es

 

Each metadata column name to be translated (in this example, City) will have 3 rows in the table (one per language) with different Translated_Name and Lang_Code values.

 

Corresponding to each metadata component that requires translation (column names, prompt names, etc.), create a session variable as below, based on the table in the database.

SV_L_BASELINE:

SELECT TRANSLATED_NAME FROM MULTI_LANG_TABLE
WHERE UPPER(LANG_CODE) = UPPER('VALUEOF(NQ_SESSION.WEBLANGUAGE)')
AND UPPER(BASE_C) = UPPER('City')
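The lookup this session variable performs can be sketched in Python (an illustration only; the rows mirror the MULTI_LANG_TABLE sample above, and the fallback to the base name is an assumption for the sketch, not documented BICS behavior):

```python
# Rows mirror MULTI_LANG_TABLE: (Base_C, Translated_Name, Lang_Code).
MULTI_LANG_TABLE = [
    ("City", "City",   "En"),
    ("City", "Ville",  "Fr"),
    ("City", "ciudad", "Es"),
]

def translated_name(base_c, weblanguage):
    """Emulate the session-variable query: match base column and
    language code case-insensitively, as the UPPER() calls do."""
    for base, name, code in MULTI_LANG_TABLE:
        if base.upper() == base_c.upper() and code.upper() == weblanguage.upper():
            return name
    return base_c  # assumed fallback: show the untranslated heading

print(translated_name("City", "fr"))  # Ville
```

One such variable per heading means each column label resolves independently at session start, which is why the number of session variables tracks the number of translated components.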

 

One session variable is required per metadata component to be translated (number of columns = number of session variables).

 

In the column headings for columns in analyses or prompts, use the syntax below to capture the value of the session variable for that column:

[Image: column heading formula referencing the session variable]

 

Individual dashboards where session variables were used in columns, prompts, titles, etc. will display in the user's preferred language. The presentation layer itself in the Analysis criteria, however, will still appear in the language in which it was created (in our case, English).

The above 3 steps will help achieve a multi-language implementation in BICS.

September 28, 2017

Oracle Analytics Cloud (OAC): Comprehensive Analytics platform on cloud with more control

As we eagerly await this year's Oracle OpenWorld for new announcements in the Analytics space, it is worth recapping what has happened in the last year. As per its pledge at the last OpenWorld, Oracle released its new-generation, comprehensive PaaS (Platform as a Service) version of Analytics, with options for customers to configure and manage the instance themselves. I got a chance to build a POC for one of our customers using Oracle Analytics Cloud and would like to share my first-hand experience through this blog.

The product is available in the two versions below, with the components mentioned:

1) Standard Edition: Data Visualization + Essbase (Standard Edition) + Data Visualization Desktop

2) Enterprise Edition: Data Visualization + Modelling & Enterprise Reporting (BICS) + Essbase (Enterprise Edition) + Data Visualization Desktop + Day by Day mobile app


       Fig1: Schematic diagram of Oracle Analytics Cloud (OAC) 

As we can see from the product options, OAC is like OBIEE 12c on the cloud, with Data Visualization and Essbase plus enhanced features. The Standard edition is more suitable for departmental data analysis / visualization needs, whereas the Enterprise edition is a complete package to meet organization-wide analytics needs. I am sharing here the key differentiating features which I find most appealing for existing OBIEE customers and for prospects.

Control over configuration and maintenance

OAC provides complete control over the sizing of the cloud environment as well as its maintenance. OAC licensing is unbundled compared to "Business Intelligence Cloud Service" or "Data Visualization Cloud Service": customers need to buy a separate IaaS (Infrastructure as a Service) subscription and Oracle Database Cloud Service. This means customers can choose the OCPU-RAM combination depending on their requirements.

We can now connect to the server through SSH and make the required configuration file changes, similar to an on-premises environment. We can also decide when to apply patches and carry out maintenance activities.

Complete application lifecycle management activities, such as starting/stopping, provisioning of services, and backup, can be managed either through the front end or via REST API based scripting. We can also control the OCPU allocation to OBIEE, DVCS and Essbase.

Customers can bring their own on-premises DB license and host the data warehouse on the cloud using Oracle Database Cloud, or buy a new license. The most fascinating feature of OAC is that it can connect to an on-premises database through a VPN connection, so one can get the benefits of cloud analytics without changing the ETL / data warehouse architecture.

Flexible Pricing

As stated earlier, to run OAC, customers need to buy additional IaaS and database licenses as per their convenience. Licensing of the OAC platform itself is based on the OCPU-RAM combination instead of the user-based licensing of SaaS (Software as a Service) cloud products. Customers can thus provide access to an unlimited number of users, similar to an on-premises environment, with appropriate IaaS and database sizing (for DVD there is a cap of 50 users/OCPU).

OAC can be subscribed to with either a metered or a non-metered OCPU license. The metered license offers monthly/annual hourly rates or a Pay as You Go option; customers can run their environment when it is actually needed and turn it off when it is not. The non-metered license is bought at monthly subscription rates and is better suited for global organizations working across multiple time zones. OAC thus provides the flexibility to scale capacity up or down as per the customer's need, with no lead time. For more details, please refer to the Oracle website.

Essbase on cloud

OAC does come with an Essbase cloud component, tailored mostly for what-if or MOLAP-style analysis rather than as a standalone, full-fledged on-premises Essbase application. It has a Cube Designer Excel add-in along with pre-defined Excel workbook templates, enabling single-step cube creation and easy data loading. One will still miss the control and management options available through the Essbase Administration Services console (EAS): data load rules must be defined in the templates, and there is no other way to change them. Reassuringly, the Smart View Excel add-in is still available for reporting on cubes. I strongly feel Oracle will enhance this service in upcoming OAC releases.

Easy Data Modelling and Deployment

We can use the BICS data modeler for simple data modelling needs, or the OBIEE 12c Admin Tool to build and manage more complex RPD models. Access to the RPD through the Admin Tool provides full-fledged design capabilities similar to those of an on-premises environment.

Customers can easily archive the catalog, security model and RPD in a BAR (BI Application Archive) file and migrate it to another environment in a single click. Similarly, the RPD can be deployed from one environment to another. OAC thus provides seamless migration between cloud instances, and also from on-premises to cloud or vice versa.

Ability to prepare data by end users easily

OAC offers enhanced out-of-the-box connectors to 25+ on-premises and cloud databases / SaaS applications in DVCS. This is especially helpful for SaaS applications, as it provides direct access to the business entities of those applications (e.g. Incidents and Organizations in RightNow / Service Cloud). We can expect more additions to the current list in upcoming releases.


Fig 2: Out-of-the-box data sources and data flow in Oracle Analytics Cloud (OAC)

Users can also perform light ETL operations, such as joining data sets, concatenating columns, and aggregations, in DVCS. This will greatly help end users with their data preparation needs, reducing dependency on IT.

Extensive Visualization and Advanced Analytics Capability

OAC comes with an expanded list of palettes and chart types in DVCS, such as Sankey, Chord Diagram and Network. Users also get single-click access to advanced analytics features such as clusters, outliers, reference and trend lines, and forecasts directly on graphs/charts.



Fig3: Visualizations available on OAC

The most promising feature of OAC is the out-of-the-box availability of advanced analytics functions such as Regression, Cluster, Outlier and Trend Line for creating calculated fields. We can now embed custom R or Python code using the EVALUATE_SCRIPT function, which opens a gateway to machine learning, advanced statistical modelling and text mining.

Missing dots

I observed a few important gaps in the current version of OAC which, for now, prevent it from meeting the overall BI and analytics needs of customers, though Oracle has marked some of them on its roadmap for the next year or so.

  • Business Intelligence Publisher (BIP): As we know, BIP is an integral part of Oracle BI offerings for pixel-perfect document generation, and it is currently missing in OAC. For my POC exercise, I had to install it on a Java Cloud Service by following a complex configuration process.
  • Delivers: OAC does not offer the full delivery feature set, such as attachments in different formats; it only delivers the report/analysis content in the email body itself.
  • Security: OAC does not provide options for configuring security such as enterprise LDAP integration or a custom credential provider.
  • Action Framework: OAC only provides within-catalog and URL-based navigation actions. Customers also expect JavaScript actions, EJB actions, web-service-based actions, etc.
  • Customization on VM: Currently there is no mechanism to retain custom configuration changes on the VM; they get overwritten during patching / upgrades.
  • Custom Map Layers in Answers: Users cannot add custom map layers in Answers, though the same can be added in DVCS.

I have not covered one of the most exciting features of OAC, the Oracle Day by Day mobile app; I will cover it in my next blog, as it deserves more detail.

I would like to conclude this article by stating that OAC is a comprehensive cloud analytics platform that can meet the end-to-end BI and analytics needs of an organization. It strikes the right balance by providing the advantages of the cloud with on-premises-like control.

OBIEE: An effective tool for quality control in Credit Bureau Reporting by Auto-Finance Companies

Auto-finance organizations in the US have to report their customers' credit data every month to the Credit Reporting Agencies (CRAs) - Experian, Equifax, TransUnion and Innovis - to comply with the FCRA (Fair Credit Reporting Act). For this, they have automated software programs which extract account and consumer data from their source systems and transform/load it, per defined business logic, into data warehouse tables before it is finally sent to the CRAs in the Metro 2 file format. This process is called 'Credit Bureau Reporting'.

To ensure the integrity and accuracy of reported data, it is a commonly adopted and recommended practice to perform checks both before and after the credit data transmission. Brief details of these checks are given below:

 

Pre-Transmission Checks: The following are general pre-transmission checks performed on customer credit data. Corrective actions are to be taken for any issue observed during these checks. This can involve stopping the reporting of the affected accounts to the CRAs for the current reporting month if needed, analyzing the root cause of the discrepancy, and fixing the issues so that the accounts are properly reported either in the same reporting month or from the next reporting month onwards:

 

  1. Exclusion of manually identified or selected accounts using some predefined criteria

  2. Various kinds of data quality checks; some examples are given below:

    1. To find the accounts where both SSN (Social Security Number) and DOB (Date of Birth) are null for one or more consumers

    2. Invalid ECOA (Equal Credit Opportunity Act) Code value check for various consumers

    3. Invalid Payment Terms Frequency value check

  3. Validation of raw data, which is generally sourced from the client's legacy systems and processed per the defined logic to prepare the account/consumer credit data for credit bureau reporting. Raw data is useful for validating the values of various parameters in the Metro 2 file against the values in the source systems.

  4. Picking random data samples of various types of accounts (e.g. current / delinquent / charged-off) to be reported and checking their data for correctness
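Two of the checks above can be sketched in Python (an illustrative example, not an actual implementation; the record field names and the ECOA code list are assumptions for the sketch):

```python
# Assumed set of allowed ECOA codes for the illustration; the real list
# must come from the Metro 2 specification in use.
VALID_ECOA_CODES = {"1", "2", "3", "5", "7", "T", "W", "X", "Z"}

def missing_ssn_and_dob(consumers):
    """Consumers where both SSN and date of birth are null/empty."""
    return [c for c in consumers if not c.get("ssn") and not c.get("dob")]

def invalid_ecoa(consumers):
    """Consumers carrying an ECOA code outside the allowed set."""
    return [c for c in consumers if c.get("ecoa") not in VALID_ECOA_CODES]

# Hypothetical consumer records for the example.
consumers = [
    {"id": 1, "ssn": None, "dob": None, "ecoa": "1"},
    {"id": 2, "ssn": "123-45-6789", "dob": "1980-01-01", "ecoa": "9"},
]

print([c["id"] for c in missing_ssn_and_dob(consumers)])  # [1]
print([c["id"] for c in invalid_ecoa(consumers)])         # [2]
```

Accounts flagged by such checks would then be held back from the current month's Metro 2 file while the root cause is analyzed, as described above.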

 

Post-Transmission Checks: After transmitting customer credit data to the CRAs, the reject reports / feedback files received from them need to be extracted and analyzed. Examples of issues in the reject reports include an invalid name / address / date of birth, or an anomaly in the financial data of reported accounts in the Metro 2 file.

Using the Power of OBIEE: For pre- and post-transmission checks, and for preparing the various business intelligence reports useful to the auto-finance business, many reporting tools are available in the market, such as SAP BO, Tableau, QlikView, IBM Cognos and OBIEE (also known as Oracle BI). Among these, one of the most widely used and effective is OBIEE (Oracle Business Intelligence Enterprise Edition). A reputed Oracle BI application, it is a portfolio of technology and applications providing numerous important features for enterprise financial performance management: an integrated array of query, analysis, reporting, alerting, mobile analytics, data integration and management, and data warehousing.

 

The diagram above gives an idea of how OBIEE can be used for quality control in the credit bureau reporting process.

To enable the various pre-transmission data quality checks and raw data validation, data from all the required sources can be brought into the OBIEE data warehouse and processed for reporting. Business dashboards and reports can then be built in OBIEE showing the number of accounts/consumers satisfying a particular data quality criterion, account- or consumer-wise raw data, or details of randomly selected accounts of a specific type. Based on the issues observed in these reports, the BI user can perform root-cause analysis and take corrective actions regarding the reporting of the related accounts.

Similarly, to enable the various post-transmission data quality checks, the rejection details from the CRAs can be processed using OBIEE to create the dashboards and reports needed to analyze and improve the credit bureau reporting process going forward.

 

Apart from these capabilities, the following are the reasons which make OBIEE a smart choice among the available reporting tools for quality control in credit bureau reporting and the auto-finance business:

 

  1. Provides a full range of BI capabilities that enable customers to collect up-to-date data from across the organization

  2. Proactive intelligence and alerts

  3. Action framework which initiates business process in the context of insights

  4. Hierarchy drilldown on data (i.e. from higher- to lower-level data) is a very useful feature. For example, if a report shows the number of repossessions by year and the user wants to see quarter- or month-wise details within the same report, the user can drill down on the year or quarter value respectively.

  5. OBIEE has the unique inbuilt capability of BI Publisher, which is used to publish printable versions of various reports that can vary by template as well as by recipient.

All these capabilities enable auto-finance companies to prepare various kinds of powerful, actionable credit reports on their customers and business, take better decisions and informed actions, and implement more efficient business processes. Infosys has strong expertise in credit bureau reporting solutions as well as in the OBIEE space.

 

References:

http://www.oracle.com/technetwork/middleware/bi/overview/index.html

Oracle Integration Cloud Service - Overview

 


Software as a Service(SaaS):

We are using cloud applications more and more: Software as a Service, or SaaS, adoption rates are increasing all the time. SaaS usage is growing because it is simple - orders of magnitude easier than deploying on-premise software. New SaaS applications need to be integrated with your existing SaaS applications and your on-premise applications. Cloud integration should be as simple as SaaS itself, and should let you integrate your applications in minutes, not months.

Platform as a Service(PaaS):

This is a service on which software can be developed and deployed; it sits a layer below SaaS in the cloud stack. The platform vendor maintains the application servers, databases, etc., letting the client focus on developing application code. One example of PaaS is Oracle BICS.

Oracle BI Cloud Service Overview:

Oracle BI Cloud Service is Oracle's first PaaS BI offering. It is a proven, enterprise-class BI platform running in Oracle data centers and completely operated and managed by Oracle. It helps consolidate data from any source and quickly create BI applications for agile analysis, with Oracle handling all patching and upgrade activities. The service can be used in a web browser or on mobile devices. All the interactive capabilities of analysis on the cloud - pivoting, roll-up style selections and groupings, surfacing event analytics metrics, time-series analysis, and hierarchical navigation - are available in the BI Cloud Service user experience.

Integration Challenges:

Cloud integration can be very challenging, which is why so many companies have experienced cloud integration problems. IT needs to understand the various integration options supported by cloud applications; developers often need training on cloud applications and their integration technologies; security is an issue; and there are usually multiple integration options. This results in a lot of manual development effort, increasing cost and time to market - when what you really need is agile delivery of integrations between your cloud applications, and between your cloud and on-premise applications.

Solution:

The solution is Integration Cloud Service, which integrates applications across clouds and on-premise. It simplifies connectivity between key enterprise applications like Fusion Applications, RightNow, E-Business Suite, and other third-party SaaS and on-premise applications.

Benefits:

With Integration Cloud Service you get automated backups, patch updates and upgrades, leaving you free to concentrate on creating and integrating applications. There is no data mapping from scratch - in fact, no hand-coded integration is required: Integration Cloud Service gives you an intuitive point-and-click experience for creating your integrations. You also get enterprise-grade security and governance capabilities. Integration Cloud Service comes with a library of adapters for Oracle and third-party applications, and you can purchase pre-built integrations for common scenarios. All this means you can integrate your applications faster, increasing your business agility.

Development Life cycle:

The first step in integration design is to identify the applications you want to integrate, then configure the connection details for those applications. You do this by creating connections based on pre-built adapters. Once your connections are in place, you create integrations by dragging and dropping these connections into the designer, then use the graphical mapper to tell Integration Cloud Service how to map data between your applications. You are then ready to activate, test and use your integration. To see how well your integrations are running, use the monitoring dashboard, where you can pinpoint bottlenecks, monitor errors, and see details of each running integration.

Oracle Blockchain Services - New kid on the block or something more?

Recently Oracle joined the Hyperledger consortium to offer its own blockchain cloud service. Using Hyperledger Fabric, Oracle is preparing to offer an enterprise-grade distributed ledger cloud platform with numerous benefits for new as well as existing customers. So the question is: is it just another blockchain solution, or is it something more? Here we get into the details to analyze and understand the Oracle offering, its strengths, and how it can really make a difference to the existing Oracle customer base!

My earlier white paper published on infosys.com discusses how integrating blockchain with ERP can bring end-to-end supply chain transparency, from the most basic raw material to the final product in the possession of a consumer. The paper discussed challenges like interoperability between established enterprise platforms such as ERP systems and blockchains, blockchain as an enterprise chain infrastructure (not a standalone blockchain system), and security and compliance for enterprises to interact and share solutions and transactions. So how do we overcome such challenges? As the saying goes, 'Necessity is the mother of invention', and Oracle Blockchain Services is one such invention that can usher in a new era where blockchain is no longer a separate, standalone system but very much a part of the existing ERP architecture, with the enterprise-grade distributed ledger platform strengthened by this offering being on the cloud. The Hyperledger Fabric-enabled Oracle Blockchain service on the cloud can help customers build new blockchain-based applications and help existing Oracle customers extend their Oracle SaaS, PaaS, IaaS and EBS investments. Thanks to the underlying Oracle architecture, the technology stack is dynamic and versatile, with REST APIs, SOAP APIs and SOA architecture helping to integrate near real-time data across systems for better decision making.

This new kid on the block is something more than existing products mainly because of the 420,000 customers Oracle enjoys, and the ability it provides to extend the use of EBS/SaaS/PaaS and database technologies. Oracle occupies a unique position in the ERP market with dominant on-premise and cloud application products and best-of-breed integration architectures built on Fusion Middleware and the SOA Suite. With this variety of customers in place, Oracle can use these backbone technologies to provide a simple and easy integration path for customers to join the Oracle blockchain network, leveraging existing manpower as well as infrastructure while gaining supply chain transparency. With Oracle Database, the Oracle platform, and technologies like SOA and APIs, the blockchain application will be extremely scalable, secure, robust and tightly integrated. Additionally, Oracle Blockchain Cloud Service includes the Oracle Cloud App Dev platform and built-in DevOps capabilities, providing flexible integration capability with cost-effective deployment and support.

The Oracle Blockchain network is initiated by a founding member who creates all the components required to run a fully-fledged blockchain network, and then adds other member organizations, generally termed peers. Oracle BCS comes with a multi-channel support feature - a unique capability that helps separate and control transaction transparency among various peers within the same blockchain network, in effect providing the ability to segregate peers. With this, Oracle Blockchain looks extremely promising and can easily help existing Oracle Database and application customers integrate Oracle Blockchain Cloud Service with their systems and begin the journey of supply chain transparency!
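Conceptually, the multi-channel segregation described above can be sketched as follows. This is a hypothetical Python model of the idea, not Oracle BCS or Hyperledger Fabric API code; the class and member names are illustrative assumptions.

```python
# Conceptual sketch of multi-channel segregation in a blockchain network.
# Illustrative model only - NOT the Oracle Blockchain Cloud Service API.

class BlockchainNetwork:
    def __init__(self, founder):
        self.founder = founder
        self.channels = {}   # channel name -> set of member organizations
        self.ledgers = {}    # channel name -> list of transactions

    def create_channel(self, name, members):
        # Only members of a channel can submit to or see its transactions.
        self.channels[name] = set(members)
        self.ledgers[name] = []

    def submit(self, channel, member, tx):
        if member not in self.channels[channel]:
            raise PermissionError(f"{member} is not a peer on {channel}")
        self.ledgers[channel].append(tx)

    def visible_transactions(self, member):
        # A peer sees only the ledgers of the channels it belongs to.
        return {ch: txs for ch, txs in self.ledgers.items()
                if member in self.channels[ch]}

net = BlockchainNetwork(founder="acme")
net.create_channel("suppliers", ["acme", "supplier1"])
net.create_channel("logistics", ["acme", "carrier1"])
net.submit("suppliers", "supplier1", {"po": 1001, "qty": 50})
print(net.visible_transactions("carrier1"))   # carrier1 cannot see 'suppliers'
```

The founding member ("acme" here) sets up the channels; a peer such as "carrier1" never sees the supplier channel's transactions, which is the segregation the multi-channel feature provides.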

Join us on 3rd October at 3:45 PM for the Oracle Blockchain Cloud Service, Strategy & Roadmap session (CON6847) to get more details on Oracle BCS.

September 27, 2017

REAL-TIME WEB ANALYTICS

Real-Time Web Analytics is used for monitoring, as well as reacting instantly to, user interactions with social media, blogs and websites. In this competitive world, businesses need to make decisions as early as possible, and hence real-time web analytics is necessary for generating actionable business insights.

The most important factor in real-time analytics is latency: the time taken between the generation of data and its availability in the reporting layer for analytics. The lower the latency, the better the real-time analytics.
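As a minimal illustration of the definition, latency is simply the gap between an event's generation timestamp and the moment it becomes queryable in the reporting layer. The timestamps below are made-up sample values:

```python
# Illustrative latency calculation: the gap between when an event is
# generated and when it becomes available in the reporting layer.
from datetime import datetime

generated_at = datetime(2017, 9, 27, 10, 0, 0)   # event created on the website
available_at = datetime(2017, 9, 27, 10, 0, 2)   # event visible to reports

latency = (available_at - generated_at).total_seconds()
print(f"Latency: {latency} seconds")  # lower is better for real-time analytics
```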

pic.png


In real-time web analytics, we typically connect to the business's websites or blogs through an API by installing a plug-in and generating a security token. We can then find out the number of users connected to the site, which geographic location they are connected from, how they are connected, and through which OS platform. These details are then analyzed for real-time decision making, as detailed below.

Number of users:

Using real-time web analytics, we can determine the number of users connected to the website at any given point in time, and thereby figure out the peak time as well as the best social media promotion method (Facebook, Twitter, etc.) for the website. We can also determine when to start re-promotion activity by checking the number of users on the website.

For example, when a mobile network service provider launches a social media campaign for a new plan, we can monitor its success based on the number of users interacting with the promotion through social media and clicking on the website links. When the count decreases, it's time to initiate re-promotion activity to continue to garner user support for the promotion.
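A sketch of this monitoring loop might look like the following. The `fetch_active_users` function and its payload shape are hypothetical stand-ins for a real-time analytics API call made with a security token:

```python
# Sketch of monitoring concurrent users and triggering re-promotion when
# engagement drops. The fetch function and payload are hypothetical.

def fetch_active_users():
    # In practice this would call the analytics API with a security token;
    # here we return a canned sample payload for illustration.
    return [
        {"geo": "US", "platform": "Android"},
        {"geo": "US", "platform": "iOS"},
        {"geo": "IN", "platform": "Android"},
    ]

def should_repromote(active_users, threshold):
    # Trigger re-promotion once the live user count falls below the threshold.
    return len(active_users) < threshold

users = fetch_active_users()
by_geo = {}
for u in users:
    by_geo[u["geo"]] = by_geo.get(u["geo"], 0) + 1

print(by_geo)                        # user counts by geography
print(should_repromote(users, 100))  # True -> time to re-promote
```

The same aggregation idea extends to platform and sub-page counts discussed in the following sections.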

Mode of platform:

Mode of platform refers to the various operating systems (Windows, iOS, Android, etc.) through which the site has been accessed. Using real-time web analytics, we can determine which platform is widely used and customize the message or offer accordingly.

Region:

We can gather region-wise data, which is very useful for organizations to determine which geographic locations are ahead in reach and which locations need a more focused promotion strategy.

For example, from real-time analytics we can figure out the number of users accessing the website by country, state, city, etc. Business can then use this to decide which locations need better focus and tailor-made offers or promotions to increase reach - or, alternatively, to understand why reach is better in certain locations and not others.

Subpages:

Real-time web analytics can be used to gather data at the level of sub-pages and sub-links within the company's webpage. This can provide a lot of insight into user behavior, preferences, the UI of the site, and so on.

For example, for a mobile network service provider, we can figure out whether the postpaid or prepaid plan pages have been viewed more, which displayed promotional offers attracted the maximum number of users, what the search keywords were, how many users registered or signed up, and so on.

Data-Warehousing:

One of the best practices when using real-time analytics is to periodically archive the data into a data warehouse, so that it is available for business intelligence and analytics using reporting tools. It is also good practice to reconcile the real-time data with the data warehouse to ensure they are in sync, for accuracy and reliability.
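The archive-and-reconcile practice can be sketched as follows. Both "stores" are plain Python lists here purely for illustration; in practice they would be the real-time store and the warehouse tables:

```python
# Sketch of periodically archiving real-time events into a warehouse and
# reconciling the two stores. Lists stand in for the actual data stores.

realtime_events = [
    {"id": 1, "page": "/plans"},
    {"id": 2, "page": "/offers"},
    {"id": 3, "page": "/plans"},
]
warehouse = []

def archive(events, dw):
    # Move events into the warehouse, skipping ids already archived
    # so a periodic job can run safely more than once.
    existing = {row["id"] for row in dw}
    for e in events:
        if e["id"] not in existing:
            dw.append(e)

def reconcile(events, dw):
    # The real-time store and warehouse should agree on event ids.
    return {e["id"] for e in events} == {row["id"] for row in dw}

archive(realtime_events, warehouse)
print(reconcile(realtime_events, warehouse))  # True when stores are in sync
```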

September 26, 2017

Parameter considerations for BI Architecture

In our journey to implement OBIEE for a banking client, we had a business need to integrate the existing OBIEE and the new CRM OBIEE instances, which made us revisit the architecture to arrive at an optimal solution.

The existing BI server on OBIEE had 300+ financial analytical reports, and the client wanted to implement Customer Relationship Management reports on top of the existing infrastructure.

The project team considered the following parameters and solution options for the discussions.

Option3.jpg
X - Weightage based on client context; it cannot be the same for all programs.

Option A: Additional Standalone BI Server with existing BI server

Option B:  Reuse Existing BI Server

Voila! The team agreed on the solution for the CRM implementation - it was a no-brainer for everyone after the statistics were published.

Solution options differ for each client, and the final decision can differ for the same challenge based on the weightages driving the client's requirements.
September 25, 2017

Why Knowledge Management in Enterprise?

A KM solution in the enterprise ecosystem is a necessity rather than just an additional tool. While the world is talking about automation, digitization and what not, a KM solution contributes hugely to enabling the enterprise to achieve its goals and meet its objectives. This quadrant explains where KM can really help the enterprise: Knowledge Sharing, Collaboration, Self Service, and Agent Productivity.

ss-pic1.png
ss-pic2.png

Knowledge Sharing: 

Knowledge sharing is a huge problem across industries, teams, business units - you can keep adding areas to the list. Companies small and large spend millions of dollars training their employees every year, and the spend keeps increasing by almost 1% every year. Research also says that classroom training alone can build only about 10% of skills. So what about the rest of skill development? It comes from on-the-job experience. Can we just deploy people on critical assignments straight after classroom training? The answer is yes and no. Yes, because that is how you make them learn on the job. No, because you cannot leave them alone or expose them to the customer unsupported - you need to mentor, coach, and provide the right feedback at the right time. This is an issue in the services industry, in corporates, in product companies and elsewhere. So, the question is: can KM solve this issue? Probably yes. It gives a platform where people can learn and get help while on the job, so mentors and coaches can spend some of their time doing other productive work.


Collaboration: 

The idea of collaboration is not new. We see people talking about it all the time and in all places. But the questions are: how does it help, why is it so important, and most importantly, how does KM help? Too many questions, and the answer is not very straightforward. Collaboration has a hidden return - it cannot easily be measured, but it can certainly be experienced. When we talk about a team, can each member work alone, with the assignment completed at the end by combining individual work? In most cases, no. So collaboration is required: a team works together and makes sure it is moving in the right direction from day one. Now what is KM? It is a knowledge repository, a brain dump from each and every one to help the masses - so shouldn't the knowledge base be an invaluable asset? Enabling a collaboration feature in the KM solution helps contributors collaborate with peers, SMEs and other colleagues to get their viewpoints, feedback and input, ensuring that what goes into the repository is thought through, not something written by someone for no reason. Collaboration in the KM world brings KM maturity.

Self Service: 

Companies spend untold dollars to make sure the customer is happy. This is not an easy task. Many delivery systems have been researched and implemented over the years to make customers happy, and KM is one of them - it has a huge role in customer service. A customer can call the helpdesk, and you can deploy very intelligent people to handle customer calls. That is all good, but wouldn't it be better to avoid getting calls from customers in the first place? It is easy to say, but how? Self service is one way to overcome this problem, also called 'case deflection': let customers find the right information themselves by delivering it through omni-channel, deflecting the problem by exposing the information they need. With a mature KB repository, case deflection becomes achievable and brings huge value to customer service.
picss-5.png

Agent Productivity: 

The other side of customer satisfaction is increasing agent productivity: how to make sure the agent spends less time resolving a customer issue while handling more and more calls. One way is to deflect customer problems by enabling self service, but that does not work alone - agents should be equally productive. To achieve this, KM should also be exposed to agents. In fact, agents need a little more than just access to KM: it has to be contextual. We don't want agents scanning the entire repository to find the right answer to the problem the customer is facing; rather, the system should be intelligent enough to understand the context of the problem - who the customer is, the area of concern, and so on. Agents are also the best contributors to maturing the KM repository, since they carry tons of information from working in diverse situations. So enable KM for agents to increase their productivity, get the best ROI, make customers happy, and progress towards a mature KM repository. Finally, this has been said in many places and many times, but we still have industries and companies lacking in this area. KM is an integral part of automation, digitization, and customer experience.
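The contextual lookup idea above can be sketched as a simple filter: instead of letting the agent scan the whole knowledge base, the system narrows articles by the caller's context. The article fields and context shape are illustrative assumptions, not any particular KM product's schema:

```python
# Sketch of contextual KB lookup for agents: filter articles by the
# customer's product and area of concern instead of scanning everything.
# Fields and values are hypothetical.

kb = [
    {"title": "Reset broadband router", "product": "broadband", "area": "connectivity"},
    {"title": "Postpaid bill disputes", "product": "mobile",    "area": "billing"},
    {"title": "Upgrade prepaid plan",   "product": "mobile",    "area": "plans"},
]

def contextual_articles(kb, context):
    # Surface only articles matching the caller's product and concern.
    return [a["title"] for a in kb
            if a["product"] == context["product"] and a["area"] == context["area"]]

ctx = {"product": "mobile", "area": "billing"}   # derived from the incoming call
print(contextual_articles(kb, ctx))              # only the billing article
```

A real system would infer the context from the customer record and call metadata rather than hard-coding it, but the narrowing principle is the same.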


Who Can use It?

"Who needs Knowledge Management, and why?" seems a difficult question, but the answer is simple: everyone needs it. Every enterprise needs knowledge management, irrespective of the industry it belongs to. Here is why:
ss-pic6.png

Thank You!

Shubhra Sinha

Shubhra_Sinha@infosys.com

September 21, 2017

Integrating On-Premise Security with Oracle Cloud Applications

In a recent group discussion, I heard people raise many interesting questions around Oracle Cloud security and how it can be integrated with other on-premise applications. In fact, with emerging cloud adoption, modern-day IT administrators are at a critical juncture, with more challenges than well-defined solutions when suggesting an integrated security setup.

  • How can I restrict access to cloud applications that we have subscribed for?
  • Is it required for employees to memorize one more password for cloud access?
  • How are credentials maintained for cloud accounts?
  • How to automatically remove cloud access for separated employees? 
  • How to get usage reports of my cloud application?

Identity setup and access management is an ongoing challenge. Most organizations have their own setups for identity and access management to restrict access to all their on-premise applications. Business processes have evolved over the years and been standardized around in-built Identity and Access Management (IAM) systems such as Oracle's Identity and Access Management Suite. We have seen that the business processes of most small and medium organizations are built around controlling user access by simply implementing a Microsoft Active Directory solution.

There is a need to control access to cloud applications as well, without incurring more operational cost and without compromising security for current business processes. And the new process needs to be simple. This is exactly where we can leverage a decade-old standard - Security Assertion Markup Language - which is both flexible and very powerful. As most organizations already maintain the identity of users on the intranet or in Active Directory, it is easy, and a good idea, to use the same login information to let users log into other cloud or web-based applications. One of the more graceful ways of doing this is with Security Assertion Markup Language (SAML).

Now that Oracle Cloud supports single sign-on using the SAML 2.0 standard, it is possible to configure Oracle Cloud Applications (call them the service provider) to trust the authentication/authorization data that comes from on-premise IAM systems (call them the identity provider). Thus, we can manage and restrict access to Oracle Cloud Applications from on-premise IAM systems. Below are some advantages of this single sign-on standard.

  • Users are not required to type in credentials as it uses credentials of existing session.
  • End users find it very convenient, as they don't need to remember another password for cloud apps - their AD/intranet password works for their Oracle Cloud Apps too, with no extra passwords to remember and renew.
  • Admin can instantaneously revoke access to employees who left the organization.
  • We can get a consolidated view of on-premise versus cloud application access by running reports in existing IAM system.
  • Peace of mind for security admins as their Cloud Applications are protected with the same security policy that prevails in their in-house applications.

When we subscribe to any cloud service from Oracle, such as RMCS, BICS or PBCS, the administrator of the identity domain can log into the My Services section and configure single sign-on by clicking the Users > SSO Configuration tab.


Here, administrators need to provide certificates and other related information for the on-premise IAM system. Then they need to download the cloud services information so the IAM system can recognize Oracle Cloud Apps.

Once this is in place, when an end user tries to access an Oracle Cloud App, he is redirected to the on-premise IAM system for authentication; after successful authentication, he is redirected back to the Cloud App, which sends the requested content to the client browser.


The above diagram shows the flow for single sign-on initiated by the service provider. Identity provider-initiated SSO is also possible, where the end user directly interacts with the on-premise IAM system, denoted by the blue line above.

How does SAML work for SSO?

Here, SAML transfers the identity of the user from the identity provider (the IAM system) to the service provider (Oracle Cloud) by mutually exchanging signed documents.

Let's assume a scenario where a client is logged into an identity provider system and wants to log into a remote service provider cloud application. The following happens:

  1. The user accesses the remote application by clicking a link, and the application loads.
  2. The application identifies the user's origin and sends an authentication request, redirecting the user back to the identity provider.
  3. The user establishes an active browser session by logging into the identity provider.
  4. The identity provider sends an authentication response to the service provider - an XML document containing the user's username, signed using an X.509 certificate.
  5. The service provider retrieves the authentication response and validates it using the stored certificate fingerprint of the identity provider.
  6. Thus, the client's identity is established and access to the application is allowed.
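The fingerprint check in step 5 can be illustrated with a short sketch. This only shows the fingerprint comparison - a real service provider would verify the XML signature itself with a SAML library - and the certificate bytes below are dummy stand-ins for real X.509 DER data:

```python
# Sketch of validating the identity provider's certificate by comparing
# its fingerprint with the one registered at the service provider during
# SSO setup. Dummy bytes stand in for real X.509 DER certificate data.
import hashlib

def cert_fingerprint(cert_der: bytes) -> str:
    # SHA-256 fingerprint of the certificate bytes, colon-separated.
    digest = hashlib.sha256(cert_der).hexdigest().upper()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))

idp_cert = b"example-idp-certificate-bytes"         # hypothetical IdP cert
registered_fingerprint = cert_fingerprint(idp_cert)  # stored during SSO setup

def is_trusted(presented_cert: bytes) -> bool:
    return cert_fingerprint(presented_cert) == registered_fingerprint

print(is_trusted(idp_cert))                 # the registered IdP passes
print(is_trusted(b"attacker-certificate"))  # anything else is rejected
```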

That's how organizations can have a full control over Oracle Cloud Apps by leveraging their IAM infrastructure and existing security policies.


Agile BI - Implementation through continuous process and Best practices

What is Agile BI?

Agile Business Intelligence (BI) is about delivering functionality in manageable chunks using shorter development cycles, rather than implementing the whole BI functionality at once. The focus is on delivering a quality product that meets changing business needs with a quick turnaround time.

Agile Sprint Overview

The Agile sprint starts with planning and ends with review and retrospective. A brief stand-up meeting called the daily scrum is held every day to get an update on progress.

The sprint is a continuous process, with one cycle followed by the next without end, and it includes the below elements of Scrum.

User Story: A user story describes a business need in brief and is a clear, fine-grained requirement. Every user story gets a unique ID.

Epic: An epic is a large user story that usually spans multiple sprints. An epic needs to be broken into smaller stories before work begins on them.

Product Backlog: It houses all the stories planned for the sprints, is prepared by the product owner, and is ordered by priority.

Release Backlog: A release is a time frame within which all the iterations/sprints are completed. The release backlog is a subset of the product backlog containing the user stories planned for delivery in the coming release. The product owner, with help from the team and other stakeholders, selects some amount of high-priority work and moves it to the release backlog.

Sprint Planning: The work to be performed in the sprint is planned at sprint planning. Sprints should be short iterations, each delivering some functionality.

Scrum Meetings: The daily scrum is a 15-30 minute time-boxed event for the development team to synchronize activities and plan for the next 24 hours.

Sprint Retrospective: The retrospective's main aim is to discuss the results and determine ways to improve the development process in the next sprint. The Scrum master hosts it and talks about the team's accomplishments in the last sprint and lessons learned.

                                              Fig 1. Sprint release flow
Scrum Agile Methodology

Scrum is an agile process that enables business requirements to be delivered in short periods of time, allowing deployment to production every week or two. Scrum projects make progress via a series of "sprints". The roles and responsibilities in the agile Scrum process are:

Product owner: Takes care of the end users' interests and prioritizes items based on customer usage.

Scrum master: Coordinates the whole Scrum process, makes sure Scrum is used properly, and holds the Scrum meetings.

Development Team: Responsible for the creation and delivery of BI solutions. This includes data modeling, ETL development, reporting, testing, and release activities, among others.

BI/Data Architect: Understands the BI tools, processes, architecture and deliverables, coordinates any architecture integration, designs solutions, and provides problem solving.

BI Testing: Tests the reports, dashboards and ETL programs and validates data sufficiency and accuracy.

 

Best practices for Agile BI Projects

Active participation of stakeholders: Active stakeholder participation is an essential practice in Agile BI development, to avoid gaps and to pick up changes in business needs as they arise.

Break user stories into meaningful, smaller chunks: If a user story is an epic, break it down into:

1. Stories for creating new dimensions and measures

2. Stories to build reports combining the dimensions and measures

Breaking the deliverable down into multiple manageable stories helps in assessing the effort accurately.

Use an incremental and iterative approach for the DW: Propose stories such that dimensional and fact coverage increases incrementally, resulting in incremental data warehouse expansion. This also avoids rework from story to story.

Use Planning Poker tool for Story pointing: Agile teams use Planning Poker for estimating the user stories.

Planning Poker is a consensus-based technique for story point estimation, using a deck of cards with values like 0, 1, 2, 3, 5, 8, 13 and 20 given to each member of the team, where each value represents story points. The product owner or Scrum master reads the description of each user story or theme to be estimated, and each team member selects a card. The process is repeated until the team arrives at a single estimate.
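The repeat-until-consensus mechanic can be sketched in a few lines. The voting rounds below are made-up illustrative data:

```python
# Sketch of a Planning Poker round: each member plays a card from the deck;
# if estimates diverge, outliers explain their reasoning and the team
# re-votes until everyone converges on a single value.

DECK = [0, 1, 2, 3, 5, 8, 13, 20]

def consensus(rounds):
    # 'rounds' is a list of voting rounds; each round holds one card per member.
    for votes in rounds:
        assert all(v in DECK for v in votes), "votes must come from the deck"
        if len(set(votes)) == 1:   # everyone agrees -> estimate found
            return votes[0]
    return None                    # no consensus yet; keep discussing

story_rounds = [
    [3, 5, 8],   # wide spread: high and low voters explain their reasoning
    [5, 5, 8],   # closer after discussion
    [5, 5, 5],   # consensus reached
]
print(consensus(story_rounds))   # 5
```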

Better story point estimation: The definition of "Done" should include both development and acceptance testing within the same sprint. This helps in better story point estimation, as stories can be compared more fairly.

Effective communication: An agile project's success depends on communication, more so for distributed teams. Distributed teams should be provided with facilities such as video conferencing and screen sharing, in addition to traditional messengers and conference bridges, for better collaboration.

Team structure: Allocate specific development team resources to work on production bugs, to avoid distractions that would impact the sprint deliverables. This helps the team stay focused, resulting in better sprint progress.

Conduct weekly design thinking sessions: These sessions help resolve potential design issues for planned user stories in the early stages of the sprint. Design thinking sessions can be planned every week for the agile team to review the design of deliverables.

WebLogic Password Reset or Informatica Connect String Change? We have a solution for you!

 

I would like to address two situations related to OBI Applications administration: one related to WebLogic security, and the other to the database connectivity of the Informatica server (services).

1. Did you forget your WebLogic administrative account / super user credentials?

In some situations on a project we end up with missing or wrong credentials for certain accounts or schemas. What if the problem is with the weblogic user? (Here I use 'weblogic' to mean the AdminServer user / super user of WebLogic 11g or 12c.)

Some of the scenarios:

  • Improper KTs and handover of systems from other vendors during project transfer

  • A proper record of passwords was not maintained, resulting in missing or wrong passwords

  • Rarely used systems without proper tracking of details

  • During installation we provide one value but save another value in our records

In the scenarios stated above, we may be left with no choice but to re-install the OBIEE or FMW servers, as the AdminServer user is crucial and key to starting and managing the servers. In such cases we can adopt a workaround to reset the AdminServer (weblogic) user's password.

Backup:

  • Back up the OBIEE domain home, i.e., $Domain_Home

  • Take a copy of the AdminServer's boot.properties file from $Domain_home/servers/AdminServer/security/boot.properties to keep as a backup.

    (This serves as a rollback: the current boot.properties inputs still work to start the AdminServer, even though we don't have the physical password for the weblogic/AdminServer user.)

Sequence of steps to reset the weblogic user password:

  1. Shut down all servers (Admin and Managed).

  2. Rename/move the data folder of AdminServer i.e., $Domain_home/servers/AdminServer/data

    <Example>

    $Domain_home/servers/AdminServer/data   to $Domain_home/servers/AdminServer/data_old

  3. Run setDomainEnv.sh to source the environment variables.

  4. Navigate to $Domain_home/security and run the following Java command to reset the weblogic user password:

    <Syntax>

    java weblogic.security.utils.AdminAccount <userid> <New_password> .

    <Example>

    java weblogic.security.utils.AdminAccount weblogic password123 .

  5. Update the details in boot.properties file of AdminServer i.e., $Domain_home/servers/AdminServer/security/boot.properties

  6. When the servers are started, the hard-coded values provided in the boot.properties file will be encrypted.

  7. Restart the servers.

 

2. Change in the Informatica database connection and tempted to re-install? Re-think!


In a few projects we come across situations where changes are made to the database configuration, impacting the applications dependent on that database. For Informatica PowerCenter Services (the server), we provide database values or a connect string during server installation to establish the connection to the database.

Some of the scenarios in which database values may change:

  • DB hostname or IP is changed

  • Database/service name is changed.

  • You need to change the IP address to a hostname in the Informatica DB connection string (assuming the DB's IP address was provided during installation).

  • Multiple applications are hosted on a single database and, at a later point, we decide to reduce the load on the DB and move applications to separate databases - in such a case, the Informatica server moves from one database to a new one.

A few of the scenarios stated above may lead to planning a re-installation of the Informatica server against a new database. To avoid the heavy work of re-installation, we can adopt a workaround to update the DB parameter values used by the Informatica gateway node to connect to the database.


Backup:

  • Back up the nodemeta.xml file before making any changes. Location: <Infa_Home>/isp/config


The infasetup command UpdateGatewayNode can be used to change the database connectivity parameters or the connect string. Its usage is illustrated below:

  1. Shutdown the Informatica Services (domain, gateway node).

  2. Take the backup of nodemeta.xml file which is present in <Infa_Home>/isp/config.

  3. Run the infasetup command UpdateGatewayNode to update the new connect string values of database as follows: (<Infa_home>/server)

    For Oracle Database:

    infasetup.bat UpdateGatewayNode -cs "jdbc:informatica:oracle://<New DB IP or hostname>:<port number>;ServiceName=<New Service_Name>"

     For DB2 database:

    infasetup.bat UpdateGatewayNode -cs "jdbc:informatica:db2://<New DB IP or hostname>:<port number>;databasename=<New db_Name>"

     <Example>

    infasetup.bat UpdateGatewayNode -cs "jdbc:informatica:db2://devdb:60004;databasename=dev_db"

  4. Start the Informatica Services and confirm the changed DB details. 


Rollback/Back-out:

If any issue is faced during the above implementation and the services are unable to start, remove the newly formed nodemeta.xml, restore the backed-up nodemeta.xml, and restart the services.


September 19, 2017

Einstein Data Discovery - An impeccable "Data Scientist" from Salesforce


With an ambition to bring artificial intelligence to business users, Salesforce recently introduced Data Discovery as part of the Einstein Analytics product. It empowers business users to discover unidentified prospects on their own and gain richer vision by getting to the bottom of the organization's huge volumes of data instantly. Before I get down to the nitty-gritty of Einstein Data Discovery, a simple definition of artificial intelligence is worth mentioning here.

"Artificial Intelligence is the widespread concept of machines modeled in imitation of the human mind, or the way the human mind accomplishes tasks, that are considered to demonstrate intelligent behavior in a smarter and faster way."

Realistically, the human brain is not built to analyze data at, or keep up with, the speed at which data is getting bigger every day across all domains, be it the retail industry, manufacturers, financial bodies or healthcare.

Despite the fact that AI is adept at dissecting data and creating insights, business is often unable to benefit from them. It is extremely challenging for business users to understand facts, scenarios and insights the way they are presented by AI, and for a long time analysts and data scientists have been plugging this gap. To crack the problem of limited skilled resources and data scientist availability, and to empower every business user to apply intelligence at the speed of business, we now have the perfect data scientist, officially known as "Salesforce Einstein Data Discovery".

There are numerous "W" and "H" questions business executives want - or rather need - answered to make more mindful business decisions, and they need to understand them in a narrative way. Einstein Data Discovery is one narrative tool that allows them to discover insights and analyze massive datasets in minutes, without the trouble of building statistical data models.

It handles descriptive, diagnostic, predictive and even prescriptive analytics, which makes it capable of answering why something happened, what could happen in the future and what action should be taken. It not only offers guided recommendations for deep insights and answers; with Einstein Data Discovery you can even surface questions you had not thought to ask.

I remember a conversation from some time back, while working closely with the VP of Operations of a large beverage company. The company's CFO had asked him to explain, on short notice, the reasons for falling margins and the steps required to recover. He described how he had made his team of data analysts and business executives work overtime, burdening them with heaps of data and millions of transactions to isolate the likely causes of the slipping margin. After days of effort and plenty of brainstorming, he still could not get a clear answer on what actions to take - even though he and his team knew the business best, they were unable to leverage that domain knowledge.

In these and many other everyday business scenarios, Salesforce Einstein Data Discovery is immediately useful: you only need to load your data into Einstein Discovery and specify the variable you are interested in. By the time you get yourself a cup of coffee, Einstein Discovery is ready with the story line.

It unpacks the relevant metric and its influence on the business in an organized manner, with a logical flow full of insight about the organization's data. It points you in the right direction, surfaces relevant patterns, recommends answers and lets you base decisions on what is statistically sound.



Storytelling Behavior:  One of the key features that makes Einstein Discovery a bridge between business users and AI is its narrative explanation of insights - exactly what business users need to be self-sufficient and less dependent on data scientists. Once you import a dataset into Einstein Discovery, it assesses your data from every possible angle, runs statistical tests, detects patterns and evaluates improvement opportunities. The finished analysis - called a "story" in Data Discovery terms - is then presented to you. It does not merely show the data but highlights what matters in it. The visualization pairs a graphical representation of the facts, for example a bar chart, with descriptive text that business users can easily understand.
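Very roughly, the storytelling step can be imagined as ranking candidate drivers by a statistical test and phrasing each finding in plain English. The sketch below is plain Python over hypothetical data, using Pearson correlation as a stand-in statistic - it is an illustration of the idea, not Salesforce's actual algorithm:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation of two equal-length numeric lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def tell_story(rows, target):
    """Rank each driver by the strength of its correlation with the
    target metric and phrase the result as plain-English sentences."""
    ys = [row[target] for row in rows]
    drivers = [k for k in rows[0] if k != target]
    scored = sorted(((pearson([row[d] for row in rows], ys), d) for d in drivers),
                    key=lambda rd: -abs(rd[0]))
    return [f"As {d} increases, {target} {'rises' if r > 0 else 'falls'} "
            f"(correlation {r:+.2f})." for r, d in scored]

# Hypothetical example: which driver best explains margin?
rows = [
    {"discount": 5, "volume": 100, "margin": 22},
    {"discount": 12, "volume": 90, "margin": 15},
    {"discount": 20, "volume": 120, "margin": 9},
    {"discount": 8, "volume": 110, "margin": 19},
]
story = tell_story(rows, "margin")
```

Here the discount column correlates most strongly (and negatively) with margin, so it leads the story - the same "most important driver first" ordering a generated story presents.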

Automatic Discovery: Another significant ability is to automatically scan data for quality problems, analyze millions of rows quickly and surface the underlying causes behind a business outcome, what is likely next and what actions are recommended. A recommendation section gives insights on: What happened? What changed over time? Why did it happen, what could happen, and how can I improve it? You can relate these questions to different variables, slicing and dicing the facts to dig deeper and build a broader picture.

Flexibility to get data from any source: You will rarely have to worry about the source of your data when working with Einstein Data Discovery. It can take data directly from Salesforce or import it from a CSV file, and it also works with structured data from sources such as Hadoop or SAP.

Natural Language Support: All generated reports and insights are explained in natural language and can be exported as PowerPoint presentations and shared effortlessly.

Smart Data Quality Check: When you pull in data to have stories created, Einstein Discovery automatically scans it for quality problems, checking whether all rows and columns are what the analysis needs. It identifies ways to improve the quality of your dataset and recommends cleanup actions - for example, if the "Age" column of customer data contains text values, it may suggest converting them to numbers before analysis.
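A minimal version of such a quality check might scan a column for values that do not match the expected type and suggest a cleanup action. This is a hand-rolled illustration of the "Age column contains text" example, not the product's actual logic:

```python
def audit_column(values, expect="number"):
    """Flag entries that don't look like the expected type and
    suggest a cleanup action (only 'number' is handled in this sketch)."""
    def looks_numeric(v):
        return str(v).lstrip("-").replace(".", "", 1).isdigit()
    bad = [i for i, v in enumerate(values)
           if expect == "number" and not looks_numeric(v)]
    if bad:
        return {"rows": bad,
                "suggestion": f"{len(bad)} non-numeric value(s) found; "
                              "convert or drop before analysis"}
    return {"rows": [], "suggestion": "column is clean"}

# Hypothetical customer data: two entries in "Age" are text
ages = [34, 41, "unknown", 29, "forty"]
report = audit_column(ages)
```

Running this flags rows 2 and 4 and recommends converting them, mirroring the kind of recommendation the product surfaces at load time.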

User Feedback Loop: Einstein Data Discovery can refine its data model based on feedback provided by users. Drawing on your experience and expertise, you can give an insight a thumbs up or down to make subsequent analyses more accurate.

Salesforce Native: Einstein Data Discovery connects natively to Salesforce, allowing users to work in Einstein Discovery to get insights and navigate quickly to Einstein Analytics to take the required actions.


Einstein Data Discovery is certainly my personal data scientist now! I have tried to cover as much as I could in this post and will be happy to answer any further questions on the topic - please write in the comments section.

September 15, 2017

Artificial intelligence - the theme for next generation CRM

 

Artificial intelligence (or simply AI) is now everywhere, woven through our daily lives. In its simplest form, it could be the reminder/alarm waking us up in the morning or lights that switch off automatically at a specified time at night.

Knowingly or unknowingly, AI has been simplifying almost everything we do - with machines taking on the intelligence needed for repetitive, systematic tasks, sometimes in conditions unsuited to humans and at a superhuman pace.

Though there are debates about AI making machines self-aware and posing a threat to the human race, heavy research and investment continues in this area to augment human effort and to provide a more advanced and pleasant experience to users.

Gartner predicts, by 2020 AI will be a top five investment priority for more than 30 percent of CIOs.

 

CRM evolution:

CRM products have come a long way from their beginnings as data-entry systems to now providing a unified 360-degree view of the customer across all aspects of customer interaction (marketing, sales, servicing), improving the customer experience.

Also, with the rise of the SaaS model, the cost of cloud CRMs came within reach of small and medium businesses, increasing overall CRM adoption exponentially.

Gartner predicts the CRM market to be more than $37 billion in 2017.

However, CRM products today still rely heavily on the intelligence of their users for most functions, and can end up as mere databases of records if not used correctly.

E.g. during every interaction with a customer, the agent has to manually check previous interactions to understand customer sentiment, work out what the customer is looking for, and think about cross-sell/upsell opportunities or next best actions - all while the customer is on the call (or face to face).

Organizations have been attempting to automate some of these to increase Agent productivity.

 

Next generation CRM:

With the growing interest in AI and the market hype, it is a no-brainer that the next generation CRM products will boast AI as the main theme.


Tech giants are heavily investing in adding AI capabilities to their CRM products.

E.g. Adaptive Intelligent Apps by Oracle, Einstein by Salesforce.com, Microsoft's Dynamics 365

 

Several start-ups are also building AI products that can augment CRM products with AI capabilities.

  e.g.

Chorus.io, Cogito, Conversica, TalkIQ etc.

 

 

Let's look at some simple CRM scenarios we can think of leveraging the power of AI.


  • Leads:

One of the toughest challenges in sales and marketing is identifying Leads that are highly likely to convert into Opportunities. In many cases, agents invest their effort in Leads that will be lost instead of concentrating on high-probability Leads.

Wouldn't it be great if CRM could apply AI, in the form of scientifically proven algorithms*, to understand customer sentiment and predict the probability of converting a Lead into an Opportunity?

 

*The AI algorithms can be thought of as a combination of multiple analytics techniques, such as: 

Speech analytics - based on pitch, tone and pauses in the customer's voice

Text analytics - based on email communication with the customer

Data mining - of relevant information shared by the customer on the public internet (social media etc.) and the resulting trends

Clickstream analysis - analysis of user navigation on website pages
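One naive way to blend such per-channel signals into a single Lead score is a weighted average of the individual analytics outputs. The sketch below is purely illustrative - the channel names, weights and 0-to-1 scores are assumptions, not any vendor's algorithm:

```python
def lead_score(signals, weights=None):
    """Blend per-channel scores (each in 0..1) into a single
    conversion-probability estimate via a weighted average."""
    weights = weights or {k: 1.0 for k in signals}  # default: equal weights
    total = sum(weights[k] for k in signals)
    return sum(signals[k] * weights[k] for k in signals) / total

# Hypothetical lead: strong clickstream engagement, lukewarm social signal
lead = {"speech": 0.7, "text": 0.6, "social": 0.4, "clickstream": 0.9}
score = lead_score(lead, weights={"speech": 2.0, "text": 1.0,
                                  "social": 0.5, "clickstream": 1.5})
```

Sorting leads by this score would let an agent concentrate on the high-probability ones first; a production system would of course learn the weights from historical conversions rather than hard-code them.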

 

  • Servicing:

Before an Agent answers a Customer call, wouldn't it be great if CRM can readily display key information about the Customer based on Previous interaction (voice/email), information shared by the Customer on public internet (Social media) or smart alerts generated from Customer's product?

AI augmented CRM can also help Agents with systematic approach to resolve the issue for Customer, leveraging knowledge base created with machine learning.

 

  • Sales:

During the Sales cycle, AI can help Agents with ready information about cross-sell/upsell opportunities and proactive product recommendations, all based on information gathered about the Customer from the public internet (including the Customer's needs and the products they already use).

  

  • Marketing Campaigns:

AI can help target the most appropriate Customers to ensure a successful outcome from Marketing campaigns.


 

So what is in it for the end Customers?


  • Today, Customers often find dealing with service agents painful - having to repeat information multiple times and not getting the correct resolution quickly. AI-powered servicing capabilities would lift the Customer experience to a new high.

  • When it comes to Sales and Marketing, AI would help Organizations proactively reach Customers with what they are really looking for, avoiding bombarding them with suggestions/offers they are not interested in.


These are only few of the common scenarios in CRM world and with AI capabilities in CRM the possibilities are endless.

Surely AI capabilities in CRM will greatly offload human effort, increasing efficiency and productivity while providing a modern and pleasant experience to Customers.

 Your views are all welcome.

 

Rolling Forecast, its advantages and its implementation in Hyperion Planning

Overview of Rolling Forecast:

A dynamic business environment and economic challenges have made businesses re-think their planning strategies, and the rolling forecast methodology helps deal with such uncertainties effectively. An annual budget becomes redundant once the actuals of the first month or quarter are available, which makes a re-forecast a necessity. With re-forecasting you always have the most recent numbers, leading to better-informed decisions.

A rolling forecast is a forward-looking method based on the most current data. The granularity of these forecasts is not too high, so they take a shorter time to prepare. Rolling forecasts can be done on either a monthly or a quarterly basis, and each has its pros and cons: a monthly forecast requires more effort, while a quarterly forecast gives out less timely information. Whether to forecast monthly or quarterly should depend on the nature of the business - a business in a highly dynamic environment might need a monthly rolling forecast, while one in a fairly static environment following a cyclical trend can opt for quarterly forecasts.

Traditional forecasting vs Rolling forecast

The traditional forecasting process focuses on the current year and takes place monthly or quarterly. Its values derive from prior plan, actual or forecast data that is not the most recent, and the process itself takes 2-3 days. It is therefore less accurate and does not provide quality data for business decisions.

A rolling forecast is a continuous process focused on the forward movement of data - it always looks ahead at the next 12-18 months - so no separate annual planning and forecasting cycle is required.

Why Rolling Forecasts?

  • It is a forward-looking process and hence eliminates the need for annual planning/forecasting.
  • It is ongoing and revised regularly with the most recent data, so you have a better picture of the business, can check the impact of previous strategy and make necessary corrections.
  • It is a driver-based planning and forecasting method, so values are derived from underlying business drivers.
  • It also reduces resource effort and cost, and hence increases productivity and profitability.

In short, the rolling forecast is a more proactive approach: it focuses more on drivers and analysis than on data gathering, involves more teams/departments, and hence provides the most recent and accurate data for analysis and decision-making.

Rolling forecast in Hyperion Planning

On the user side, implementing a rolling forecast in Hyperion Planning affects the data source (a re-forecast of future periods). The application administrator has to provide functionality that addresses frequent changes to dimension members and create the calculation logic behind the rolling behavior.

The below changes are required to implement a rolling forecast in a Hyperion Planning application:

  i. Changes in substitution variable

In this first step we set the substitution variables for the respective scenarios.

Substitution variables are central to a rolling forecast implementation. A substitution variable is a placeholder for members that change regularly; the most common usage is for periods such as the current month or current year. Once the value of a substitution variable changes, the update propagates to every location where that variable is used, which saves a lot of effort and makes application maintenance easier.

You can have a MaxL script set the value for each scenario; every month the MaxL is run again to change the value of each substitution variable accordingly.
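As an illustration, the monthly MaxL statements could be generated from the current date by a small script. The application name and the variable names (CurrMonth, CurrYear) below are hypothetical, and the exact MaxL grammar may vary by Essbase version - treat this as a sketch of the idea:

```python
from datetime import date

def maxl_for_rolling(app, today):
    """Render MaxL that advances the rolling-forecast window by setting
    hypothetical CurrMonth/CurrYear substitution variables from a date."""
    months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
              "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
    return [
        f"alter application '{app}' set variable 'CurrMonth' "
        f"'{months[today.month - 1]}';",
        f"alter application '{app}' set variable 'CurrYear' "
        f"'FY{today.year % 100}';",
    ]

# Running this at the end of September 2017 would emit:
cmds = maxl_for_rolling("PlanApp", date(2017, 9, 30))
```

Scheduling such a generator monthly means no one has to hand-edit the MaxL, and every form, rule and report that references the variables rolls forward automatically.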

ii. Changes in planning application/ web forms

As new scenarios are brought into the planning application, the layout of the data forms needs to be updated so users can view them. Web-form changes include putting the Year dimension on the Page axis and placing all twelve months in the columns, making it simpler for users to move from one year to the next.

For a year-over-year forecast, the web form will have multiple columns to show the periods, which can extend across years. For example, in a 12-month rolling forecast, forecasting in January covers periods up to December of the same year, while forecasting in any other month extends into the next year.
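The period logic above can be sketched as a small helper that lists the months covered by the rolling window (illustrative only; month labels are assumed):

```python
def rolling_window(start_month, start_year, horizon=12):
    """Return the (month, year) pairs covered by an n-month rolling
    forecast beginning at the given period."""
    months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
              "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
    out, m, y = [], start_month - 1, start_year
    for _ in range(horizon):
        out.append((months[m], y))
        m += 1
        if m == 12:          # roll over into the next year
            m, y = 0, y + 1
    return out

# Forecasting in January stays within one year;
# forecasting in any later month spills into the next.
jan = rolling_window(1, 2017)
sep = rolling_window(9, 2017)
```

The January window ends at December 2017, while the September window runs through August 2018 - exactly why the form needs columns spanning two years.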

iii. Changes in Business Rules/member formula/calculation scripts

Changes to the calculations are also required so they accommodate rolling year over year. Using substitution variables here is considered best practice and is very helpful in a rolling forecast implementation.

Conclusion:

The rolling forecast is an innovative approach businesses use nowadays to remove the burden of an annual plan. Although it looks similar to the traditional method, the logic is completely different: it is an ongoing, driver-based, forward-looking process that lets businesses focus on the data they can control.

Weighing these advantages of the rolling forecast model, we successfully implemented it in our existing system.


September 11, 2017

Get IFRS15, ASC606 Ready - The Easy and Quick Way

 

The accounting standard for recognizing revenue is changing.  For newcomers, let me briefly describe the change.

What's the Change?

For companies in countries using IFRS, it means they now need to account for revenue as per "IFRS 15 - Revenue from Contracts with Customers" instead of the "IAS 18 - Revenue" standard.

US-based companies reporting under US GAAP now have to account for revenue as per the new "ASC 606 - Revenue from Contracts with Customers" instead of the old "ASC 605 - Revenue Recognition".

The old IAS 18 standard (issued by the International Accounting Standards Board, IASB) and ASC 605 (issued by the Financial Accounting Standards Board, FASB, for US companies) had substantial differences. The new standards - IFRS 15 and ASC 606 - are now synchronized.

The new standard outlines five logical steps for revenue recognition:

  1. Identify the contract with the customer

  2. Identify the performance obligations in the contract

  3. Determine the transaction price

  4. Allocate the transaction price to the performance obligations

  5. Recognize revenue when (or as) each performance obligation is satisfied

What's the big deal?

So - what's the big deal? The accountants will take care of it - should the rest of you be worried? Accounting changes happen all the time, so what's new now?

This is a big deal because it impacts the most important numbers on your P&L - the top line and the bottom line - and many other critical aspects like the taxes to be paid, the annual plans, and probably the commissions and bonuses to be paid as well.

This is also big because the change is complex, especially if you have bundled deals (as the telecom and hi-tech industries do). Not all accounting software can account per the new standards, and beyond the accounting systems, business processes and perhaps even business contracts might need modification.

So it is not just the finance folks and accountants - the board, CXOs, auditors, IT teams, planners, analysts, sales teams and HR compensation teams all need to understand the change and plan for its impacts.

Getting this wrong has a direct impact on all key stakeholders: shareholder value, employees (bonuses, commissions) and government (taxes).

By when do you need to be ready?

That depends on whether you apply IFRS or US GAAP. For most companies the standard has to be adopted from the financial year starting in 2018.

  • For "IFRS15" applying companies
    • The financial year starting "on or after January 1, 2018"

  • For "Public business entities" and "not-for-profit entities that are conduit bond obligors applying US GAAP" -
    • The financial year starting "on or after December 16, 2017"

  • For the other "US GAAP" companies
    • The financial year starting "on or after December 16, 2018"


Are you late in the game? Probably yes, but....

This is where Oracle - Infosys can help you.

The Oracle Revenue Management Cloud Service (RMCS) is tailor made to meet the IFRS15 / ASC606 requirements including the transition requirements. The product has been successfully implemented across industries and is a proven solution for IFRS15 / ASC606 needs.

The "IFRS 15 / ASC 606 solution" from Infosys is a complete solution to get IFRS 15 / ASC 606 ready - quickly and correctly. It encompasses program/project management, change management, implementation of RMCS, implementation of Financial Accounting Hub Reporting Cloud, and integration builds across various middleware. The solution creates a robust process with tight integration, ensuring automated reconciliation and no revenue leakage or discrepancies.

Considering the need to be ready on time, the solution prioritizes requirements so the MUST-HAVE requirements are developed, tested and ready on time.

Below is the overview of the Infosys Solution

How do we make a difference? How quickly can you be ready?

Infosys has been working on numerous IFRS 15 / ASC 606 implementations using Oracle RMCS. While a typical implementation takes 6-12 months depending on the number of integrations and use cases, we are able to cut the implementation time by at least 25% by leveraging an accelerator repository comprising:

  • Pre-built Use Cases

  • Key Decision Documents

  • Pre-Configured Instances

  • Data conversion templates

  • Configuration templates

  • Key learnings from other projects

Apart from the normal implementation, we also offer a rapid implementation which can be completed in a 3-month time-frame. A typical rapid implementation assumes no more than 2 integrations, conversions using FBDI templates, 30-40 use cases and 5 custom report developments. The typical rapid implementation plan would be as below.

Sample List of use cases for telecom:

  1. Billing & Satisfaction: Single Handset plus Plan

  2. Billing & Satisfaction: Contract Termination

  3. Billing & Satisfaction: Multiple Handsets plus Plan

  4. Billing & Satisfaction: Multiple Products and Plan

  5. Billing & Satisfaction: Contract Modification

  6. Billing & Satisfaction: Contract Add-on

  7. Downward Modification

  8. Loyalty Points - Termination

  9. Family Share - Multiple Lines

  10. Loyalty Points - Redemption

 

Sample List of Issues:

Below is a sample list of problem areas:

  1. How to determine the SSP (standalone selling prices)

  2. Significant financing components on the contracts

  3. Managing impacts on cost - both direct and indirect

  4. Managing Discounts

  5. Managing Variable considerations
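To make the SSP question concrete: the standard's allocation step spreads a bundled transaction price across performance obligations in proportion to their standalone selling prices. A minimal sketch with hypothetical numbers (the bundle composition and prices are invented for illustration):

```python
def allocate_price(bundle_price, ssps):
    """Allocate a bundled transaction price to each performance obligation
    in proportion to its standalone selling price (the IFRS 15 / ASC 606
    allocation step)."""
    total = sum(ssps.values())
    return {item: round(bundle_price * ssp / total, 2)
            for item, ssp in ssps.items()}

# Hypothetical telecom bundle: handset + 24-month plan sold together for 900,
# while the standalone selling prices are 500 and 700 respectively.
alloc = allocate_price(900, {"handset": 500, "plan": 700})
```

Here the 300 bundle discount is shared pro rata: the handset is allocated 375 and the plan 525, so revenue for the handset is recognized at delivery while the plan's share is recognized over the contract term.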


Get Started now, this is your last chance...

If you have still not started on the IFRS 15 / ASC 606 journey, you need to start now. With Oracle RMCS and Infosys's experience implementing it, you still have a chance to be ready on time.

 

Meet our experts at Booth 1602, Oracle OpenWorld 2017, to see a demo of the solution.

 

 



September 8, 2017

Features that really matter to Project Accountants

Oracle has strengthened the already proven Projects track in Fusion Cloud. I would like to highlight the key features introduced in Fusion Cloud Project Costing and Billing that make this application a winner and help project accountants efficiently close their books and report on projects.

Fusion Cloud Project Billing: -

·         One of the key features of Fusion Cloud Projects is the flexibility to maintain billing rules separately from the project setup. Billing rules can be updated even after the project is created and cost collection has started - functionality that is badly needed when business development or pre-contract costs must be collected before the contract is finally negotiated with the client.

·         Fusion Cloud Billing allows manual invoices to be created from the Projects module, giving the flexibility to pick which eligible transactions should be billed rather than having the system pick all eligible billing expenditures/events.

·         Pro forma/draft invoices can be previewed using the default templates provided in Fusion before interfacing to AR. Custom templates can also be developed in BI Publisher based on client requirements.

·         Invoicing can now be done on transactions in T&M projects without waiting for the cost transactions to be accounted.

·         Billing offset functionality has been provided to streamline deferred revenue and unbilled receivables balances.

       Fusion Cloud Project Costing: -

·         Fusion Cloud Project Costing now has a much-streamlined cost collection process: instead of running the cost import and then distributing the cost, project accountants can import and distribute cost in one go.

·         Manual project subledger journals can be created directly from Projects by project accountants, reducing dependency on the GL team. These journals can be used to capture accruals or prepayments.

·         The cascading option for project dates is a new feature that lets project managers choose whether to update task start dates, task finish dates, or neither when project dates change.

·         Excel-based uploads for project creation, assets, costs and budgets provide fast transaction upload and bring much-needed efficiency to these processes.

·         Exception-based reporting helps project accountants resolve issues as they occur rather than relying on exception reports run during month end or depending on the support team. Exceptions and warning messages in the various programs are self-explanatory and state the actions required to resolve them.

OTBI reporting - particularly the project cost reconciliation and accounting analysis reports - provides accurate real-time reporting to project accountants.

September 7, 2017

Analytics and the APP!

Introduction:


Mobile devices have brought about a giant leap in the modern world, providing myriad combinations of services that users can leverage depending on their needs - or, more often, creating opportunities for new needs. While they have stamped their presence on most avenues of daily life, there are still areas where their application is recognized but has yet to reach its full potential. One such area of opportunity, especially in the world of IT, is the use of analytics on mobile devices. The following scenarios explore this prospect in further detail.