Infosys’ blog on industry solutions, trends, business process transformation and global implementation in Oracle.

August 9, 2018

Gamify your Sales Automation (Baseball theme example included!)

Looking for a way to bring some excitement, motivation and a sense of competition to your sales force without necessarily spending extra dollars on incentive payouts? Do you want to improve user adoption of your sales automation application without spending time and effort on retraining, or having to listen to complaints from the sales team about how the system is no good? Gamification may be the answer you are looking for.

Gamification in sales automation refers to creating game-like scenarios built on principles such as earning points, rankings and competition to motivate your sales teams in a non-monetary way, although in some cases the points earned may also be redeemed for non-cash incentives if the organization so chooses. The objectives for gamification may be manifold:

Process adherence Organizations may have trouble getting their sales teams to follow recommended sales processes. Examples include updating contact information or capturing minutes of meetings with clients. Such activities may even seem trivial to sales managers, who do not wish to spend time discussing these items with their teams and would rather spend their time on more 'important' matters like specific opportunity details, sales forecasts, pipeline reviews etc. Gamification can address such situations effectively by reinforcing ideal behavior through awarding points to sales reps who follow the recommended sales process.

User Adoption Organizations implement sales automation software only to find that their sales teams cannot be bothered to use it. Gamification can give sales reps and managers a reason to start using the application and lead them to an understanding of the benefits of sales automation.

Sales Engagement Sales resources tend to work in isolation. They are on the road constantly meeting clients and prospects, and there isn't enough time to build employee engagement. Internal office meetings tend to be formal reviews and planning exercises, which can be quite serious affairs. Gamification can help reduce tensions within sales teams, bring some fun into the office culture and foster good-natured competition and a feeling of 'know thy team'.

Gamify Sales Activities

Below are some examples of how simple gamification can be designed for routine sales activities using a points system. Sales reps can be notified of the points they have accumulated and ranked vis-à-vis other sales reps (a sample scoring query follows the examples below).

Gamify activities performed on Lead and Opportunity objects by assigning them suitable points. For example:

  • Creating a Lead gets you 1 point.
  • If the Lead is Qualified, you get 2 Points.
  • If the Lead is converted to an Opportunity, you get 3 points, and so on.
Similar gamification can be performed on Account and Contact objects. For example,

  • Creating an Account gets you 1 point
  • Adding a contact to the account gets you 2 points
  • If the contact is a decision maker (Title VP or higher), you get 3 points and so on.
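As a minimal sketch of how such a points model could be computed, assuming a hypothetical LEADS table with owner_id and status columns (names are illustrative only, not from any specific sales automation product):

    -- Minimal sketch: award lead points per sales rep from a hypothetical LEADS table.
    -- Status values and point weights mirror the example above; all names are illustrative.
    SELECT owner_id AS sales_rep,
           SUM(CASE
                 WHEN status = 'CONVERTED' THEN 3   -- lead converted to an opportunity
                 WHEN status = 'QUALIFIED' THEN 2   -- lead qualified
                 ELSE 1                             -- lead created
               END) AS lead_points
    FROM   leads
    GROUP  BY owner_id
    ORDER  BY lead_points DESC;

The same pattern extends to Account and Contact objects by weighting rows in those tables.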
Gamify Sales Performance Metrics

Sales performance metrics can be mapped to sports themes. Below is an example of mapping them to a baseball theme (a sample query follows the list). Such gamification can then be included as part of the sales rep's profile, which is visible to everybody in the organization. Similar themes around other sports or games can be creatively designed.

  • Batting Average- % of Leads that get converted to Opportunity. A batting average of 0.250 means 1 out of 4 Leads are getting converted
  • On Base Percentage- % of Leads that get converted to Opportunity but also includes walks (standalone opportunities)
  • Slugging Percentage- % of Won Opportunities upon total opportunities (A slugging percentage of 0.500 means 50% of Opportunities are won)
  • Home Runs- Number of high value opportunities won (say above 10k)
  • Hits- Number of Opportunities that reached a particular sales stage (say Submit Quote)
  • Runs Scored- Number of Opportunities Won
  • Assists- Number of Opportunities won where the salesrep is not the owner but on the sales team
  • Errors- Number of Stale Opportunities
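As a minimal sketch of the metric calculations, assuming a hypothetical OPPORTUNITIES table with owner_id, status and amount columns (illustrative names only):

    -- Minimal sketch: a few baseball-style metrics per sales rep.
    SELECT owner_id AS sales_rep,
           ROUND(SUM(CASE WHEN status = 'WON' THEN 1 ELSE 0 END)
                 / NULLIF(COUNT(*), 0), 3)                        AS slugging_pct, -- won / total opportunities
           SUM(CASE WHEN status = 'WON' AND amount > 10000
                    THEN 1 ELSE 0 END)                            AS home_runs,    -- high-value wins
           SUM(CASE WHEN status = 'WON' THEN 1 ELSE 0 END)        AS runs_scored   -- total wins
    FROM   opportunities
    GROUP  BY owner_id;

Batting average and on-base percentage would be computed the same way from a leads table, dividing converted leads by total leads.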
Note that control functionalities may have to be built into the game mechanics to ensure that sales users don't enter dummy or wrong data just to win points or improve their score.

Hope the above gave you some ideas on how you can gamify your sales teams!

August 7, 2018

OAC Essbase Application Migrations from On-Premise


Introduction

I am working on an Oracle EPM Cloud implementation project focused on migrating on-premise Essbase applications to Oracle Analytics Cloud (OAC) Essbase. OAC Essbase provides a few utilities that can be used to export applications from the on-premise environment and import them into OAC. This post explains how to migrate Essbase applications to OAC using these utilities.

 

Utilities

Below are the utilities downloaded from the OAC environment. I downloaded the Export Utility and the Command Line Tool to my local machine for migrating on-premise applications to OAC Essbase.

                1. EssbaseLCMUtility

                2. CommandLineUtility

 

[Screenshot: Utilities.png]

 

Prerequisites

To use the Essbase LCM utility and the Command Line Interface, Java JDK 8 should be installed and the JAVA_HOME path should be set.

On-premise Essbase applications should be converted to Unicode mode (UTF-8 encoding) before migrating to the OAC Essbase environment. I used a MaxL script to convert the application to Unicode mode.

I also changed the server-level variables to application-level variables.

Exporting On-Premise Application

I followed the below steps to export the on-premise application using the Essbase LCM Utility.

Open a command prompt and change the directory to 'D:\EssbaseLCMUtility', where I downloaded the utility.

       Run the below command to export the application from Essbase. This command exports the data and artifacts.

                EssbaseLCM.bat export -server test.xxx.local:1423 -user TestAdmin -password ******** -application TEST -zipfile TEST.zip -skipProvision

 

[Screenshot: Export Application - Progress (Export.png)]

Application Export Folder

The application folder is exported to the EssbaseLCMUtility folder.

[Screenshot: ExportLocation.png]

 

Importing the application into OAC

I manually copied the exported application (TEST.zip) to the CommandLineUtility home folder.

[Screenshot: CommandLineUtility.png]

 
 


·         I executed the below CLI commands to import the application into OAC Essbase.

·         Set the Java Home and CLI Home:

                                SET CLI_HOME=D:\CommandLineUtility

                                SET JAVA_HOME=C:\Program Files\Java\jdk1.8.0_161

·         Log in to OAC:

                                esscs login -user TestAdmin -url https://test.xxx.com/essbase

·         Import the application into OAC:

                                esscs lcmimport -v -zipfilename TEST.zip

 

[Screenshot: Import Application - Progress (Import.png)]

 

Application in OAC Environment

The application migration is successful. Go to the OAC application console and refresh the application list to see the migrated application.

 

[Screenshot: Console.png]

 

Reference(s):

1. https://docs.oracle.com/en/cloud/paas/analytics-cloud/essug/preparing-migrate-premises-applications-cloud-service.html

2. http://www.redstk.com/migrating-an-essbase-cube-to-the-oracle-analytics-cloud/

 

Automating the Purge Job for Usage Tracking Data

 

Why Usage Tracking?

Usage Tracking helps measure and monitor user interactions with OBIEE. It provides a deep understanding of usage statistics and performance bottlenecks in the reporting application.

The usage tracking functionality creates an entry in the S_NQ_ACCT table whenever a user executes a report in the application.

This table captures metrics such as report execution time, report start/end time, user ID, etc.

When usage tracking is enabled, it helps determine which user queries are creating performance bottlenecks, based on query response time.

It also provides information on frequently accessed reports. Enabling it involves Enterprise Manager setup changes and RPD changes.
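For example, a minimal sketch of a query to surface the slowest requests recorded by usage tracking might look like the following (column names such as USER_NAME, SAW_SRC_PATH, TOTAL_TIME_SEC and ROW_COUNT can vary by OBIEE version, and FETCH FIRST requires Oracle 12c or later):

    -- Minimal sketch: slowest requests captured by usage tracking over the last 7 days.
    SELECT user_name,
           saw_src_path   AS report_path,
           total_time_sec,
           row_count,
           start_dt
    FROM   s_nq_acct
    WHERE  start_dt >= SYSDATE - 7
    ORDER  BY total_time_sec DESC
    FETCH FIRST 20 ROWS ONLY;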

Why Automate the Data Purge?

For a reporting application that receives user requests every minute, Usage Tracking will generate a huge volume of data. This data gets written to the S_NQ_ACCT database table. Purging data from this table periodically is essential; otherwise, reports created on top of usage tracking data will perform poorly. Manually purging this data requires intervention from the database team and adds overhead to application maintenance.

We can automate purging of the S_NQ_ACCT table using BI Publisher. This automation will work for any data-purging need, and it can be built entirely with the technology stack that already exists with the BI application; there is no need to involve any additional tools.

Steps to Automate:

  1. Create a BI Publisher data source with READ WRITE user for OBIEE meta-data schema.

  2. Create a database package which deletes records from S_NQ_ACCT table.

  3. Create a BI Publisher data-model to invoke the DB package via Event Trigger.

  4. Create a scheduled job which will invoke the data-model periodically.

 

  1. Create a BI Publisher Data Source

     

Go to BI Publisher Administration and click on JDBC Connection.

Click on "Add Data Source".

Enter the following details for a new connection:

Data Source Name: Give a data source name.

Username: Enter a user name that has read and write access to the OBIEE metadata schema.

Password: Enter the password associated with the user name.

Click on Test Connection; a confirmation message will be displayed.

  3. Create a Data Model to know how many records got purged

    Go to New Tab → Data Model

Enter the following details:

Default Data Source: Select the data source created in the above step from the dropdown.

Oracle DB Default Package: Enter the name of the package created in the OBIEE metadata schema in the database.

 

The package code is attached for reference.
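A minimal sketch of what such a purge package could look like is shown below, assuming the convention that the BI Publisher data model parameter is exposed as a package variable and that the event trigger function returns a BOOLEAN; the package, function and variable names are illustrative, and the actual attached package may differ:

    -- Minimal sketch of a purge package; names are illustrative only.
    CREATE OR REPLACE PACKAGE usage_tracking_purge AS
      m NUMBER;                                 -- days of usage tracking data to retain
      FUNCTION purge_s_nq_acct RETURN BOOLEAN;  -- called by the After Data event trigger
    END usage_tracking_purge;
    /

    CREATE OR REPLACE PACKAGE BODY usage_tracking_purge AS
      FUNCTION purge_s_nq_acct RETURN BOOLEAN IS
      BEGIN
        -- Same purge logic as the DELETE statement shown later in this post.
        DELETE FROM s_nq_acct WHERE start_dt < SYSDATE - m;
        COMMIT;
        RETURN TRUE;
      END purge_s_nq_acct;
    END usage_tracking_purge;
    /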

 

 

 

Click on Data Sets and select SQL Query.

 

Enter the following details:

Name: Enter a data set name.

Data Source: Select the newly created data source from the dropdown.

Write the query and click on OK.

Query: SELECT COUNT(1) num_deleted_records FROM APLINST02BI_BIPLATFORM.S_NQ_ACCT WHERE start_dt < SYSDATE - (:m)

 

Create an Event Trigger so that the purge package is invoked after the data model runs when the report is triggered by the scheduler.

Enter the Event trigger details as below

Name: Enter the name of the Event Trigger

Type: After Data

Language: PL/SQL

The Oracle Default Package will be populated automatically. Select the appropriate function from the Available Functions section and move it to the Event Trigger section by clicking the ">" icon.

 

Now click on Parameters and define the parameter to be passed to the event trigger. This parameter carries the number of days of data to retain; records older than that many days are purged from the S_NQ_ACCT table using the logic below.

DELETE FROM Schema_Name.s_nq_acct WHERE start_dt < sysdate - m;

 

 

 

  4. Create an RTF template and schedule a job to automate the purge

Go to New → Report

Click on Upload

The RTF document is attached for reference.

Save the report in the required folder after providing the above details.

Now click on More → Schedule.

Enter the value for the parameter 'm' (the number of days of data to retain).

Enter the details for the scheduler as below:

Name: Give a name for the output file.

Layout: This will be populated automatically.

Format: Select the required format of the report from the dropdown.

In the Destination section, select the destination as Email or FTP and provide the details accordingly.

In the Schedule tab, specify the frequency at which the job should run.

 

Now click on Submit Job in the top right corner. A job will be scheduled as per the details provided.

August 6, 2018

Mitigating Low User Adoption in Sales Automation

So the project went live perfectly. It was on time, on budget, and all the success criteria were met. Except one. This is a nightmare scenario for many project managers and sponsors: the sales automation project that they worked so hard on for many months and executed perfectly (in their opinion) does not seem to enthuse the end users, resulting in the very common problem of low user adoption. In this blog we are specifically talking about low user adoption in sales automation projects, although many aspects could be common to other endeavors as well.

Below are the major causes of, and mitigations for, low user adoption:

Lack of Process Adherence or "We don't work that way"

Often in the hurry to implement 'best practices' and a 'vanilla solution', short shrift is given to some core processes in the sales organization. Sometimes, in a global implementation, processes are 'standardized' without a real buy-in from regional stakeholders who may perceive that their way of doing business has not been heard sufficiently. 
Mitigation: Get explicit sign-offs and buy-in from stakeholders when processes get modified. Build in customizations where required to ensure core processes are protected.

Lack of Trust or "Is my data secure?"
Another reason your sales reps may be reticent about sharing information in the sales automation application is lack of trust. For sales reps, their contact and account information is gold. They do not want just anybody in the organization having access to their contact and account details. Sales teams may not have a problem with their managers accessing data, but may not want, say, the marketing team to get access to their contact details without their knowledge. If their misgivings in this regard are not addressed, you will find that they may not be updating their most important information.
Mitigation: Most software today comes with configurable security parameters. You should ask your SI to implement suitable security configurations that balance the need for sales effectiveness with the trust concerns of your sales teams.

Lack of Business Commitment or "Even my Manager doesn't use it"
Many times, sales automation projects focus only on direct sales reps as the end users. This is a mistake: although direct sales reps may form the largest part of the sales force, when other sales roles like sales managers, partner managers and key account managers are not included, the direct sales team perceives that it has been saddled with an unnecessary burden. This results in them not taking the implementation seriously, and hence in low user adoption.
Mitigation: It is important that companies take a strategic view when it comes to sales automation and implement functionality that benefits their entire sales organization. Hence we recommend implementing modules like Sales Forecasting Management, which requires sales managers to review forecasts from their reps and in turn submit them to their own managers. Modules like Partner Relationship Management are used by partner managers to manage sales processes through the partner organization. Customer Data Management and Incentive Compensation functionalities involve the sales operations teams in ensuring data quality and sales incentives through the sales automation product.

Lack of Knowledge or "Not sure how it works"
Most SIs and implementation consultants work on the "Train the Trainer" model, where key users from the sales organization are trained on various aspects of the application. It is then expected that these key users will in turn go back and ensure quality end-user training. Many companies ignore or do not pay enough attention to this phase of the project since vendors are not involved in it. It is not surprising, then, that end users forget the inadequate training they received and go back to their old way of doing things.
Mitigation: It is important that enough thought is put into the end-user training as well. If the number of end users is large, it should be treated as a separate project, and vendors can be involved in this phase as well. Sufficient and appropriate training collateral should be developed, and the rollout should be planned so that individual attention can be given to each participant in the training sessions. Follow-up or refresher training can also be organized on a need basis.

Lack of Productivity or "The Old way was better"
Although sales effectiveness, improved sales reporting and sales collaboration are all important reasons to implement a sales automation application, user adoption of such software will suffer if sales reps feel that these benefits come at the cost of their productivity. Companies should guard against building a 'comprehensive solution', as that may mean sales reps have to spend more time on the application when they would rather be selling and having face time with their prospects and customers.
Mitigation: Sales productivity should be an important success metric and included in all requirements and design conversations. Data entry requirements should be kept to the minimum mandatory fields, with the rest of the fields optional. Application performance should be tested comprehensively so that improvements can be made before go-live. Mobility and Outlook sync functionalities should be explored to improve productivity.

Lack of Perceived Value or "What's in it for me?"
This is perhaps the most important question that needs to be answered in terms of user adoption. Unless the sales automation application helps sales reps meet their personal career goals, they are not going to spend time on it. It is important that they perceive the application as a tool that will improve their sales effectiveness, help them get recognition, and advance their careers.
Mitigation: Sales automation software should focus on sales effectiveness improvements, which could mean sales collaboration and new technology interventions like AI/ML to help the sales rep focus on the important leads and improve win rates, along with intelligent analytics that provide not just information but also insights on key concerns and suggest a workable plan of action. Sales performance and gamification solutions can work on top of the base solution to provide value in real terms to the sales users.

Keeping Track
It is important to measure user adoption through analytical reports, even after applying many of the above mitigation measures, to understand its status. Reports should give an adoption breakdown by region, role, etc. to answer questions like: which sales roles are using or not using the application? Which country's users are lagging behind? Answers to such questions will help the IT organization take suitable interventions and corrective measures. All the best on your user adoption journey!

Oracle Data Visualization (DVD/DVCS) Implementation for Advanced Analytics and Machine Learning

Oracle Data Visualization Desktop (DVD) or Data Visualization Cloud Service (DVCS) is a very intuitive tool that helps every business user in the organization create quick and effective analytics very easily. People at all levels can leverage the benefit of blending and analysing data in just a few clicks and help the organization take informed decisions using actionable insights. Oracle DVD is a Tableau-like interactive tool that helps create analysis on the fly using any type of data from any platform, be it on-premise or cloud. The main benefits of Oracle DVD are below:

·         A personal single-user desktop tool, or a SaaS cloud service, which can be leveraged by any business user in the organization

·         Enables the desktop user to work even offline

·         Completely private analysis of heterogeneous data

·         Business users have full control over their datasets/connections

·         Direct access to on-premise or cloud data sources

·         Administration tasks are removed completely

·         No concept of remote server infrastructure

Oracle DVD/DVCS enables the business user to perform analysis using traditional methodologies and also provides the capability to perform Advanced Analytics using R and to create predictive models using machine learning algorithms in Python.

This simple and intuitive tool provides a unique way to enable you to perform Advanced Analytics by just installing the required packages. DVML (Data Visualization Machine Learning library) is the utility that helps you install all the required packages for implementing machine learning algorithms for predictive analysis in one go.

The Install Advanced Analytics (R) utility helps you install all the required R packages to perform Advanced Analytics functions like regression, clustering, trend lines etc. However, to run both utilities on your personal system/server, you need administrative access as well as internet access and permission to automatically download all the required packages.


In the sections below we discuss how to leverage Advanced Analytics and machine learning functions to provide predictive analytics for the organization.

In order to create a trend line graph, we need to enable Advanced Analytics and then pull the required columns into the analysis.

Trend Line Function: This function takes a measure, an optional series/partition specification, a model type and a result type, and visualizes the data in a trending format.

Syntax: TRENDLINE(numeric_expr, ([series]) BY ([partitionBy]), model_type, result_type)

Example : TRENDLINE(revenue, (calendar_year, calendar_quarter, calendar_month) BY (product), 'LINEAR', 'VALUE')

We need to create various canvases and put them into one storyline, providing a corresponding description on each canvas. While creating a trend line visualization, we need to provide the confidence level; by default it uses a 95% confidence level for the trend estimate.



August 3, 2018

OAC-Essbase Data Load & Dimension Build Using CLI


 

Introduction

I am working on an Oracle EPM Cloud implementation project focused on migrating on-premise Essbase applications to Oracle Analytics Cloud (OAC) Essbase. OAC Essbase provides a Command Line Interface (CLI) that can be used for data load and dimension build in OAC Essbase applications. This post explains how to use the utility for data load and dimension build in OAC.


Utilities

Command Line Utility - the Command Line Tool can be downloaded from the OAC Essbase instance to the local machine to perform Essbase data load and dimension build tasks.

[Screenshot: Utilities.png]

Setting up CLI environment

  • Open the command prompt and change the directory to CLI home directory.
  • To use Command line interface, Java JDK 8 should be installed and the JAVA_HOME path should be set
  • Set the CLI Home and Java Home:

                                SET CLI_HOME=D:\CommandLineUtility

                                SET JAVA_HOME=C:\Program Files\Java\jdk1.8.0_161

 

Logging into OAC Essbase through CLI

Before performing Dimension build and data load activities, we need to be logged into OAC Essbase.

Logging into OAC using admin id:

D:\CommandLineUtility> esscs login -user TestAdmin -password ****** -url https://test.OAC.com/essbase

                user " TestAdmin " logged in with "service_administrator" role

                Oracle Analytics Cloud - Essbase version = 12.2.1.1.112, build = 211

 

Create a Database Local Connection

The local database connection can be created using the CLI command createLocalConnection. It takes the required JDBC connection details as arguments.

Command Syntax:

D:\CommandLineUtility>esscs createLocalConnection -name oraConn -connectionString jdbc:oracle:thin:@DevDW:1XXX/DevID -user DB_USER

                Connection already exists, it will be overwritten

                Enter Password:

                User credentials stored successfully



Essbase Dimension Build

  • Run the dimbuild command with the stream option.
  • A database query is required either in the rules file or must be provided as an argument to dimbuild; if not given in the command, it is taken from the rules file.
  • The streaming API is used to push the results from the database to the cube.

Command Syntax:

D:\CommandLineUtility>esscs dimbuild -application TEST -db TEST -rule Acct.rul -stream -restructureOption ALL_DATA -connection oraConn

                Streaming to Essbase...

                Streamed 9 rows to cube

 

Essbase Data Load

  • Run the data load command with the stream option.
  • A database query is required either in the rules file or must be provided as an argument for the data load; if not given in the command, it is taken from the rules file.

Command Syntax:

D:\CommandLineUtility>esscs dataload [-v] -application TEST -db TEST -rule DataLoad.rul -stream -connection oraConn

                Streaming to Essbase...

                Streamed 10 rows to cube


Reference(s):

1. https://docs.oracle.com/en/cloud/paas/analytics-cloud/essug/command-line-interface-cli.html

2. https://support.oracle.com/epmos/faces/DocumentDisplay?_afrLoop=442454943227210&parent=EXTERNAL_SEARCH&sourceId=HOWTO&id=2259032.1&_afrWindowMode=0&_adf.ctrl-state=aphtjq7kl_132

 


July 14, 2018

Robotic Process Automation - Capabilities Overview

 


Understanding the Basics

 


 

 

Introduction

Every few years, the IT/ITeS industry sees a new product or technology that brings exciting new UI features, capabilities for users to configure applications easily, and a lot of new buzzwords and concepts for technology enthusiasts to get accustomed to.

Robotic Process Automation and Artificial Intelligence are two such buzzwords which have excited fast-growing organizations and the IT industry equally and are catching on fast.

While for organizations it opens up new avenues to achieve higher operational efficiency and cost reduction, eventually impacting the bottom line, for the IT industry it opens up new horizons to grow the client base with new offerings and increase the technological footprint.

In this blog, we'll focus on an overview of Robotic Process Automation - a basic understanding of RPA, the types of RPA and what encourages fast-growing organizations to go for it.

 

What is RPA?

In this competitive consumer market, organizations face a perpetual challenge: moving swiftly and keeping the cost of operations low while increasing consumer satisfaction levels and service offering quality.

Organizations know that to reduce costs they have to achieve higher operational efficiencies; however, hiring more staff to achieve that directly impacts the bottom line, and increasing working hours or paying overtime to existing staff again dents the profits. To alleviate these challenges, organizations are finding their savior in 'Robots'.

In simple words, RPA tools (robots) emulate the manual steps performed by users across and through applications via their UIs. RPA tools operate by mapping a rule-based workflow process which the "robot" can follow to the 'T'.

An important point to note here is that these robots can be implemented agnostic of system or application. They can range from simple batch file upload automation to advanced cognitive automation with self-learning and variable-format processing capabilities. Processes can be triggered manually or automatically to:

·         Populate data across different systems and modules within them

·         Run queries on a scheduled basis and perform data reconciliation

·         Generate and distribute reports

·         Audit large volumes of data

·         Trigger downstream activities and processes

Studies conducted by various institutions and agencies have found that in large organizations there is typically scope to save 20-40% of employees' workload with the help of automation; imagine what organizations can achieve with employees having 20-40% more of their time to focus on value-added tasks.

 

Types of Automation

As mentioned above, there is a plethora of tasks that robots can do, ranging from simple, mundane activities like data entry to highly complex activities like generating a dynamic response to a user query based on machine learning and cognitive abilities.

There are different stages or levels of Intelligent Automation:

·         Digital Worker - As evident from the name, this is the primary or entry level of automation an organization can move ahead with and still achieve efficiency gains. This is the typical Robotic Process Automation tool, which can perform tasks like:

o   Data entry

o   Running functions in excel towards data validation

o   Triggering customized emails with preset content or standard templates

o   Data comparison

o   Setting up reminders

o   Batch processing and populating mapped fields

o   Queuing and Assignment

 

·         Digital Reader - This is a second level of automation, alternatively referred to as 'Cognitive Automation'. Robots at this stage can perform tasks involving:

o   Machine Learning

o   Pattern or Keyword based recognition which is evolving over time as Robot sees and identifies more patterns / keywords

o   Data processing across variable formats

o   Dynamic queue assignment based on patterns

o   Complex analysis based on continual learning 

 

·         Digital Talker - This is an automation offering focused on providing a more interactive experience. Alexa from Amazon and Google Home are very popular examples. These robots are also called 'ChatBots', as they perform somewhat similar tasks to the previous classification, the Digital Reader; however, they have additional text and voice capabilities and are more communication focused. Robots at this stage can perform additional tasks involving:

o   Predictive Analysis

o   Customer Servicing

o   Query Resolution based on pattern or keyword based recognition which is evolving over time as Robot sees and identifies more patterns / keywords

 

·         Digital Thinker - This is the advanced level of automation: classic Artificial Intelligence. Artificial Intelligence tools are somewhat comparable to humans in terms of intelligence and have their own IQ. Currently, the IQ of these AI tools is significantly lower than that of humans. Per a study performed in 2016, the IQ of Google's A.I. (47.28) is nearly two times that of Siri (23.94); however, a six-year-old child beats both of them when it comes to smartness and thinking capability. An average person's IQ is in the range of 85-114.

As the IQ of these applications or tools increases, up to a certain point it will be beneficial for people; once the IQ surpasses that of the average human, we all know what will happen - we have all seen the sci-fi movies :)

Nonetheless, the Digital Thinker can perform the below activities in addition to those listed for the previous categories:

o   Predictive Analysis based on cognitive learning and complex algorithm 

o   Complex Mathematical Analysis

 

 

RPA Benefits

 

[Image: RPA_Diagram_1.jpg]

Conclusion

Organizations need to be smart enough to understand their IT landscape and business process steps and identify the right tasks that can be automated with the help of robots. Although competitors might be at a higher level of automation, an organization needs to be realistic in its approach and move through the automation stages with a proper strategy and careful planning to reap the benefits of automation.

 

Reference

1. https://www.cnbc.com/2017/10/02/google-ai-has-almost-twice-the-iq-of-siri-says-study.html

2. https://en.wikipedia.org/wiki/Robotic_process_automation

July 5, 2018

GDPR - Its impact on CX


The General Data Protection Regulation (GDPR) came into force in European Union (EU) member states on 25th May 2018. It has far-reaching ramifications for businesses and organizations, given that data is ubiquitous and all businesses today rely on customer data to remain competitive in their industry and relevant to their customers.

In this blog we will examine some of the challenges that businesses in certain industries can face and what they can do about it. GDPR restricts itself to personal data, thereby focusing its regulatory reach on companies and organizations that serve direct consumers of their services.


In this era where advertisements on social media, web pages and mobile applications are personalized by gathering and processing information about the specific user, how can companies that use these media to connect with their customers continue to send pertinent communications/messages to them?


In retail ecommerce, customers are shown recommended products using association rules and recommender systems. This is possible because the company keeps track of customers' past purchases (past buying behaviour) so as to recommend new products to the buyer.


After implementation of GDPR, the following can happen:


  • The buyer can refuse to let the ecommerce company control and process his/her data. This at once nullifies all the investment the company has made in processing this buyer's data, as it is now brick-walled from its customer.
  • On the flip side it gives a level playing field to other ecommerce companies as every buyer out in the market is anybody's customer. In short, customer loyalty will be short lived.

So how can organizations and companies insulate themselves from losing out their customers? The answer is simple and has stood the test of time - roll out the best service to each customer whether the customer is buying from them for the first time or the hundredth time. Companies will have to relentlessly satisfy customers in every transaction so that customers willingly share their data. Period.

According to Epsilon research, 80% of customers are more likely to do business with a company if it provides personalized service. With the possible destruction of customer data after completion of a transaction, as stipulated in GDPR:


  • It is difficult for companies to personalize their offerings to "customers".
  • Customer profitability KPIs like Lifetime Value (LTV) may not be meaningful anymore, as the same buyer is a new customer each time if the buyer chooses to annul his/her personal data after completion of every transaction.
  • Newer catch-phrases like Customer Journey Mapping fall off the grid as the "traveler" in the "journey" is temporary and companies may not even know the "traveler" i.e. the customer.

So how can companies personalize their services to customers? Prudent companies can anonymize customer data by encrypting it immediately after sourcing it. Though this will not let them decrypt the data to find the specific customer, the company still has some sort of handle on its customers.

Companies in financial services rely on accurate, up-to-date and complete customer data to discern genuine customers from fraudulent ones. To keep good customers separate from bad ones, companies will have to be innovative in "pseudonymizing" customer data.

So how does this work?

GDPR only regulates personal data, not transactional data. So financial service organizations will have to "pseudonymize" customer data using new technology mechanisms (which may or may not exist today) so that customer data can also be treated as transactional data. Such transactional data can then be used to train machine learning/deep learning algorithms to stop fraudulent customers from re-entering the financial services market.
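As a purely illustrative sketch (not a compliance recipe), a direct identifier could be replaced with a salted one-way hash so that transactions remain analysable per pseudonymous customer; the table, columns and salt handling below are hypothetical:

    -- Minimal sketch: pseudonymize the customer identifier with a salted one-way hash
    -- (Oracle STANDARD_HASH); table and column names are illustrative only.
    SELECT STANDARD_HASH('static-salt' || t.customer_email, 'SHA256') AS customer_pseudo_id,
           t.txn_id,
           t.txn_amount,
           t.txn_date
    FROM   transactions t;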

All data is stored on servers and server farms, in the cloud or in in-house data centres. As the financial cost of non-compliance with GDPR is very high (a ban on customer data processing and a fine of up to the higher of €20 million or 4% of the business's total annual worldwide turnover), the IT and ITeS industry may also not be immune to impacts. The following impacts may be noticed:


  • There may be instances where the controller of the data (the organization that defines the how and why of customer data processing) moves the data on-premise, thereby playing the role of processor as well. The processor is the one holding and processing the data on the controller's behalf, like AWS.
  • Small businesses may be tempted to move from cloud to on-premise to reduce the chances of data theft, or to rework their contracts with data processors to insure themselves.

With the widespread use of data science and machine learning in business, companies will have to be very diligent about deleting customer data from the training data used to build supervised algorithms if a customer asks for deletion of personal data that is part of that training data. If many customers follow suit, the model so built is itself rendered ineffective, as the training data has changed and patterns have to be learnt again. Companies will have to keep their learning algorithms and models updated regularly so that their outputs remain pertinent.

GDPR puts the onus of processing data on companies and organizations and awards private individuals complete rights over the way their data can be stored and processed. As individuals become custodians of their data, they may choose with whom, and for how long, they share it. Is it possible that in the future large groups of users form cartels and charge businesses for using their data?


June 22, 2018

Become smarter by improving Customer Experience with Blockchain solution

In this digital era, how often do we still see an executive from a bank visiting a customer's office or home to collect identity documents physically and verify the details as part of fulfilling a regulatory demand - Know Your Customer (KYC)?

The advent of technology has helped banks collect identity data in real time and verify the received data by connecting to a central data repository. In spite of this maturity, financial institutions are still losing operational efficiency and margin on KYC activities, as customer identity is not yet fully digitized. They continue to look for more effective solutions to achieve key customer experience goals such as reducing document-handling touch points (and thereby saving the customer's valuable time), bringing in transparency, protecting confidentiality and securing the sensitive personal identity documents supplied by the customer.

Blockchain technology, a distributed database providing speed and ease, offers a high-potential solution to simplify the identity management process in KYC use cases through digitization and to achieve greater customer experience. Blockchain architecture uses the self-sovereign identity concept, a model where one's own identity is maintained by oneself, providing transparency into the entities accessing the owned identity data. Using this model, verification of a customer identity is performed in real time, which brings substantial cost reduction by eliminating disparate systems and technologies in the enterprise architecture, and secures data transaction paths by bringing together disjointed data pockets from different sources into a unified, trusted, immutable digital view, encrypted using strong cryptographic techniques. This architectural model effectively addresses the KYC use case that every financial institution in today's digital world executes.

Fig. Simple representation of Identity management process model built on Blockchain architecture 

A typical process model adopted by identity management platforms leveraging Blockchain architecture, where retrieval and provision of customer identity data are handled securely with bi-directional authentication, is sketched above. To illustrate briefly, the customer registers his/her identity information (passport, PAN, Aadhaar, etc.) digitally in a Blockchain-based KYC platform and submits the details to the bank while opening a new account or seeking a new service. The bank, taking the submitted inputs, seeks the necessary identity documents digitally from the KYC utility/platform to which the customer is subscribed. Upon receiving the request from the bank, the KYC platform acknowledges it and obtains the customer's consent through an OTP/email verification link before sharing the digital identity view. The solution establishes transparency for the customer on the identity data exchanged, protects sensitive identity data from intruders/hackers to ensure a high level of data integrity, and prevents misuse of identity data in any manner. More interestingly, this architecture also provides the flexibility to re-use the digital identity profile more than once across different banks, institutions or any other organizations a customer may interact with.

In spite of the promising capabilities discussed above, being a new technology wave, the Blockchain solution still has ground to cover: establishing ownership definitions for the identity information handled, and becoming a standardized, financially viable solution for KYC use cases across geographies, including under-banked regions where banking processes are not yet completely established and digitized. However, this technology promises a lot, possessing the capability to completely digitize and automate KYC use cases. An exciting development in the current digital economy indeed.

June 20, 2018

The marriage of Agile and Waterfall in a Telco World

 

We often hear of organizations undergoing the aches and pains associated with the transition from a traditional waterfall model to a fast-paced agile model. During the course of this transformation, the waterfall model is looked down upon as slow or rigid, hindering responsiveness to business needs. However, that does not mean agile is the most appropriate approach in all cases. What probably works is something that lies within the spectrum.

