Infosys and Salesforce accelerate enterprises in their journey to become cloud-based, customer-centric organizations. We deliver engaging customer experiences, drive smarter business decisions, and co-create new business opportunities.


August 29, 2019

Data Migration (into Salesforce)

Data migration projects are generally effort-intensive and demand meticulous planning and control. There need to be SMEs from both the legacy CRM and the Salesforce side who understand not only the functionality but also have a thorough understanding of the data model of each system.

Where and how to start?

1. Align the data migration timeline with the overall E2E project plan.
2. Understand when the existing legacy CRM licenses are going to expire.
3. Discuss with the stakeholders when they want the migrated Salesforce data to be available for user testing.
4. Discuss and decide the data volumes, the dates when the legacy CRM can be frozen, the volume of attachments to be loaded, etc.


Ideally, starting with a data migration plan and strategy is recommended.

The strategy should go into a good amount of detail, covering: the exact scope of data migration, volumes, dates of important milestones, tools being used, out-of-scope items, the migration approach, the environments that will be used for migration, whether it is a big-bang migration or not, how many rounds of testing will be performed, the RACI matrix, assumptions, dependencies, the testing approach, etc.


From Salesforce perspective, the proven and stable migration approach would be as depicted below:


Legacy CRM   -->    Staging Area (with two data models)    --> Salesforce


The staging area typically has two data models created: one which reflects the entities and relationships of the legacy CRM, and another which reflects the entities and relationships of Salesforce.


Use the staging area to apply cleansing, de-duplication, enrichment, transformation, and other business and archival rules, and load the records into the staging area's SFDC data model. This way, the final CSV file generated from the staging area (SFDC data model) will be ready for the Salesforce team to upload using any data loader tool.
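As a minimal sketch of that staging-area step (the legacy column names, the Salesforce field mapping, and the cleansing rules here are all hypothetical assumptions, not from any particular project), the transformation from the legacy data model to the final loader-ready CSV might look like this:

```python
import csv
import io

# Hypothetical mapping from legacy CRM columns to Salesforce Account fields;
# Legacy_External_Id__c is an assumed custom External ID field.
FIELD_MAP = {
    "CUST_NAME": "Name",
    "CUST_PHONE": "Phone",
    "CUST_EXT_ID": "Legacy_External_Id__c",
}

def transform(legacy_rows):
    """Apply simple cleansing and de-duplication, then emit rows in the SFDC data model."""
    seen = set()
    out = []
    for row in legacy_rows:
        ext_id = row["CUST_EXT_ID"].strip()
        if not ext_id or ext_id in seen:   # de-duplicate on the legacy external ID
            continue
        seen.add(ext_id)
        out.append({sf: row[legacy].strip() for legacy, sf in FIELD_MAP.items()})
    return out

def to_csv(rows):
    """Generate the final CSV ready for any data loader tool."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(FIELD_MAP.values()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

In a real migration the mapping and rules would come from the data migration strategy document, and the staging area would usually be a database rather than in-memory rows.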

On the legacy CRM side, one can use SQL*Loader or another tool based on the type of database (Oracle, Microsoft, IBM).
On the Salesforce side, if the data volume is low, one can use the Data Import Wizard; otherwise the Apex Data Loader, Jitterbit, dataloader.io, the Informatica data wizard, and many more.

Kindly note that for data transformation or extraction, you might need to leverage the Infosys iDSS data migration framework or a commercial ETL tool (Talend, Informatica, MuleSoft, etc.).

Below is an SFDC-suggested order for migrating the entities (an indicative, not comprehensive, list):

0. Users
1. Accounts
2. Campaigns
3. Contacts
4. Opportunities
5. Cases
6. Solutions
7. Price books
8. Products
9. Leads
10. Contracts


However, the order of the above entities might change from one project to another depending on the existing relationships in the legacy CRM, but overall the order is as given above.
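The reason an order exists at all is that parent objects must be loaded before the objects that look up to them. As an illustrative sketch (the dependency map below is simplified and assumed; a real org will have many more lookups), one can verify that a planned load order respects such dependencies:

```python
# Simplified, assumed parent dependencies between migrated entities.
DEPENDS_ON = {
    "Accounts": ["Users"],           # record owners must exist first
    "Contacts": ["Accounts"],
    "Opportunities": ["Accounts"],
    "Cases": ["Accounts", "Contacts"],
}

def order_is_valid(load_order):
    """Return True if every entity is loaded after all of its parents."""
    position = {name: i for i, name in enumerate(load_order)}
    return all(
        position[parent] < position[entity]
        for entity, parents in DEPENDS_ON.items()
        for parent in parents
        if entity in position and parent in position
    )
```

A check like this is cheap to run against the migration plan before any load begins, and catches ordering mistakes that would otherwise surface as lookup-resolution failures mid-load.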

Some more tips:

§  As you move additional data into your Salesforce instance, keep a watch on the space you are consuming. If you are reaching your permitted space limits, it is time to call your Salesforce representative to purchase additional space.

§  It is important to plan out the data migration. Users should be informed well in advance about the cut-off date and about possible issues that may arise. It is also a good idea to have a few pilot users available to test over the weekend in which a major data migration is planned.

§  It is also important to do sanity testing directly in Salesforce. Log in as different types of users and view a few records of different objects. Compare these same records manually with the original system. Although many tools are available, there is no replacement for manual verification when testing data migration.

§  From a developer perspective, it is important to plan sufficient time for testing the results of data migration before you roll out the instance to users. Just because the data loader has executed without any errors does not mean that the migration is complete. Use the Developer Console to perform basic queries such as: the total number of accounts of a certain record type, the number of accounts without any contacts, the number of accounts owned by user XYZ, etc.
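One simple way to structure those sanity checks (the SOQL strings, field names, and counts below are purely hypothetical examples) is to pair each Developer Console query with the count obtained from the legacy system and flag every mismatch:

```python
# Each entry pairs a sanity-check SOQL query (to be run in the Developer
# Console or via the API) with the expected count taken from the legacy CRM.
# All names and numbers here are illustrative assumptions.
CHECKS = {
    "SELECT COUNT() FROM Account WHERE RecordType.Name = 'Customer'": 1200,
    "SELECT COUNT() FROM Account WHERE Id NOT IN (SELECT AccountId FROM Contact)": 35,
    "SELECT COUNT() FROM Account WHERE Owner.Alias = 'XYZ'": 80,
}

def reconcile(actual_counts):
    """Compare the counts observed in Salesforce against the legacy counts.

    Returns a dict of {query: (expected, actual)} for every mismatch;
    an empty dict means the migration passed these checks.
    """
    return {
        query: (expected, actual_counts.get(query))
        for query, expected in CHECKS.items()
        if actual_counts.get(query) != expected
    }
```

Keeping the checks in one table like this makes each migration dry run repeatable, rather than relying on ad hoc queries typed from memory.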

§  You need to evaluate whether the workflows and triggers of the impacted objects should be disabled before uploading data. Often there are active workflows that send emails to customers when a certain stage is reached.

§  It is easier to remove duplicates from a spreadsheet than to merge them in Salesforce.
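As a sketch of that spreadsheet-side de-duplication (the matching key and column names are assumptions; real matching rules are usually project-specific), duplicates can be dropped on a normalized key before the file ever reaches Salesforce:

```python
def dedupe(rows, key_fields=("Name", "Email")):
    """Keep only the first occurrence of each record, matching on a
    normalized (trimmed, lowercased) key built from the given fields."""
    seen = set()
    unique = []
    for row in rows:
        key = tuple(row.get(f, "").strip().lower() for f in key_fields)
        if key in seen:
            continue
        seen.add(key)
        unique.append(row)
    return unique
```

The same normalization (trim, lowercase) that builds the key is what catches near-duplicates such as " ACME " versus "Acme".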

§  Use the Excel Connector or Apex Data Loader together with VLOOKUP to compare new records against existing records before importing.

§  Lookup relationships: review the lookup field's setting, e.g. changing it to "Clear the value of this field".

§  Master-detail relationships: sort the load file by the Salesforce ID or External ID of the parent, in ascending or descending order, to prevent record locking.

§  Map the External ID for future traceability.

§  For reports and dashboards to work on Salesforce just the way they did on the legacy CRM, it is a good idea to enable the audit fields in Salesforce, so that the CREATED BY and CREATED DATE system fields can be set via the data loader during migration. This way, the Created By user on the Salesforce records will be exactly the same as in your legacy CRM, which helps your OWD and sharing rules work seamlessly even after migration.

So dear readers, please comment on additional steps or precautions you have taken or seen implemented in real projects. Thanks for your kind knowledge sharing!

August 26, 2019

Oops!! What's wrong with my Salesforce sharing rules...

While preparing for my Administrator exam for the first time, I watched the video "Who Sees What: Overview" on Salesforce (https://www.youtube.com/watch?v=jDYfTfaqclk)... I was very impressed! Later, while working hands-on, I realized things can get really messy and could easily go out of hand if we don't keep a strong strategy and control.

Too many roles, too many sharing rules, complex account teams, too many permission sets: all of these can lead to a loss of control over data visibility in your org. It is advisable to administer sharing with a strong change management process in place.

Knowing Who Sees What in Salesforce is fine, but when we come across a situation where we need to figure out why a particular entity, record, or field is visible or editable by some other user, how should we approach such a problem?!

Recently in one of our engagements, post data migration, we had exactly this issue to solve. The client said that a particular USER was not supposed to see other account information, whereas when we simulated a login on behalf of the USER, we could see many more accounts than he/she was actually supposed to be seeing!

Now, is there a logical way to solve or get to the root cause of this kind of sharing issue? Let's check it out. I am considering an ACCOUNT record visibility issue for illustration here:


PROFILE / PERMISSION SETS
The profile / permission set controls the core CRUD permissions for objects as well as fields, so let's say this part is a bit easy. Hence, let's focus more on record visibility, the ability to read or edit the records of some other user, which is where a great many permutations and combinations are possible!

OWNERSHIP:
Is the USER in question actually the current owner of the Account record? Check for this first.

DEBUG LOGS:
OK, shall I enable debug logs and trace the log for any clue? Nope, this won't help. In Salesforce, the sharing reason is not revealed via debug logs...

OWD:
Hey!! What about the OWD? Is it already Public Read Only / Public Read/Write? If yes, then you already have your answer... else, if the OWD is Private, keep checking further...

SHARING RULES (Public group, Role, Role/Subordinate, Queue):
Is there any sharing rule on Account? Is the sharing rule based on a public group, or on a role and its subordinates? If yes, then quickly see whether the USER in question is part of the public group, whether the sharing rule shares directly with a particular ROLE, whether access is rolling up the hierarchy, or whether the sharing rule targets a ROLE and its subordinates. If no sharing rule is found, then move on...

SHARING RULE OVERRIDE:
See if there is any sharing rule override, i.e. a PROFILE / PERMISSION SET with the VIEW ALL / MODIFY ALL permissions. Also confirm that no such permission set is assigned to the USER.

ACCOUNT TEAM:
An account team is a feature that gives users different levels of access on a particular account so they can work as a team. Check whether the USER in question is part of the account team, whether any account team member is beneath this USER in the ROLE hierarchy, or whether the USER is the manager of an account team member.

ACCOUNT TEAM MEMBER / ACCOUNT SHARE:
You may use Workbench to query the AccountTeamMember entity along with the AccountShare entity to see which records are shared with this USER and at what level (READ / EDIT).
Did you get to the root cause yet? If not, keep going...

ACCOUNT SHARE (Implicit, Manual, Owner, Team):
When you query AccountShare, some sharing reasons will be IMPLICIT, which means the USER owns one of the child records (i.e. if the USER is the owner of a Contact, then this USER implicitly has READ access on the Account to which that Contact is linked). The same applies between Opportunities and Accounts.
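To make that AccountShare investigation systematic, one can take the rows returned from Workbench and group them by RowCause. The RowCause values below are standard Salesforce ones for AccountShare; the query string, user ID, and sample rows are illustrative assumptions:

```python
# Illustrative query to run in Workbench (replace <user-id> with the real Id):
QUERY = (
    "SELECT AccountId, UserOrGroupId, AccountAccessLevel, RowCause "
    "FROM AccountShare WHERE UserOrGroupId = '<user-id>'"
)

# Standard AccountShare RowCause values and what each one means.
ROW_CAUSES = {
    "Owner": "User owns the account",
    "Manual": "Shared manually or via Apex managed sharing",
    "Rule": "Granted by a sharing rule",
    "Team": "User is on the account team",
    "ImplicitChild": "User has access via a child record (e.g. a Contact or Opportunity)",
}

def explain(share_rows):
    """Summarize why each account is visible, from queried AccountShare rows."""
    return {
        row["AccountId"]: (ROW_CAUSES.get(row["RowCause"], "Unknown reason"),
                           row["AccountAccessLevel"])
        for row in share_rows
    }
```

Walking the RowCause column this way usually pinpoints the sharing reason faster than checking each configuration area one by one.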

APEX SHARING:
Code-based sharing is also possible: Apex managed sharing can insert share records programmatically, so check whether any class or trigger logic is granting access.

SHARING SETS:
If you are dealing with Community users, it is better to check the sharing sets to see whether any records are being shared between external and internal users.

OWD RECENT CHANGES:
See if someone changed the OWD recently. This triggers a recalculation of sharing, which at times can take a while, and it might have led to the USER's ability to see the record.

SFDC SUPPORT:
Are you tired and still not able to figure out the root cause? Raise a case with SFDC support and hope for some help!


Dear readers, if you find any of the points above technically incorrect, please shout back and I shall revisit them. Also, if you have some additional investigative tips, kindly share them through your valuable comments. Knowledge is worth sharing...

August 25, 2019

Setting up a Watchtower to Monitor, Measure and Manage your Vlocity Applications...

Why do you need a Watchtower for your Vlocity applications?

As businesses scale up, our customers need to adapt quickly and have effective tools to measure, control, and manage their application performance. For many of the major industries like telecom, healthcare, or finance, every second matters; for instance, increasing the throughput of healthcare claim processing directly improves the revenue of the organization. On the other hand, a frustrating customer experience due to slow order fulfillment of a telecom product, such as a mobile connection, will result in poor adoption. The business process should ensure that application performance is maintained to accommodate technology changes, the growth of the customer base, and regulatory changes, all without disrupting the customer experience.

 

What are the key factors which impact the performance of your Vlocity applications?

There are several factors that impact the performance of a Vlocity application built as a managed package on a Salesforce org.

 

  1. Solution complexity - The basic solution depth and business complexity determine the level of customization on Vlocity and the underlying Salesforce org. Vlocity, being a managed package, depends on the other managed packages installed on the same org and shares the org's resources and governor limits
  2. Data management - The volume of data and the complexity of the data structure have a major impact on performance
  3. Backend integration - Callouts to external applications always contribute to interface delays. The real-time web services and batch feeds from backend applications need to be effective without adding any delay to the business process
  4. Network latency - Complex applications built on managed packages are always impacted by network latency arising from the location of the data centers
  5. Configuration settings - CPQ configuration settings also play an important role, hence care should be taken to ensure they do not result in sub-optimal performance of the application

 

How do we set up a Watchtower to monitor your Vlocity?

With multiple factors at play impacting your Vlocity performance, the process framework to set this Watchtower would constitute the following:

 

  1. Record and React for a better customer experience - What is the current performance experienced by the user in day-to-day operations? The focus of CX monitoring should be collecting stats on the response times of the application, covering browser performance factors, external system calls, and the level of optimization of DataRaptors.
    • The watchtower should also enable the Vlocity tracking entries and analyze them, which helps in proactively managing the key OmniScripts
    • Review DataRaptor performance to ensure that only the data required for an operation is queried
    • The easiest way to monitor would be periodic static code analysis, which provides a gate on the quality of customization added to the org
    • The application performance can also be monitored by analyzing the response time of Vlocity components and the application logs, and by tracking trust.salesforce.com for any inherent application server performance issues

 

  2. Track and Target the interface layer - An application built on Vlocity would rarely be stand-alone. For instance, an order management platform on Vlocity would be tightly coupled with the physical provisioning or billing applications. It is very important to focus on the interface calls made from and to the application, using event monitoring, debug logs, and logs from the middleware.

 

  3. View and Visualize the complete business process - Apart from CX and the interfaces, a key focus of the watchtower is also to view and help the teams visualize the end-to-end business process. The idea is to leverage an open-source stack like ELK to track each order from creation to provisioning. This helps in building traceability of the process.
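As a sketch of what such end-to-end traceability could look like (the field names and stage values here are assumptions for illustration, not a Vlocity API), each order can emit one structured log event per stage, keyed by an order ID so that ELK can reconstruct the full journey:

```python
import datetime
import json

def order_event(order_id, stage, status, detail=""):
    """Build one structured, JSON-encoded log line per order stage,
    ready to ship to an ELK stack for end-to-end order traceability."""
    return json.dumps({
        "order_id": order_id,   # correlation key across all stages of the order
        "stage": stage,         # e.g. "created", "validated", "provisioned"
        "status": status,       # e.g. "ok", "error"
        "detail": detail,
        "timestamp": datetime.datetime.utcnow().isoformat() + "Z",
    })
```

With a shared `order_id` across Salesforce, middleware, and provisioning logs, a single Kibana query can then show where in the journey a given order stalled.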

[Figure: Vlocity Watchtower]

As customers face ever-changing business requirements, the need is to adapt quickly and innovate on the Vlocity / Salesforce ecosystem. The idea is to equip them with this watchtower with the right tools and processes to monitor, measure and manage the Vlocity applications for optimal Customer experience, effective interface calls, and uninterrupted business processes.

August 9, 2019

Delete Files Generic Lightning Component for All sObjects

When to use?

Scenario 1: There are two users, a Reviewer and a Sales Rep. The Sales Rep created a contract and uploaded a supporting file. While reviewing the contract, the Reviewer finds that the file uploaded by the Sales Rep is not a valid file. He/she wants to delete the file but cannot, because he/she is neither an admin user nor the owner of the file, due to a Lightning limitation.

Scenario 2: There is a single file called City Map which was linked to both an Account and a Contract. The account's city changed, so a new city map was uploaded and linked to both records. The old city map should now be deleted by the end user to avoid confusion, but he/she is not the owner of the file and cannot delete it. He/she first unlinks the file from the Account record and then from the Contract record, but the unused old city map file remains in the system, consuming file storage. In other words, when the file is unlinked for the last time, the customer wants it removed from the system to save file storage space.

Where to use?

On any sObject record detail page, such as Account, Contact, or a custom object page, as a Lightning Action button, wherever file-delete functionality by a non-admin user is required.

How to use?

1. Create the Lightning component in your Developer Console as per the attached source code.


2. Create the Lightning action for the objects you want to use it with.


3. Add it to the page layout of the objects you want to use it with.


4. Demo Output Screen

Clone the source code of the DeleteFiles Lightning component. You can extend or customize the code as per your requirements.
