This blog is for Mainframe Modernization professionals to discuss and share perspectives, points of view, and best practices around key trending topics. Discover the ART and science of mainframe modernization.


April 19, 2017

Better Managing Your Modernization Process

Much has been debated about legacy modernization, yet the topic stays relevant as enterprises are increasingly challenged by a new, digital, and highly competitive business environment. The impetus for modernization is no longer borne from a need to reduce costs or upgrade to new technologies, but from a more compelling need to prepare for changing business requirements and future readiness.


Today, technology has become a core aspect of every enterprise, from IT companies to retailers, and is directly shaping business strategy. In preparation for this paradigm shift, organizations can either build new technological capabilities in-house by hiring the required skills, or partner with an organization that has the expertise to guide them through the process. Both methods have their pros and cons, though with shrinking time-to-market cycles, the latter may seem preferable. Interestingly, the Infosys DevOps for Mainframe team leveraged tools from IBM and Micro Focus to create a PoC for a global bank. The tools enabled the bank to save approximately 15% of build effort and reduce storage by approximately 25%.

Legacy modernization to improve user experience of internal stakeholders

With technology changing at a continuous pace, enterprises need to prepare for stakeholders that are not only external, by way of customers, but also internal, by way of employees.

With internal stakeholders undoubtedly exposed to smarter technology and smoother, user-friendly UX through apps, legacy technology can be a source of strain, as it involves working with manual, time-consuming systems. Organizations thus find themselves at an inflection point when it comes to delivering uniform, high-quality services and experiences across stakeholders. A case in point is how the mainframe modernization team at Infosys helped a large Australian bank decommission complex legacy systems and reclaim approximately two million AUD.

Technology can make IT a revenue center

As technology begins to play a more pervasive and decisive role in an organization, technologists and CIOs are going to have an increasing voice in determining the direction of the organization, as their opinions will have a direct impact on revenues. IT will no longer be a cost center but a crucial value-add, and will become the tool that enables organizations to work faster and smarter. The challenge for IT, of course, will be how to collaborate with business and develop roadmaps that enable seamless, non-disruptive modernization of legacy systems. This can be a significant challenge, as these vast legacy systems have been developed over many years in a fragmented manner, and most organizations do not have a comprehensive understanding of their IT landscape. Fear of this unknown IT landscape often prevents many organizations from beginning the modernization journey.

Steps towards effective legacy modernization

Thus, when an organization decides to begin modernizing its IT infrastructure, the first thing it should do is map its IT landscape. Organizations need to create a roadmap for what needs to be modernized and set realistic timeframes. They also need to be cognizant of the fact that the pace of modernization is often determined by external forces such as competition and the disruption the industry is going through. For example, at a large fashion retailer the landscape is being redefined completely by looking at what functionality will reside where.

A second aspect of legacy modernization is the need for an organization to consciously undergo a mindset and culture change. CIOs need to be closely invested in bringing about this change and there needs to be a buy-in from business leaders to make this top-down process a success. Processes in organizations should support this change - from recruitment to training to performance management.

A third aspect of modernization is risk mitigation. Since modernization is a journey that can take two or three years, organizations need to engage in this process without impacting their customers or internal stakeholders. Additionally, organizations need to be willing to invest and allocate an annual budget towards optimization of critical applications where possible, and legacy modernization where necessary. Budgets are also required to acquire the new skill sets that will make this possible.

The fourth aspect is how to leverage technology advantageously. Cloud and digital are the key destinations of change for all industries. Organizations need to identify applications that can move to the cloud, and the extent to which they can leverage mobile and open source. An organization's business and technical landscape is a key determinant here, as is the level of disruption facing its particular industry.

It could start with rehosting applications on the cloud to accelerate performance. Legacy applications can then be renewed through the effective use of DevOps, creation of APIs, and extreme deconstruction, and thus transition to new user experiences.

A leading US-based bank improved its business turnover by exposing legacy assets as SOAP (Simple Object Access Protocol) services for multiplatform integration. This enabled cross-selling as a direct business benefit.

Migrating mainframe batch jobs to Hadoop helped a European CPG company reduce the time it took to reconcile financial statements by 40%.

In summary, mainframe modernization is real and now. How much and at what pace is specific to a client. We have the expertise and technology available today to rapidly move to more open business models and technology architecture.

April 5, 2017

Legacy Decommissioning (2/2) - Data Archival and Retrieval

In continuation of the earlier blog on decommissioning legacy systems, where I explained the overall decommissioning process, this blog addresses the data archival and retrieval approach in the design and execution phases of legacy decommissioning.

As discussed in Blog 1, many organizations feel forced to keep aging legacy applications running well beyond their useful life because they contain critical historical data that must stay accessible. This information may be needed for customer service or other operational reasons, or to comply with industry regulations. Yet keeping obsolete systems alive just to view the data puts a real strain on resources. These applications steadily consume IT budget in areas such as maintenance charges, staffing, and data center costs; in many cases this amounts to over 50% of the overall IT budget.

Data archival during application decommissioning is the most cost-effective and simplest solution for keeping legacy data accessible for business continuity. Archiving the complete legacy data at once is one of the best practices suggested during application decommissioning. This archived data can then be accessed swiftly online, can provide different views of the data for analytics, and can be exported into different formats when needed. During data archiving, data can be extracted from any legacy system and stored in a secure, centralized online archive. The data is easily accessible to end users, either from screens that mimic the original application or in a new format chosen by the business. The new infrastructure built for legacy data archival should be capable of moving all legacy data components, whether structured, unstructured, semi-structured, or even raw metadata, into a comprehensive, online, cloud-based, self-managed, centralized repository. Infosys has a proven methodology for archiving the data in this central repository.

Application decommissioning and data archival requirements are unique to every organization. Infosys, with its proven framework and tools, can help customers with the following:

1. Building a strong understanding of the current application data model
2. Building data retention policies and retention requirements through strong domain knowledge
3. Building an innovative archival data model

The above approach is largely adopted for mainframe legacy applications that have accumulated large volumes of data over many years in the form of documents and images. Typical examples are:

1. Billing records
2. Financial transactions
3. Customer history, etc.

 

Data Archival

The generic framework for a Data Archival process consists of the following steps; a minimal code sketch of these steps follows the list.

* Data Extraction - Collect and extract the data from the source database into an interim storage area, i.e. a staging area, and perform data transformation where required to map to the data format of the target state, e.g. EBCDIC to ASCII, alphanumeric to numeric, date format changes, etc.

* Validation and Cleansing - Validate the schema of the target database, confirming that tables with all their constraints, indexes, and views, and users with their roles and privileges, are migrated as defined in the business rules. Validate the contents of the migrated data to confirm that referential integrity is maintained in the target definition. If required, data cleansing is also performed to generate golden records, remove duplicate records, and clean up special characters, spaces, and emails.

* Transformation - Transform the data from the source to the target as defined in the data mapping rules and lookup tables.

* Data Migration - Load the data into the target database using the data loader utilities, scripts, and programs generated for loading incremental data and multilingual data, and for recovering failed loads.
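
To make these four steps concrete, here is a minimal sketch in Python under stated assumptions: fixed-width EBCDIC billing records (code page cp037), an illustrative field layout, and SQLite standing in for the target archive database. The names and layout are placeholders for the pattern, not actual Infosys tooling.

# Minimal sketch of the four archival steps, assuming fixed-width EBCDIC
# source records and a relational target; field names, widths, and the
# SQLite target are illustrative assumptions.
import sqlite3

CODE_PAGE = "cp037"  # one common EBCDIC code page; the real one depends on the host

def extract(record_bytes):
    """Data Extraction: decode EBCDIC to Unicode and split fixed-width fields."""
    text = record_bytes.decode(CODE_PAGE)
    return {
        "account_id": text[0:8].strip(),
        "bill_date":  text[8:16].strip(),   # YYYYMMDD on the source
        "amount":     text[16:26].strip(),
    }

def validate(rows, known_accounts):
    """Validation and Cleansing: referential integrity check and de-duplication."""
    seen, clean = set(), []
    for row in rows:
        if row["account_id"] not in known_accounts:
            continue                         # orphan record fails the referential check
        key = (row["account_id"], row["bill_date"])
        if key in seen:
            continue                         # duplicate record removed
        seen.add(key)
        clean.append(row)
    return clean

def transform(row):
    """Transformation: apply mapping rules, e.g. date reformat and numeric cast."""
    d = row["bill_date"]
    return (row["account_id"], f"{d[0:4]}-{d[4:6]}-{d[6:8]}", float(row["amount"]))

def load(rows, db_path="archive.db"):
    """Data Migration: bulk-load the transformed rows into the target archive."""
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS billing_archive "
                     "(account_id TEXT, bill_date TEXT, amount REAL)")
        conn.executemany("INSERT INTO billing_archive VALUES (?, ?, ?)", rows)

# Example: one 26-byte EBCDIC billing record flowing through the pipeline
raw = "ACCT0001202012150000123.45".encode(CODE_PAGE)
staged = [extract(raw)]
valid = validate(staged, known_accounts={"ACCT0001"})
load([transform(r) for r in valid])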

Designing the data archival solution for extracting and migrating the identified data sources is usually done collaboratively with the customer's DBAs and SMEs. A complete analysis of the application is done to identify the following:

* Understand the current data model in the legacy application

* Identify unknown data relationships in the current model

* Create retention policies for the data identified for archival

* Extract data, with the required transformations, in the application context

* Identify the key fields for indexing so the required data can be searched efficiently

* Validate the target database schema and the data contents of the archived data

* Create interfaces to access archived data and reports independently of the application

* Apply application-, entity-, and record-level retention policies based on organizational requirements

Based on the above analysis, the target data model is generated; each target entity is identified as a business object and holds the data for that object. The business object is defined keeping regulatory and business needs in consideration.
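
As an illustration of what such a business object might look like, here is a small, hedged Python sketch; the entity name, key fields, and seven-year retention period are hypothetical choices for the example, not regulatory guidance.

# Illustrative business object carrying its own retention metadata; the entity
# names, retention periods, and policy levels are assumptions for this sketch.
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class RetentionPolicy:
    level: str            # "application", "entity", or "record"
    retain_for_days: int  # driven by regulatory and business needs

    def is_expired(self, archived_on: date, today: date) -> bool:
        return today > archived_on + timedelta(days=self.retain_for_days)

@dataclass
class BusinessObject:
    name: str             # e.g. "BillingRecord", "FinancialTransaction"
    key_fields: list      # fields indexed for efficient search
    policy: RetentionPolicy
    archived_on: date = field(default_factory=date.today)
    payload: dict = field(default_factory=dict)

# Example: a billing record archived under a hypothetical 7-year entity-level policy
billing = BusinessObject(
    name="BillingRecord",
    key_fields=["account_id", "bill_date"],
    policy=RetentionPolicy(level="entity", retain_for_days=7 * 365),
    payload={"account_id": "ACCT0001", "bill_date": "2020-12-15", "amount": 123.45},
)
print(billing.policy.is_expired(billing.archived_on, date.today()))  # False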

Based on the organization's requirements and the details collected during impact analysis, Infosys will suggest the best approach to archive the data by evaluating multiple factors, such as the quantity of data to be archived and the actual need for data archival. The two options are:

1. Complete data archival at once
2. Partial data archival over multiple releases

Data Retrieval

Data retrieval ensures the archived data is accessible at any time for audits, analytics, and normal business operations. The data is also secured and can be accessed as per user roles and privileges. The two most common strategies are hot retrieval and cold retrieval. In hot retrieval, data is accessed immediately based on a few keywords. In cold retrieval, data is accessed via reports and service requests when a specific view of the data is needed. A small sketch contrasting the two follows.
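
Here is a compact, hedged sketch of the two strategies; the in-memory keyword index, record layout, and CSV report format are illustrative stand-ins for the archive's real search index and reporting layer.

# Hot vs. cold retrieval, sketched with placeholder data and structures.
import csv, io

ARCHIVE = [
    {"account_id": "ACCT0001", "bill_date": "2020-12-15", "amount": 123.45},
    {"account_id": "ACCT0002", "bill_date": "2021-01-03", "amount": 80.00},
]

# Hot retrieval: keyword lookup against indexed key fields, answered immediately.
INDEX = {}
for row in ARCHIVE:
    INDEX.setdefault(row["account_id"], []).append(row)

def hot_retrieve(keyword: str):
    return INDEX.get(keyword, [])

# Cold retrieval: a service request that renders a specific view as a report.
def cold_retrieve(year: str) -> str:
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["account_id", "bill_date", "amount"])
    writer.writeheader()
    for row in ARCHIVE:
        if row["bill_date"].startswith(year):
            writer.writerow(row)
    return out.getvalue()

print(hot_retrieve("ACCT0001"))   # immediate, keyword-based access
print(cold_retrieve("2021"))      # report-style view generated on request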

The following are a few ways through which archived data can be retrieved:

      • Data retrieval through keyword search and business objects to give full application context
      • Custom-generated reports using data integration capabilities
      • Data retrieval using standard interfaces to the database, such as ODBC/JDBC and SQL (see the sketch after this list)
      • Enterprise reporting tools such as Crystal Reports
      • Data archival and retrieval using third-party products such as IBM Optim, Informatica, etc.
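
As an example of the ODBC/SQL route, here is a hedged Python sketch using the pyodbc driver bridge; the DSN, credentials, table, and column names are placeholders, and it assumes the archive exposes an ODBC data source.

# Retrieving archived data over a standard ODBC interface; connection details
# and schema below are illustrative placeholders, not a real configuration.
import pyodbc  # third-party driver bridge: pip install pyodbc

def fetch_billing_history(account_id: str):
    conn = pyodbc.connect("DSN=LegacyArchive;UID=report_user;PWD=secret")
    try:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT account_id, bill_date, amount "
            "FROM billing_archive WHERE account_id = ?",
            account_id,
        )
        return cursor.fetchall()
    finally:
        conn.close()

# Example use from a reporting script or customer-service screen
for row in fetch_billing_history("ACCT0001"):
    print(row.account_id, row.bill_date, row.amount)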
