This blog is for Mainframe Modernization professionals to discuss and share perspectives, points of view, and best practices around key trending topics. Discover the ART and science of mainframe modernization.

April 26, 2018

Let's Begin your Legacy to Digital Transformation Journey [2/3]

Continuing from my previous blog on the role of Legacy Modernization in making IT systems intelligent, today I will share my thoughts on how organizations can adopt Digital Solutions across the enterprise. Legacy Modernization is the stepping stone for any transformation journey. Let's have a look at some of the opportunities.


Legacy to Digital

Legacy systems present in an organization limit its ability to create an agile environment and enhance process efficiency. Modernizing them could mean cloud adoption, enabling DevOps, improved user interfaces, etc., which not only help build a competitive advantage through improved time to market and customer experience but also lead to Digital enablement. Let's look at some of the questions this raises for tech leaders:

Rapid globalization and increasing technical debt are important factors driving technology leaders to adopt cloud solutions across the enterprise, helping create flexible and agile infrastructure for easy deployment of applications. But one of the biggest obstacles is the fear of risk associated with migration, and uncertainty about return on investment.

Can the existing business logic be preserved after cloud migration?

Rehosting helps migrate applications to the cloud as-is, without changing the existing business logic or code, using emulation software, which drastically reduces cost and provides the same experience to end users. If an organization has a vision for large, enterprise-wide transformation but faces a hardware lease expiry, a temporary fix is to migrate to the cloud, which helps customers avoid expenditure on scaling up new hardware for growing business needs.

Does cloud adoption only help in cost reduction?

No. To reap the benefit of business agility in the cloud ecosystem, applications need to be re-engineered as cloud native. Instead of running as monolithic code, the application is scaled across multiple nodes, making it fault tolerant and improving system elasticity, with the option to scale individual nodes up and down. Through microservices, the monolithic code can be broken down and services can be deployed independently.

Does modernization benefit the employees of the organization as well as improve customer experience?

The green screens associated with legacy systems act as a hindrance to enhancing the customer experience, as these screens cannot be exposed directly as APIs to the external world. Legacy green screens lack the features needed for mobility and also incur significant employee training costs for the organization. These green screens can be modernized with technologies such as AngularJS, ReactJS, etc. This helps organizations replace the command-based front end with a dynamic Graphical User Interface (GUI), improving employee productivity, and also decouples the business layer to enable flexibility in adopting newer technologies.

Process automation

Traditional legacy system teams have monthly or quarterly releases. But in the current dynamic business environment, speed to market plays a vital role in gaining market share. DevOps automates end-to-end activities throughout the project life cycle, which can lead to faster rollout of products and services, and enables continuous delivery and frequent customer feedback to improve the quality of service.


I do believe Legacy migration has almost become mandatory for organizations wanting to adopt digital solutions across the enterprise. In my next blog, I will share my thoughts on the role of Legacy Modernization in "Mesh" Strategies. Please post your suggestions and feedback, I would be very glad to answer all your queries.


February 21, 2018

Why is Legacy Modernization a key Influencer for your Tech. Strategy this year?

Gartner has named the Intelligent Digital Mesh as the strategic trend of 2018. In a technology-driven business model, a higher level of automation using Artificial Intelligence, personalized end-user experience and digitization of business enable an organization to become a digital enterprise. However, existing legacy systems like the Mainframe and AS400 hold organizations back from adopting newer technologies, almost making Legacy Modernization indispensable for large-scale transformation. In this blog, I'm penning down my views on why and how Legacy Modernization is required to make your systems 'Intelligent'.



Intelligent Systems

Organizations seek multiple ways of eliminating redundant, paper-based work to improve application efficiency and create a competitive edge, which comes from automation in the form of Artificial Intelligence, Machine Learning, Rules Engines, etc. This requires the technology landscape to be agile and flexible. Legacy modernization helps migrate outdated technologies to newer platforms such as Open Source and Cloud, which are better suited for deploying Analytics and Machine Learning, and helps harvest key business rules from the code so they can be loaded onto a Rules Engine to serve changing business needs.

Embarking on a digitization campaign requires an analysis of the existing environment to define a roadmap and choose the appropriate target platform. Dependency on Subject Matter Experts (SMEs) to understand the existing environment delays the modernization project. Hence, applying Artificial Intelligence to log files, the current code base, etc. helps provide recommendations for improvements to the existing system. Based on this analysis:

  • Application/system improvement: Recurring application/system issues can be fixed and overall system downtime can be reduced.
  • Process improvement: Insights help improve the overall application process, for example by creating new applications (e.g. a password management app) and by reducing resource overhead.
  • Alert system: These reports can feed an alerting system that generates active alerts for proactive response.
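As a toy illustration of the log-analysis idea above, the sketch below (plain Python, with a hypothetical log format and an assumed recurrence threshold) counts recurring abend/job pairs and flags those that recur often enough to warrant a proactive alert:

```python
import re
from collections import Counter

# Hypothetical log lines; real input would be mainframe job logs or syslog extracts.
log_lines = [
    "2018-03-01 02:10 ABEND S0C7 in JOB PAYROLL1",
    "2018-03-02 02:12 ABEND S0C7 in JOB PAYROLL1",
    "2018-03-03 04:01 ABEND S806 in JOB BILLING2",
    "2018-03-04 02:09 ABEND S0C7 in JOB PAYROLL1",
]

# Count recurring (abend_code, job) pairs to surface candidates for fixes.
pattern = re.compile(r"ABEND (\S+) in JOB (\S+)")
counts = Counter(pattern.search(line).groups() for line in log_lines)

# Raise an alert when an issue recurs beyond a threshold (chosen arbitrarily here).
THRESHOLD = 2
alerts = [(issue, n) for issue, n in counts.items() if n >= THRESHOLD]
print(alerts)  # [(('S0C7', 'PAYROLL1'), 3)]
```

A real system would also correlate timestamps and feed the result into an alerting tool, but the counting step above is the core of surfacing recurring issues.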

Data is one of the most valuable assets and can act as a source of innovation. But migrating data from a legacy system involves multiple challenges such as the EBCDIC format, varying data types, metadata, etc. Migrating to a Data Warehouse also incurs the additional costs of analyzing and profiling the data. The target platform should therefore support large volumes of data irrespective of format, which can be achieved through a Data Lake: raw data from all sources can be stored as-is and transformed whenever required, enabling the flexibility to install compatible tools for generating insights.
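As a small aside on the EBCDIC challenge mentioned above: Python ships with code-page codecs that can decode common EBCDIC variants. A minimal sketch, assuming the US EBCDIC code page cp037 (real extracts may use cp500 or cp1047, and packed-decimal fields need separate handling):

```python
# EBCDIC-encoded bytes as they might arrive in a mainframe extract.
# These byte values spell "CUST001" under the cp037 (US EBCDIC) code page.
ebcdic_record = bytes([0xC3, 0xE4, 0xE2, 0xE3, 0xF0, 0xF0, 0xF1])

# Python's built-in cp037 codec performs the EBCDIC-to-text conversion.
ascii_text = ebcdic_record.decode("cp037")
print(ascii_text)  # CUST001
```

The same `decode` call, applied record by record during extraction, covers the plain-text portion of an EBCDIC-to-ASCII migration.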

The key business rules and logic are ingrained in the code, which requires significant effort to adapt to changing business needs. To ease this, these rules can be separated from the code and fed into a rules engine. This gives business leaders the flexibility to change the business logic as and when required.
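A minimal sketch of this rule-externalization idea, with hypothetical discount rules held as data rather than hard-coded logic (a real rules engine adds authoring, versioning and audit on top of this):

```python
# Business rules live as data (here a list of dicts; in practice a
# rules-engine repository), so thresholds can change without code changes.
discount_rules = [
    {"min_spend": 10000, "discount": 0.10},
    {"min_spend": 5000,  "discount": 0.05},
    {"min_spend": 0,     "discount": 0.00},
]

def applicable_discount(spend, rules):
    """Return the discount of the highest-threshold rule the spend qualifies for."""
    for rule in sorted(rules, key=lambda r: r["min_spend"], reverse=True):
        if spend >= rule["min_spend"]:
            return rule["discount"]
    return 0.0

print(applicable_discount(7500, discount_rules))  # 0.05
```

Changing a threshold is now a data edit, not a code release, which is exactly the flexibility the rules-engine approach aims at.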

Recently, the mainframe has also gained features such as Java on the mainframe and zAnalytics to reduce customer attrition. Adoption of intelligent systems, coupled with the right strategy, can provide IT leaders with innumerable benefits, and legacy modernization will play a key role in achieving them.

In my next blog, I will share my thoughts on the role of Legacy Modernization in "Digital" and "Mesh" strategies. Meanwhile, I'll be happy to hear your thoughts, suggestions and feedback.




January 3, 2018

Big Iron for Analytics

Some like to call it the 'Big Iron', while a few refer to it as a dinosaur, a callback to its invention in the 1950s. The Mainframe has come a long way. Let us look at some market trends and how the mainframe copes with them.

What are the current trends driving the market?
  • Data: Growing exponentially, with newer data types
  • Data generators: There are newer streams through which data is being generated, such as smart devices, GPS locations of users, logs, etc.

Technology trends
  • Nearly half of total sales are influenced by digital organizations
  • Nearly 20% of organizations that go digital tend to have an upper hand over competitors

Data Insights
  • 75% of CIOs will recognize the limitations of traditional IT and embrace a leadership approach that embodies a virtuous cycle of innovation
  • 40% of IT projects will create new digital services and revenue streams that monetize data

Is the Big Iron capable of addressing these trends and associated requirements?

The answer is "partially yes". Depending on the specific use case, a few of these can be addressed through zAnalytics.

But first, what is zAnalytics?

It is an offering from IBM that enables organizations to undertake analytics and machine learning on the mainframe.

Advantages of zAnalytics

  • Data in-place: There is no need to move data out of the mainframe, as zAnalytics
    - is capable of reading data from DB2 tables, VSAM, PS files and IMS DB
    - avoids data duplication and latency
  • Security: Users continue to benefit from the highest level of security that mainframes offer
  • Engine: It uses the Apache Spark engine to divide and process the data in-memory
  • Online and batch: Both online and batch workloads are supported for analytics
  • Specialty engine offloading: A few of the workloads can be offloaded to the zIIP engine, which further reduces the cost of execution

What are the constraints of zAnalytics?

  • zAnalytics is available on model z13 machines and upwards
  • For real-time analytics, additional IBM products for quick Db2 table access are needed
  • A way to store newer data types on the mainframe must still be found

To evaluate zAnalytics, we wrote a simple application around a simple use case.

Let's explore the use case:

  1. Customer file: 1 million customer records
  2. Transaction file: a file containing 10 million customer transactions
  3. Use case:
      1. Load both data files into data frames
      2. Join both data frames on the unique customer ID
      3. Sum the transaction amount at the customer ID level
      4. Display the top 20 users who spent the maximum
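In the actual evaluation this ran as a Spark job under zAnalytics; the join-and-aggregate logic itself can be sketched in plain Python with toy data (customer names and amounts below are made up):

```python
from collections import defaultdict

# Toy stand-ins for the customer and transaction files (the real test used
# 1M customers and 10M transactions loaded into Spark data frames).
customers = {101: "Alice", 102: "Bob", 103: "Carol"}
transactions = [(101, 250.0), (102, 900.0), (101, 125.0), (103, 40.0)]

# Join-and-aggregate: sum transaction amounts per customer ID.
totals = defaultdict(float)
for cust_id, amount in transactions:
    totals[cust_id] += amount

# Rank customers by total spend; the zAnalytics test displayed the top 20.
top = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:20]
print([(customers[c], t) for c, t in top])
# [('Bob', 900.0), ('Alice', 375.0), ('Carol', 40.0)]
```

Spark performs the same join, `groupBy`/`sum` and ordering in a distributed fashion across the in-memory engine, which is what makes the 10-million-record workload tractable.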

The hardware used for this test was a z Trial machine provided by IBM. However, it is not meant for benchmarking.

zAnalytics was able to perform this workload at a rapid pace, in close to three minutes and thirty seconds (3:30).  


Rewrite your workloads in zAnalytics and witness significant reduction in execution time and cost.


Blog authored by

Ramanath Nayak, Lead Consultant, Legacy Modernization

October 6, 2017

Transform the IBM i-series/AS400 with DevOps: It's possible

i-series is an IBM midrange computer system used extensively by many large insurance firms, retail businesses and banks. As it is a legacy platform, projects executed on it generally follow conventional waterfall development and slow release cycles.

Today's environment demands agility. Can you transform your i-series and be agile? YES. Have we done it? YES. Let me help you understand with an example.

Here was the problem!

A leading global bank was embarking on a transformation program and wanted to reduce system downtime during scheduled outages of their Core application by routing the requests to a 'stand-by' application to provide business critical services.

The challenge was that the core platform was a 25-year-old monolithic application built on i-series, and the entire concept-to-reality had to be achieved under very stringent timelines. Our team was up for it, and we addressed the problem through successful implementation of DevOps and adoption of agile principles. This is the first time it has been attempted by any service provider.

How did we address the problem?

Infosys partnered with the client's tooling team and spearheaded the DevOps pipeline implementation. The deliveries happened in an MVP manner and shippable components were ready after every 2 sprints.

Tools adopted:

  • IBM Rational Developer for i (RDi) - Primary build tool based on the Eclipse platform. It comes with all features supported by any other IDE available in the market and gives a much needed break from the 'Green-Screen'
  • IBM Rational Team Concert (RTC) - Played a dual role of
    - Application Lifecycle Manager: Used for creation of Product Backlog, Sprint Backlogs, Work allocation & collaboration
    - Source Control Manager: Source codes were migrated to the RTC Jazz server and check-out/check-in were performed using RTC. It also allowed concurrent development and facility to auto-merge. 
  • Smartbear Collaborator - Provided the entire review workflow for logging review comments/defects. The reviewer could give in-line comments/log code defects against each line of code and the developer could action the same accordingly.
  • ARCAD Skipper & Deliver - These are two main tools provided by ARCAD for Build & Deploy. These are seamlessly integrated with RDi + RTC; build and deployment can be requested at the click of a button as opposed to the extended conventional release process.
  • ARCAD Observer - Cross-reference tool used to perform detailed impact analysis and understand program flows.
[Figure: DevOps process and tooling]

How did the client realize value from the solution?

  • Increase in productivity from use of DevOps tools
  • Release cycle reduced to 2 sprints only
  • People & process moved from waterfall mode to DevOps mode enabling enterprise agility

And our teams in turn benefitted from:

  • Current staff being trained in open source CI-CD tools
  • Built expertise on Arcad specific tools
  • People & process moved from waterfall mode to DevOps mode

This journey however was not easy and we were faced with multiple challenges.  DevOps tools for i-series were unexplored and we didn't have any SMEs available. Additionally, the application built across geographies added to the complexity making cleaning up of the non-standard source code types a tedious and prolonged activity. The biggest challenge was to change the team's mindset to move out of the conventional way of doing things and step into the new world.

What made this even more difficult was that, for the stand-by application to perform its required task when the main application is down, it had to be installed on a separate iASP (Independent Auxiliary Storage Pool) from the main application.

Based on ARCAD's consulting, it was suggested to install one ARCAD instance per iASP, which would have come with a heavy licensing cost.

We did a PoC to move the ARCAD installation from the iASP into the system pool (SYSBAS), and this instance was accessible across all iASPs. This in turn provided a saving of around 78,000 USD.

It was a challenging task (took us almost 3 months to implement the entire pipeline) but surely not impossible - and made us the pioneers in implementing DevOps on i-series while adopting agile principles. I would love to hear your views and recommendations if you have done similar work at your clients.

Blog authored by
Vivek Narayanan, Sr. Technology Architect

August 7, 2017

Application Portfolio Analysis - The first step to Modernization

Organizations maintain a huge number of legacy systems that store a lot of hidden information and unidentified opportunities accumulated over time. Owing to stiff competition and changes in customer needs, organizations have started adopting next-gen architectures such as mobility, Cloud, DevOps, etc. But before organizations can embark on digital transformation, they must understand their existing IT landscape. Application Portfolio Analysis helps organizations do just that, either through a top-down approach such as surveys and meetings with SMEs, a bottom-up approach such as analyzing the inventory through tools, or a combination of the two.

Application portfolio analysis evaluates every application on specific parameters of Business Adequacy, such as revenue impact, total cost of ownership, end-user experience, compliance and regulations, etc., and Technical Maturity, such as availability, stability and complexity. After analyzing the IT landscape, Infosys as a trusted partner provides its unique 4Q solution, i.e. Retire, Rehost, Renew or Reengineer, as the target disposition for every application.

Retire: If the application doesn't provide any business benefit and has poor technical capabilities, it can be considered as a good candidate for elimination/retirement. Due to compliance regulations, the data will be archived.
Benefits- Cost savings, removal of obsolete applications that reduce the complexity in IT environment.

Rehost: When application has business value but the current IT landscape results in huge costs, customers can migrate the applications on to a lower cost platform (e.g. Cloud) without modifying the existing functionalities.
Benefits- Cost savings, scalability for changing business needs.

Renew: Applications that have high business value and better technical features, requiring no major transformation can be enhanced with enabling web services (API), DevOps, etc.
Benefits- Enhanced customer experience, improved employee productivity.

Reengineer: Applications capable of generating greater business value but restrained by legacy systems in their ability to adapt to growing needs can be reengineered to next-gen architectures such as Open Source, Cloud, DevOps, etc.
Benefits- Increase in revenue, agility, faster time to market and reduction in risk associated with an aging workforce.
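One simplified way to picture the 4Q mapping is as a quadrant over the two assessment axes. The sketch below uses hypothetical 0-10 scores and a single threshold; an actual portfolio analysis weighs many more parameters per application:

```python
def disposition(business_value, technical_maturity):
    """Map an application's scores onto the four target dispositions.
    Scores are on an illustrative 0-10 scale with an arbitrary cutoff of 5."""
    high_biz = business_value >= 5
    high_tech = technical_maturity >= 5
    if high_biz and high_tech:
        return "Renew"        # enhance in place: APIs, DevOps
    if high_biz and not high_tech:
        return "Reengineer"   # high value held back by the legacy platform
    if not high_biz and high_tech:
        return "Rehost"       # sound application, move to a cheaper platform
    return "Retire"           # archive the data, eliminate the application

print(disposition(8, 3))  # Reengineer
```

Factors such as platform cost can push an application from one quadrant to another (e.g. a costly but healthy application toward Rehost), which is why the real assessment uses many parameters rather than two scores.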

In a nutshell, if an organization decides to plan for a major transformation, or any transformation at all, portfolio analysis helps it avoid the risk of failure and also identify quick-win opportunities that can reap immediate benefits.

July 4, 2017

Migration of Mainframe Batch Workloads to ETL Platforms

"It's only after you've stepped outside your comfort zone that you begin to change, grow, and transform."
~ Roy T. Bennett

During the late nineties, technology pundits predicted the imminent downfall of the mainframes. These ginormous computing machines from an earlier age seemed to have outlived their usefulness. Now, two decades later, the Big Iron continues to rule the roost running inside some of the world's biggest companies including banks, retailers, insurance companies and airlines. 92 of the top 100 banks worldwide, 70% of the world's largest retailers, 10 out of 10 of the world's largest insurers and 23 of the world's 25 largest airlines run on mainframes. To paraphrase Mark Twain, news of the mainframe's demise was greatly exaggerated.

The Second Law of Thermodynamics states that the total entropy of an isolated system always increases over time. An IT system is no different. Most mainframe systems have been around for several decades, and one of the most taxing challenges these applications face today is the muddle that has developed over years of operation. Bit by bit, the complexity of the systems increased to such an extent that maintainability has become a huge challenge. Consequently, mainframes have become expensive to maintain and run.

IT organizations have a plethora of options where mainframe modernization is concerned. Every mainframe landscape is unique and the ability to create a foolproof modernization roadmap lies in the ability to understand the context of the mainframe landscape. This includes consideration of the application complexity, current spend, integration points, technology landscape, variety of processes as well as its adaptability to the proposed target framework. Owing to the similarity in their concepts, processing and terminologies, the ETL platform has evolved as a good alternative for running existing batch workloads migrated from a mainframe environment.

Migration to ETL involves extraction of business rules from the existing application and using them to create specifications or code for the new system. This is an excellent way of taking advantage of existing business logic within the mainframe system while introducing modern technologies and IT concepts. The rewritten IT systems can quickly adapt to changing markets, shifting customer needs and new business opportunities. But at the same time, the migration process can be time consuming and adds project risk. The following strategies can help ensure a smoother transition:
  • A phased migration approach with a clear fallback strategy is vital to ensure there are no adverse impacts on consumers of the IT system as part of this migration.
  • Outbound extracts can be the first to be targeted for migration. While data continues to be consumed from the legacy platform, the files generated on the ETL platform can be vetted against the files on the legacy platform using a generic comparator framework and flows cut-over once all certification activities are complete.
  • Inbound feeds are generally tricky as it involves maintenance of the database tables and any discrepancies can result in data corruption. For ensuring the processed data is unchanged in the new platform, the load ready files can be compared over a period of time before the flows are actually cut-over to the new platform.
  • Internal processes like housekeeping, purge, etc. can be the next to be rewritten on the ETL platform.
  • In many cases, the database continues to remain on mainframe. When feasible, moving the database to a distributed database can be considered as the next logical step.
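The "generic comparator framework" mentioned for vetting outbound extracts could, in its simplest form, hash each record of both files and diff the resulting sets, making the comparison order-independent. A minimal Python sketch (the file layout and field separator below are assumptions for illustration):

```python
import hashlib
import os
import tempfile

def record_digests(path):
    """Per-record digests for a flat extract file, so legacy and ETL
    outputs can be compared independent of record order."""
    with open(path, "rb") as f:
        return {hashlib.sha256(line).hexdigest() for line in f}

def compare_extracts(legacy_path, etl_path):
    legacy, etl = record_digests(legacy_path), record_digests(etl_path)
    return {
        "missing_on_etl": len(legacy - etl),  # records the new flow dropped
        "extra_on_etl": len(etl - legacy),    # records the new flow added
        "match": legacy == etl,
    }

# Tiny demonstration with two generated extract files.
with tempfile.TemporaryDirectory() as d:
    legacy_file = os.path.join(d, "legacy.dat")
    etl_file = os.path.join(d, "etl.dat")
    with open(legacy_file, "w") as f:
        f.write("101|250.00\n102|900.00\n")
    with open(etl_file, "w") as f:
        f.write("102|900.00\n101|250.00\n")  # same records, different order
    result = compare_extracts(legacy_file, etl_file)
    print(result)  # {'missing_on_etl': 0, 'extra_on_etl': 0, 'match': True}
```

A production comparator would also handle field-level tolerances (e.g. date or numeric formatting differences between platforms) before hashing, and report the mismatching records rather than just counts.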

[Figure: Batch to ETL migration]

Some of the key challenges faced during typical ETL migration projects are listed below:
  • Mainframe application complexity due to the extensions and additions done to the underlying architecture over a span of several decades. Mergers and acquisitions further add to the confusion with disparate systems, standards and practices.
  • Meeting existing SLAs is a key parameter for any successful ETL migration. While mainframe load utilities working at partition level can easily load several hundred million records on a daily basis, it may be a challenge for ETL processes.
  • Loss of subject matter expertise in the application is another common concern as mainframe experts move to other areas in the enterprise with mainframe stack being replaced by the ETL suite.

In a mission critical software life cycle, complexity is a given and quality is a requirement. ETL combines data preparation, data integration and data transformation into a single integrated platform to transform how IT and business can turn data into insight. ETL processes inherently lend themselves to automation and this brings a swarm of benefits to these processes including end to end data lineage, faster delivery times, improved productivity, reduced cost, decreased risk of error, and higher levels of data quality among others. With improved integration of ETL and Big Data technologies, it's now possible to augment data with Hadoop-based analytics. Features like data connectivity, transformation, cleansing, enhancement and data delivery can be run within a YARN-managed Hadoop platform. 

Infosys has a proven track record in collaborating with clients to migrate their mainframe workloads to ETL platforms. For example, Infosys has partnered with a leading US bank to ensure non-disruptive migration of their batch processing across several operational and analytical applications to an ETL based Data Integration platform featuring reusable frameworks such as generation of Gold Standard files, Common Maintenance frameworks, and Data Quality and Governance frameworks.

Are your mainframe systems and workflows limiting your agility? Consider migrating to an ETL platform. It might just be the right solution for you.

Blog authored by 
Ramakant Pradhan, Ramkumar Nottath and Arunshankar Arjunana

April 19, 2017

Better Managing Your Modernization Process

There is much that has been debated about legacy modernization, yet this topic continues to stay relevant as enterprises are increasingly challenged by a new, digital, and highly competitive business environment.  The impetus for modernization is no longer borne from a need to reduce costs or upgrade to new technologies, but from a more compelling need to prepare for changing business requirements and future readiness.


Today, technology has become a core aspect of every enterprise, from IT companies to retailers, and is directly shaping business strategy. In preparation for this paradigm shift, organizations can either build new technological capabilities within by hiring the required skills, or partner with an organization that has the expertise to guide them through the process. Both methods have their pros and cons, though with shrinking time-to-market cycles, the latter may seem preferable. Interestingly, Infosys DevOps for Mainframe leveraged tools from IBM and Micro Focus to create a PoC for a global bank. The tools enabled the bank to save approximately 15% of build effort and reduce storage by approximately 25%.

Legacy modernization to improve user experience of internal stakeholders

With technology changing at a continuous pace, enterprises need to prepare for stakeholders that are not only external, by way of customers, but internal, by way of employees, as well.

With internal stakeholders undoubtedly exposed to smarter technology and smoother, user-friendly UX through apps, legacy technology and systems can be a cause of strain, as they involve working with manual, time-consuming processes. Thus organizations find themselves at an inflection point when it comes to delivering uniform, high-quality services and experiences across stakeholders. A case in point is how the mainframe modernization team at Infosys helped a large Australian bank decommission complex legacy systems and reclaim approximately two million AUD.

Technology can make IT a revenue centre

As technology begins to play a more pervasive and decisive role in an organization, technologists and CIOs are going to increasingly have a voice in determining the direction of the organization, as their opinions have a direct impact on revenues. IT will no longer be a cost center but a crucial value-add, and will become the tool that enables organizations to work faster and smarter. The challenge for IT, of course, will be how to collaborate with business and develop roadmaps that enable seamless, non-disruptive modernization of legacy systems. This can be a significant challenge, as these vast legacy systems have been developed over many years in a fragmented manner, and most organizations do not have a comprehensive understanding of their IT landscape. Fear of their unknown IT landscape often prevents many organizations from beginning the modernization journey.

Steps towards effective legacy modernization

Thus, when an organization decides to begin modernizing its IT infrastructure, the first thing is for it to map its IT landscape. Organizations need to create a roadmap of what needs to be modernized and set realistic timeframes. Organizations also need to be cognizant of the fact that the pace of modernization is often determined by external forces like competition and the disruption that the industry is going through. For example, at a large fashion retailer, the landscape is being redefined completely by looking at what functionality will reside where.

A second aspect of legacy modernization is the need for an organization to consciously undergo a mindset and culture change. CIOs need to be closely invested in bringing about this change and there needs to be a buy-in from business leaders to make this top-down process a success. Processes in organizations should support this change - from recruitment to training to performance management.

A third aspect of modernization is risk mitigation. Since modernization is a journey that can take two or three years, organizations need to engage in this process without impacting their customers or internal stakeholders. Additionally, organizations need to be willing to invest and allocate an annual budget towards optimization of critical applications where possible, and legacy modernization where necessary. Budgets are also required to acquire new skill sets that will make this possible, as well.

The fourth aspect is how to leverage technology advantageously. Cloud and digital are the key destinations of change for all industries. Organizations need to identify applications that can move to the cloud, and the extent to which they can leverage mobile and open source. An organization's business and technical landscape is a key determinant here, as is the level of disruption facing that particular industry.

It could start with rehosting applications on the cloud to accelerate performance. Legacy applications can then be renewed through the effective use of DevOps, creation of APIs and extreme deconstruction, and thus transition to new user experiences.

A leading US-based bank improved its business turnover by exposing legacy assets as SOAP (Simple Object Access Protocol) services for multiplatform integration. This enabled cross-selling as a direct business benefit.

Migrating mainframe batch jobs to Hadoop helped a European CPG company reduce the time it took to reconcile financial statements by 40%.

In summary, mainframe modernization is real and now. How much and at what pace is specific to a client. We have the expertise and technology available today to rapidly move to more open business models and technology architecture.

April 5, 2017

Legacy Decommissioning (2/2)- Data Archival and Retrieval

Continuing from the earlier blog on decommissioning legacy systems, where I explained the overall decommissioning process, this blog addresses the data archival and retrieval approach in the design and execution phase of legacy decommissioning.

As discussed in Blog 1, many organizations feel forced to keep aging legacy applications running well beyond their life because they contain critical historical data that must stay accessible. This information may be needed for customer service or other operational reasons, or to comply with industry regulations. Yet keeping obsolete systems alive just to view the data puts a real strain on resources. These applications steadily consume IT budget in areas such as maintenance charges, staffing and data center costs, and in many cases this is over 50% of the overall IT budget.

Data archival during application decommissioning is the most cost-effective and simplest solution for keeping legacy data accessible for continuation of the business. Archiving the complete legacy data at once is one of the best practices suggested during application decommissioning. This archived data can then be swiftly accessed online, can provide different views of the data for analytics, and can be exported into different formats when needed. During archiving, the data can be extracted from any legacy system and stored in a secure, centralized online archive. The data is easily accessible to end users, either from screens that mimic the original application or in a new format chosen by the business. The new infrastructure built for legacy data archival should be capable of moving all legacy data components, whether structured, unstructured, semi-structured, or even raw metadata, into a comprehensive online, cloud-based, self-managed centralized repository. Infosys has a proven methodology for archiving the data in this central repository.

Application decommissioning and data archival requirements are unique to every organization. Infosys, with its proven framework and tools, can help customers with the following:

1. Building a strong understanding of the current application data model
2. Building data retention policies and retention requirements through strong domain knowledge
3. Building an innovative archival data model

The above approach is largely adopted for mainframe legacy applications that have accumulated large volumes of data over many years in the form of documents and images. Typical examples are:

1. Billing records
2. Financial transactions
3. Customer history, etc.


Data Archival

The generic framework for a data archival process consists of the following steps:

* Data Extraction - Collect and extract the data from the source database into an interim storage area, i.e. the staging area, and perform data transformation where required to map to the target-state data format, e.g. EBCDIC to ASCII, alphanumeric to numeric, date format changes, etc.

* Validation and Cleansing - Validate the schema of the target database: confirm that tables with all their constraints, indexes, and views, and users with their roles and privileges, are migrated as defined in the business rules. Validate the contents of the migrated data to confirm that referential integrity is maintained in the target definition. If required, data cleansing is also performed: generation of golden records, removal of duplicate records, and cleansing of special characters, spaces, and emails.

* Transformation - Transform the data from the source to the target as defined in the data mapping rules and lookup tables.

* Data Migration - Load the data into the target database using data loader utilities, scripts, and programs generated for loading incremental data and multilingual data and for recovery of failed data.
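As a minimal sketch of the four steps above, the snippet below extracts fixed-width EBCDIC records, transforms them (code page decode, date reformat), cleanses duplicates into golden records, and loads the result. The record layout, code page cp037, field names, and SQLite as the target database are all illustrative assumptions, not details from any actual system.

```python
import sqlite3

EBCDIC_CODEC = "cp037"  # a common US EBCDIC code page; the actual page varies by site

def extract(raw: bytes) -> dict:
    """Extraction + transformation: decode one fixed-width EBCDIC record."""
    text = raw.decode(EBCDIC_CODEC)
    return {
        "account_id": text[0:8].strip(),
        "amount": text[8:18].strip(),
        # YYYYMMDD -> ISO date, a typical target-state format change
        "billing_date": f"{text[18:22]}-{text[22:24]}-{text[24:26]}",
    }

def cleanse(records):
    """Validation and cleansing: drop duplicate keys to form golden records."""
    seen, golden = set(), []
    for rec in records:
        if rec["account_id"] in seen:
            continue  # duplicate record removed
        seen.add(rec["account_id"])
        golden.append(rec)
    return golden

def load(conn, records):
    """Migration: bulk-load the staged records into the target archive table."""
    conn.executemany(
        "INSERT INTO archive (account_id, amount, billing_date) VALUES (?, ?, ?)",
        [(r["account_id"], r["amount"], r["billing_date"]) for r in records],
    )
    conn.commit()

# Staging -> target, with a duplicate record filtered out along the way
raw_records = [
    "AC000123    150.2520170405".encode(EBCDIC_CODEC),
    "AC000123    150.2520170405".encode(EBCDIC_CODEC),  # duplicate
    "AC000456     75.0020161231".encode(EBCDIC_CODEC),
]
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE archive (account_id TEXT, amount TEXT, billing_date TEXT)")
load(conn, cleanse([extract(r) for r in raw_records]))
print(conn.execute("SELECT COUNT(*) FROM archive").fetchone()[0])  # -> 2
```

In a real engagement the loader would also handle batching, failure recovery, and the incremental loads mentioned above; the sketch only shows the happy path.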

Design of the data archival solution for extracting and migrating the identified data sources is usually done collaboratively with the customer's DBAs and SMEs. A complete analysis of the application is done to identify the following:

* Understanding the current data model in the legacy application

* Identifying unknown data relationships in the current model

* Creating retention policies for the data identified for archival

* Extracting data, with the required transformations, in the application context

* Identifying the key fields for indexing, so the required data can be searched efficiently

* Validating the target database schema and the data contents of the archived data

* Creating interfaces to access archived data and reports independently of the application

* Applying application-, entity-, and record-level retention policies based on organizational requirements
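For instance, a record-level retention policy can be sketched as a simple check against a per-object-type retention window. The object types, retention periods, and the 365-day-year approximation below are hypothetical, for illustration only.

```python
from datetime import date, timedelta

# Hypothetical retention policies, in years, per business object type
RETENTION_YEARS = {"billing": 7, "customer_history": 10}

def is_retained(obj_type: str, record_date: date, today: date) -> bool:
    """A record stays in the archive only while inside its retention window."""
    years = RETENTION_YEARS.get(obj_type, 7)  # assumed default policy
    cutoff = today - timedelta(days=365 * years)
    return record_date >= cutoff

print(is_retained("billing", date(2015, 6, 1), date(2018, 4, 5)))           # -> True
print(is_retained("customer_history", date(2005, 1, 1), date(2018, 4, 5)))  # -> False
```

Actual retention rules are driven by the regulatory and business requirements identified during analysis, and are usually far richer than a single cutoff date.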

Based on the above analysis, the target data model is generated; each target entity is identified as a business object and holds the data for that object. The business objects are defined keeping regulatory and business needs in consideration.

Based on the organization's requirements and the details collected during impact analysis, Infosys suggests the best approach to archiving the data, evaluating factors such as the quantity of data to be archived and the actual requirement for archival. The two options are:

1. Complete data archival at once
2. Partial data archival over multiple releases

Data Retrieval

Data retrieval ensures the archived data is accessible at any time for audits, analytics, and normal business operations, and that the data is secured and accessed according to user roles and privileges. The two most common strategies are hot retrieval and cold retrieval. In hot retrieval, data is accessed immediately based on a few keywords. In cold retrieval, data is accessed via reports and service requests when a specific view of the data is needed.

Following are a few ways to retrieve the archived data:

      • Data retrieval through keyword search and business objects, to give full application context
      • Custom-generated reports using data integration capabilities
      • Data retrieval using standard interfaces to the databases, such as ODBC/JDBC and SQL
      • Enterprise reporting tools such as Crystal Reports
      • Data archival and retrieval using third-party products such as IBM Optim, Informatica, etc.
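As a sketch of hot retrieval through a standard SQL interface, the snippet below uses Python's DB-API over SQLite as a stand-in for the ODBC/JDBC access mentioned above; the archive schema, data, and `hot_retrieve` function are illustrative assumptions.

```python
import sqlite3

# Build a tiny in-memory archive to query (schema and rows are made up)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE archive (customer_id TEXT, doc_type TEXT, body TEXT)")
conn.executemany(
    "INSERT INTO archive VALUES (?, ?, ?)",
    [("C001", "billing", "Invoice 2016 Q4"),
     ("C002", "history", "Address change 2015")],
)

def hot_retrieve(keyword: str):
    """Immediate keyword search across the archive (hot retrieval)."""
    cur = conn.execute(
        "SELECT customer_id, body FROM archive WHERE body LIKE ?",
        (f"%{keyword}%",),
    )
    return cur.fetchall()

print(hot_retrieve("Invoice"))  # -> [('C001', 'Invoice 2016 Q4')]
```

Cold retrieval, by contrast, would typically run as a scheduled report or service request against the same interface rather than an interactive query.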

March 30, 2017

Mainframe Legacy Decommissioning(1/2)

In the new era of digital transformation, new strategies and technologies are taking over from legacy applications in running the day-to-day business. As new digital, next-generation applications are deployed to production, many mainframe legacy applications become redundant. The business value of these applications decreases with every passing year, but organizations often continue to support them for data access and compliance purposes. The cost of doing so can consume over 50 percent of an overall IT budget. Decommissioning enables organizations to eliminate the maintenance, hardware, and software costs of these outdated legacy systems.


Application decommissioning is all about isolating the application: migrating the database to an archival system, removing the interfaces to upstream and downstream systems, and detaching the allocated hardware from active use, while the business-critical data remains available as and when required. Typical drivers of decommissioning include:

  • Retiring an old application in favor of a more functionally and technically capable alternative system
  • Consolidation of one or more in-house applications or COTS products into a single enterprise solution such as ERP, CRM, etc.
  • Elimination of redundant applications within the enterprise landscape
  • Mergers and acquisitions of organizations


Benefits of Decommissioning:

By decommissioning outdated, redundant, high-cost, low-value applications, organizations can achieve tangible benefits such as:

  • Reduction in maintenance costs by removing unwanted hardware, outdated software, and associated human resources
  • Financing new business initiatives or projects with the resources freed up by decommissioning projects
  • Reduced expenditure on licenses and infrastructure for non-scalable technologies
  • Reducing operational risks by simplifying and aligning the enterprise to next-gen architecture


All data from a legacy system or application proposed for decommissioning may or may not be migrated or archived to the newer replacement system. This depends on the driving factors behind the application chosen for decommissioning, the business criticality and technical adequacy of the system, and the impact of decommissioning on the existing enterprise architecture and business processes. I will cover data migration and archival options in a separate blog.

Application decommissioning best practices suggest archiving all of the data at once while maintaining online access to the data through a preferred reporting tool.

The Infosys application decommissioning methodology typically involves the following four steps:

1. Portfolio Assessment

2. Detailed Impact Assessment

3. Solution Planning

4. Actual Execution and Realization


Step 1: Portfolio Assessment

In the portfolio assessment phase, the entire enterprise landscape is assessed across each application's life-cycle to identify which applications to maintain, which to invest in, which to replace, and which to decommission.

Using a sophisticated questionnaire, with individual scores mapped to each influencing area, applications are assessed for technical maturity and business criticality, along with strategic fit on the organizational road-map and TCO. This is a top-down, interview-based approach: by interviewing business owners, IT owners, and business analysts, each application is assessed for its impact on the organization and customer service, and for its dependencies on other applications, to determine whether it is a potential candidate for decommissioning.
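A simplified, hypothetical version of such scoring might combine the assessment dimensions into a single decommissioning score. The weights, 0-10 scales, and attribute names below are assumptions for illustration, not the actual Infosys questionnaire.

```python
# Assumed weights per influencing area (must sum to 1.0)
WEIGHTS = {"technical_maturity": 0.3, "business_criticality": 0.4,
           "strategic_fit": 0.2, "tco_burden": 0.1}

def decommission_score(app: dict) -> float:
    """Low maturity, criticality, and fit plus high TCO burden -> high score."""
    score = (
        WEIGHTS["technical_maturity"] * (10 - app["technical_maturity"])
        + WEIGHTS["business_criticality"] * (10 - app["business_criticality"])
        + WEIGHTS["strategic_fit"] * (10 - app["strategic_fit"])
        + WEIGHTS["tco_burden"] * app["tco_burden"]
    )
    return round(score, 2)

apps = [
    {"name": "OldBilling", "technical_maturity": 2, "business_criticality": 3,
     "strategic_fit": 1, "tco_burden": 9},
    {"name": "CoreLedger", "technical_maturity": 8, "business_criticality": 9,
     "strategic_fit": 8, "tco_burden": 3},
]
ranked = sorted(apps, key=decommission_score, reverse=True)
print([a["name"] for a in ranked])  # -> ['OldBilling', 'CoreLedger']
```

The real assessment layers interview findings on top of any such numeric ranking; the score only shortlists candidates for deeper review.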


Step 2: Detailed Impact Assessment

In this phase, we assess the impact of decommissioning on various external and internal interfaces of each application through in-depth analysis.

The impact is assessed through detailed, tool-based inventory analysis, considering the application's impact on areas such as access, business, and benefits. Infosys has the right set of tools in this area (a combination of in-house and third-party tools, viz. Legacy Analyzer, Micro Focus Enterprise Analyzer, or AveriSource), which can detect application dependencies, the limitations and risks associated with the application, and its maintainability index.

The outcome of this step is the final list of applications to be considered for decommissioning.


Step 3: Solution Planning

The objective of this phase is to:

  • Identify possible solutions for decommissioning the identified applications
  • Choose the best solution for data archival (on-premises vs. cloud) and the best data retrieval strategy (cold vs. hot retrieval)
  • Identify all the application's technical components (programs, JCL, databases, online entries, scheduler entries, infrastructure used by the application, and the access levels required for different users) along with all interfaces and business processes, and raise a change request for the actual decommissioning solution in the execution step. The change request lists all associated hardware and software to be retired after decommissioning.
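The component inventory behind such a change request can be sketched as a simple data structure. The class and the component names below are hypothetical placeholders, not part of any actual decommissioning tooling.

```python
from dataclasses import dataclass, field

@dataclass
class ChangeRequest:
    """Inventory of one application's components to retire (illustrative)."""
    application: str
    programs: list = field(default_factory=list)
    jcl: list = field(default_factory=list)
    databases: list = field(default_factory=list)
    scheduler_entries: list = field(default_factory=list)
    interfaces: list = field(default_factory=list)

# Hypothetical example: populate the inventory from the impact analysis
cr = ChangeRequest("OldBilling")
cr.programs += ["BILL001", "BILL002"]
cr.databases += ["BILLDB"]
cr.interfaces += ["downstream: GL feed"]
print(f"{cr.application}: {len(cr.programs)} programs, "
      f"{len(cr.databases)} databases to retire")
```

Keeping the inventory explicit like this makes it straightforward to verify, after execution, that every listed component was actually retired.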

A typical decommissioning solution is prepared from the application usage details and parameters gathered during the portfolio assessment and detailed impact analysis phases. These parameters cover:

  • Managing application access for various users and partners
  • Managing the application's upstream and downstream interfaces to ensure landscape collaboration
  • Managing data exchanges, deciding whether each is archived or disposed of
  • Managing the infrastructure components released by the application being decommissioned


Step 4: Actual Execution and Realization

In the execution phase, the identified decommissioning solution is implemented to complete the project successfully. The applications are decommissioned as per the project plan prepared during solution planning, and reports are generated for each component of the solution.

Below are some reports generated during the execution phase and validated after implementation:

  • Impact Analysis Report
  • Data Scan Report
  • Data Archiving/Migration/Integration Reports

More coming in my next blog.

March 1, 2017

Legacy Modernization - Where do I begin?

Many enterprises across industry verticals made huge investments in legacy systems in the past, investments that now lead to operational inefficiencies as their business grows. These legacy systems inhibit organizations from embracing the next-generation technology that enables business innovation. Many firms have viewed further spending here as additional expense, but modernization is the key differentiator to capture market share and stay ahead of competitors.

How to do more with less?
As a partner, it is important for vendors to understand modern technology trends and evaluate how they can help transform the enterprise and prepare it for the future. The 5-step strategy below provides a simplified approach to modernizing applications and systems within the enterprise:

5- Step Strategy for IT Modernization:
  1. Identify the key business goals of the enterprise.
  2. Identify the barriers & challenges across IT systems and its impact to the business
  3. List out the key Modernization themes based on the gap between business and IT
  4. Lay out the Strategic Solutions to modernize the platform
  5. Choose the best suited solution and define the transition road map to future state
The 5-step strategy works: a case in point
As a strategic partner, we were part of a modernization initiative for one of the leading insurance brokers. The 5-step strategy greatly helped us arrive at the modernization solutions that transformed their business.

Let me explain how this 5-step strategy was applied to simplify the client's landscape modernization and achieve business innovation.

1. Identify the key business goals of the enterprise
A combination of interviews and questionnaires with all the key stakeholders of the enterprise helped us understand their vision and arrive at the business goals listed below.
  • Reduce Total Cost of Ownership
  • Better Return on Investment
  • Enhance Operation Efficiency
  • Faster Time to Market
  • Increase Global Footprint
  • Gain Agility
  • Enrich User experience
  • Scale for Future
2. Identify the barriers & challenges across IT systems and its impact to the business  
It is important to comprehend the challenges and gaps in the existing landscape; these lead us to the right opportunities for investment.
  • Understand the business and IT constraints
  • Perform portfolio assessment of different applications within the landscape
  • Document the key challenges across different applications 
  • Identify the impact to the business
An assessment of the insurance broking company's IT landscape was carried out, which helped identify the key challenges across the enterprise's different portfolios and the business impact of those concerns, as depicted in the diagram below:

1. IT Landscape assesment.png

3. List out the key Modernization themes based on the gap between business and IT  
Bridging the gap between business and IT is essential to the success of any transformation or modernization initiative in an enterprise. A holistic view of the organization and an understanding of its business expectations and IT strategy helped us derive the key modernization themes that deliver the desired business outcome.

4. Lay out the Strategic Solutions to modernize the platform  
We arrived at different strategic solution options based on the identified modernization themes. Note that the strategic solutions described below may only suit the challenges of this case study; they will have to be tailored to the specific needs of each enterprise.

2. Strategic Solutions.png

5. Choose the best suited solution and define the transition roadmap to future state
The strategic solutions arrived at in step 4 have to be thought through across aspects like cost, ROI (return on investment), and business benefits. These solutions were compared against factors such as the cost and business benefits they offer; a quick comparison of the options against these aspects helped the client make the right choice.

3. Mod Options.png

In this case study, the client chose the 'Rationalize' option, prioritizing the lines of business that would undergo modernization. The client was also keen on reusing existing assets as much as possible. Accordingly, the road map was defined, with the modernization planned in phases.

The best-suited solution should be chosen so that it improves the enterprise's agility and competitiveness. The future-state roadmap is then defined, and the transition architectures detailed, to transform the baseline architecture to the target architecture in phases.

Having an appropriate IT modernization strategy is imperative to the success of enterprise modernization. CIOs have to make conscious investment decisions to transform their IT systems in the most efficient and cost-effective manner, and to ensure the IT modernization strategy aligns with the overall business strategy and vision of the enterprise. The 5-step modernization strategy detailed in this article will help CIOs, business and IT directors, and enterprise architects optimize their business and gain competitive advantage.

Authored by:
Nandha Venkateswaran
Senior Technology Architect, Infosys