Commentaries and insightful analyses on the world of finance, technology and IT.

April 7, 2015

Belling The Cat - Addressing a dilemma faced by Investment Bank Risk Officers

Compliance in the trading world is more a topic of frequent discussion than of proactive action. Investment banks, which significantly influence industry behavior, view compliance as a back-office function and hence a cost centre. Compared with profit centres such as frontline SBUs, this reputation dampens the function's growth.

Though not publicly admitted, investment banks view many regulatory initiatives with apathy - just do what is required. Complying with the letter matters more than the spirit. While the front office builds flexible, real-time systems to the stringent norms of microsecond advantage, using superior process definition, technology and talent, regulatory compliance applications are built on batch-processing legacy technologies. Development teams consider such assignments a punishment posting rather than an elevation. In essence, effort is directed towards merely 'complying'.

Although trading within investment banking is well supervised from the perspective of 'front running' and of traders helping counter-parties (prevalent on bond trading desks), less has been achieved on employees' personal trading in particular asset classes or individual securities. Over the years, there has been little progress in building a proactive mechanism for monitoring, reporting and, where necessary, restricting personal trading. One reason is that monitoring personal trading still depends largely on manual processes such as paper submissions and spreadsheets, and investment banks lack the wherewithal to produce personal trading details during an audit. Where a process for filing external regulatory reports on employee personal trading exists, it is riddled with delays and inaccurate data.

Within investment banks, employees fall into two categories for personal trading purposes - noncore and privileged. Noncore employees do not have access to privileged information and form the outer layer. Privileged employees have access to information related to the material interest of the bank - information that can be leveraged for personal gain. While laws are in place to monitor these conflicts of interest closely, many of the disputes between the SEC / FSA and investment banks involve individual interests. These experiences form the basis for arguments to separate research departments from investment banks.

Within an investment bank, the personal trading compliance process follows four distinctive phases:

1. Restricted list watch: A restricted list of securities is predefined and communicated broadly to employees with privileged access. The list restricts employees from trading in securities where the bank has built a holding and calls for mandatory disclosure. Depending on the holding percentage, which varies from country to country, it is the bank's responsibility to maintain and update the restricted list to avoid any conflict of interest. The FSA in the UK, under rule 7.3, checks for possible conflicts of interest, including front running (staff dealing ahead of investors in securities based on privileged access). Similarly, SEC rule 17j-1 and rule 204A-1 call for employees to obtain duplicate brokerage statements and submit them to their employer bank.

2. Pre-trade clearance: Banks maintain a list of 'what not to buy'. However, pre-trade clearances are obtained through e-mails or signed paper documents, and often only after the trade. The delay in correspondence between the risk office and an employee often results in a breach of the code of conduct. Eventually, these breaches find their way into audit reports and draw the attention of regulators.
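The core of an automated pre-trade clearance check is simple enough to sketch. The following Python snippet is purely illustrative - the tickers, employee IDs and single-list rule are hypothetical assumptions, and a production system would layer on approvals, holding periods and audit trails:

```python
# Illustrative sketch (hypothetical data): vet an employee's proposed personal
# trade against the bank's restricted list BEFORE execution, instead of
# relying on after-the-fact e-mail clearance.

RESTRICTED_LIST = {"ACME", "GLOBOCORP"}  # hypothetical restricted tickers

def pre_trade_clearance(employee_id: str, ticker: str) -> dict:
    """Return a clearance decision for a proposed personal trade."""
    if ticker in RESTRICTED_LIST:
        return {"employee": employee_id, "ticker": ticker,
                "cleared": False, "reason": "security on restricted list"}
    return {"employee": employee_id, "ticker": ticker,
            "cleared": True, "reason": "no restriction found"}

decision = pre_trade_clearance("E1001", "ACME")
print(decision["cleared"])  # False: the trade is blocked up front
```

Because the decision is produced instantly and logged as data rather than e-mail, the delay that causes code-of-conduct breaches largely disappears.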

3. Broker confirmation: Though many employees diligently submit duplicate confirmations to the bank, these are generally filed away with the individual employee's records. There is hardly any automated process to reconcile the broker confirmations an employee files from time to time. Tracing back to the point of any breach of trust is not only time-consuming but also manual, which leaves scope for human error.
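The reconciliation itself is mechanical once both sides exist as data. A minimal Python sketch, with entirely hypothetical trades, shows the idea of flagging broker confirmations that have no matching employee declaration:

```python
# Illustrative sketch (hypothetical data): reconcile an employee's declared
# trades against broker confirmation receipts, and flag any confirmation
# that was never declared -- the manual step the text calls error-prone.

declared = {("E1001", "ACME", "2015-03-02"), ("E1001", "XYZ", "2015-03-05")}
broker_confirmations = [
    ("E1001", "ACME", "2015-03-02"),
    ("E1001", "BETA", "2015-03-04"),  # never declared by the employee
]

undeclared = [c for c in broker_confirmations if c not in declared]
print(undeclared)  # [('E1001', 'BETA', '2015-03-04')]
```

A real reconciliation would also match on quantity, price and broker, but even this simple set difference replaces hours of manual filing checks.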

4. Documentation: This is one of the weakest links in the chain. Poor documentation of an employee's personal trading history affects the firm's ability to pinpoint where the blame lies. From a compliance perspective, gathering information from various sources, synthesizing it, and arriving at a meaningful conclusion remains challenging.

Emerging regulations across the globe call for a different approach. Given this external pressure and greater internal awareness of the need for better conduct, investment banks are looking for solutions that will enable them to stay informed and track employee personal trading to the spirit of the law rather than the letter. Essentially, this requires behavioral change at the employee level. But automating the gathering and reporting of personal trading compliance data would reduce the number of questions raised by auditors in the short term, and help in brand building in the long term.


April 6, 2015

Is Big Data Ready For Consumer Banking?

Is big data just a buzzword?

Big data has been a popular buzzword in the banking industry for some time. Banks, always at the forefront of technological innovation, have long recognized the need to harness the information captured daily through hundreds of millions of customer transactions and interactions. As competition intensifies and customer engagement becomes the bedrock of sustainability, banks are looking to technology to extract maximum value from their core data assets.

Over the past decade, banks have closely observed the development and successful deployment of big data solutions by new-age enterprises like Google, Amazon, Facebook, and LinkedIn, enabling them to provide highly personalized and immersive user experiences. Banks have waited for this technology to mature and become commercially available before taking it to the next frontier of innovation in the financial industry. So is big data now ready to meet the expectations of the banking industry?

Can big data scale up to meet expectations from banks?

Let's look at key challenges faced by banks today.

1. More regulations mean banks need to store more data for longer periods. Banks struggle with the archival and timely retrieval of this data, which often runs into terabytes. Big data provides a cost-efficient and scalable solution for storing these terabytes - or, if needed, petabytes - of data in the Hadoop Distributed File System (HDFS), distributing it across commodity hardware. Hadoop-based storage is horizontally scalable, and many banks have already implemented it.

Industry news: Morgan Stanley, with assets worth US$300 billion, has started with a 15-node Hadoop cluster that the enterprise is planning to grow.

2. Another problem faced by most banks is data silos. Even though most banks have enterprise data warehouses (EDWs), these are expensive and do not allow modifications to be made easily. One of the fast-emerging uses of big data is the data lake, or logical data warehouse. The data lake acts as an enterprise repository that stores data of any format, schema, and type. It is inexpensive and a massively scalable solution for enterprise data needs.

The data lake can support the following capabilities:
a) Capture and store high volumes of raw data across the enterprise at a fairly low cost
b) Store a variety of data types in the same repository
c) Provide schema-on-read, enabling a generic structure for data storage
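The schema-on-read idea in point (c) can be sketched in plain Python. This is an illustrative toy, not data lake tooling: the records and field names are hypothetical, and in practice the raw store would be files in HDFS read through Hive or Pig:

```python
import json

# Illustrative sketch: "schema on read". Raw events of varying shape are
# stored as-is (as they would be in a data lake); a schema is applied only
# when the data is read, so new fields never require changing storage.
raw_store = [
    '{"customer": "C1", "amount": 250.0, "channel": "atm"}',
    '{"customer": "C2", "amount": 99.5}',  # older record, no channel field
]

def read_with_schema(raw_lines):
    """Project each raw record onto the fields a consumer cares about."""
    for line in raw_lines:
        rec = json.loads(line)
        yield {"customer": rec["customer"],
               "amount": rec["amount"],
               "channel": rec.get("channel", "unknown")}

rows = list(read_with_schema(raw_store))
print(rows[1]["channel"])  # "unknown" -- schema gaps resolved at read time
```

The storage layer stays generic; only the readers carry a schema, which is what makes the lake cheap to grow.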

With information being available in a single place, banks can leverage association and predictive techniques on this data to generate insights about customer behavior, churn, and identify cross-selling opportunities.

To overcome the technical complexity of retrieving information from the data lake, the Hadoop ecosystem offers Pig and Hive. Hive provides an SQL-like interface to data stored in HDFS, while Pig provides a high-level platform for creating MapReduce programs that process data stored in HDFS.

Industry news: HSBC implemented a Hadoop-based data lake platform to support their ongoing and future regulatory needs, thus eliminating restrictions related to data availability.

3. The techniques described above process data in batches, but many banking functions require high throughput. To address this, the Apache community developed Cassandra - a fully scalable distributed database system with high throughput. Many companies have benefitted from successful deployments of Apache Cassandra, for example by identifying fraudulent transactions or determining suitable customer offers in real time.

Industry news: Real-time offers through online channels needed a high throughput database. Bank of America supports this high volume and high throughput data through Cassandra.


4. Big data is associated with two important capabilities - storing high data volumes and generating insights. It is important not only to store these petabytes of data but also to derive key business intelligence from them in real time.

Apache Mahout is a library of scalable machine-learning algorithms implemented on top of Apache Hadoop using the MapReduce paradigm. Banks can apply Mahout to the huge amount of customer information stored in HDFS to build a 360° view of each customer and provide need-based offers.

Apache Spark provides similar functionality in real time, as it runs in-memory across clusters. Spark analyzes data as it arrives to generate time-sensitive business intelligence - for example, identifying fraud based on outlier behavior patterns or providing location-based offers.
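The outlier logic behind such real-time fraud flags can be sketched in plain Python (this is not Spark code; the transaction history and the 3-standard-deviation threshold are illustrative assumptions):

```python
import statistics

# Illustrative sketch (hypothetical data): flag a transaction whose amount
# deviates from the customer's own history by more than 3 standard
# deviations -- the kind of rule a Spark job would evaluate per event.

history = [42.0, 55.0, 38.0, 61.0, 47.0, 52.0, 44.0, 58.0]

def is_outlier(amount, past, threshold=3.0):
    mean = statistics.mean(past)
    stdev = statistics.stdev(past)
    return abs(amount - mean) > threshold * stdev

print(is_outlier(50.0, history))    # False: a typical amount
print(is_outlier(5000.0, history))  # True: flagged for review
```

In a streaming deployment the same test runs per incoming transaction, with the per-customer statistics kept in memory, which is exactly where Spark's in-memory cluster model pays off.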

Industry news: Deutsche Bank has recently implemented Apache Spark to support its real-time data needs for fraud detection.

Can banks afford to ignore big data?

We are witnessing big data platforms mature rapidly to meet the demands of the financial industry. Tools are becoming less complex, reducing the learning curve and increasing the availability of skilled personnel.

As most of these tools become commercially available, this is an ideal time for banks to invest in big data and set up the right platforms. If not, they may have to play catch-up as other industries surge ahead with the knowledge and use of big data platforms.


Discover how Infosys can transform the way you do business>>


March 27, 2015

How to comply with new regulations such as fast pay-out? - A banker's dilemma

The new guidelines on fast pay-out issued by the UK's Financial Services Authority (FSA) are changing how banks and other deposit-taking financial institutions think about achieving a consolidated, comprehensive view of their customers and their activities, irrespective of touch point. These guidelines, and the consultative framework the FSA is building, will significantly speed up the processing of depositors' claims. The consultation prescribes a mandatory period of seven days within which claims must be processed and settled. An important element of the proposals is a clause requiring banks to furnish a Single Customer View (SCV), so that they can provide the aggregate balance held by each eligible depositor (FSA UK).
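At its simplest, the SCV aggregation is a roll-up of balances per depositor across product silos. The Python sketch below is illustrative only - the account records and field names are hypothetical, and a real SCV must first resolve the identity-matching problems discussed later:

```python
from collections import defaultdict

# Illustrative sketch (hypothetical data): the aggregation at the heart of a
# Single Customer View -- roll up balances held by the same depositor across
# product silos, so the total due under a fast pay-out is available quickly.

accounts = [
    {"depositor_id": "D001", "product": "current", "balance": 1200.0},
    {"depositor_id": "D001", "product": "savings", "balance": 8300.0},
    {"depositor_id": "D002", "product": "savings", "balance": 400.0},
]

def single_customer_view(rows):
    totals = defaultdict(float)
    for row in rows:
        totals[row["depositor_id"]] += row["balance"]
    return dict(totals)

print(single_customer_view(accounts))  # {'D001': 9500.0, 'D002': 400.0}
```

The arithmetic is trivial; the seven-day challenge lies in ensuring every account carries a reliable, de-duplicated depositor identifier in the first place.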


In its attempt to ease depositors' difficulties in getting their money back, and to reduce the risk of runs on banks, the FSA is asking banks a few fundamental questions, which can be summarised as:


a. How do banks store and retrieve all their customer information?
b. Are systems and applications that the banks have built over a period capable of extracting vast amounts of data attributes to create meaningful information?
c. Can banks and other financial organisations realistically establish that depositors A and B are the same person, or are interconnected through transactions?
d. How do banks manage their customer information particularly in the context of mergers and acquisitions?
e. Can the acquired and acquirer banks' systems be integrated in a way that enables a single view of their customers and activities?


Traditionally, banks have organized themselves into silos based on products/services or geographies. Product innovation has made it harder still to share customer information seamlessly between SBUs. In addition, disparate systems exist across different divisions of a bank, making it all the more difficult to extract information in real time and understand a depositor's exposure to the bank. Though banks have spent significant effort and money in recent years implementing robust CRM systems and applications such as KYC to meet internal and external compliance requirements, a comprehensive view providing deep knowledge of their customers remains far from reality. The key stumbling blocks to achieving a single view of customer data across products and service lines are the lack of an information bridge between business and technology architectures, and the difficulty of building a common symbology across source systems.


Historically, organisations have approached the solution by building large data warehouses: large-scale databases to load customer information, with data marts and data processing applications added as further layers to create meaningful reports and views of customers and their activities. However, issues such as duplication and re-creation of customer data, the effort involved, and the need to maintain structured and unstructured data with real-time updates have limited the benefits of these data warehouses.


To meet the FSA's deadline, banks now need to re-examine their entire IT landscape. Sooner or later, the IT management of these banks will have to take a deeper look at the multiple databases built over time to maintain and manage their customers across the globe. They need to be able to seamlessly distribute and redistribute information as, when, and where needed. To comply with the FSA, banks should initiate a few first but important steps.


Step 1: Build an enterprise-wide roadmap for master data quality: It is well known that large organizations hold multiple formats and versions of master data. A defined view of how customer-related information will be captured and maintained is the foundation stone. De-duplicating customer information and building a standardised format through which customer information can be acquired is critical to the intended strategy of building an SCV.
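The normalization step behind de-duplication can be sketched as follows. This is a deliberately naive illustration with hypothetical records - real customer matching uses far richer rules (fuzzy matching, address standardisation, reference data) than a lowercase name-and-postcode key:

```python
# Illustrative sketch (hypothetical records): reduce each customer record to
# a canonical key so duplicates from different source systems collapse into
# one master entry. The key here (sorted name letters + postcode) is a toy.

records = [
    {"name": "John A. Smith", "postcode": "EC1A 1BB", "source": "cards"},
    {"name": "SMITH, JOHN A", "postcode": "ec1a1bb",  "source": "mortgage"},
    {"name": "Jane Doe",      "postcode": "SW1A 2AA", "source": "savings"},
]

def canonical_key(rec):
    name = "".join(sorted(c for c in rec["name"].lower() if c.isalpha()))
    postcode = rec["postcode"].replace(" ", "").lower()
    return (name, postcode)

master = {}
for rec in records:
    master.setdefault(canonical_key(rec), rec)  # keep first occurrence

print(len(master))  # 2 -- the two John Smith variants collapse into one
```

The point is the shape of the process: normalise first, then match on the canonical form, so that every downstream system sees one golden record per customer.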


Step 2: Build an information architecture: Banking organisations have distinct business and technology architectures; the missing link has been a clear vision for a unified information architecture. Defining the process for building a common symbology to serve as a single source for cross-reference is critical. This will not only enable seamless updates of all downstream systems but also determine how information is received from upstream systems without manual data cleansing. Limiting manual intervention can significantly reduce the errors that typically occur during the creation of customer information.

 
Step 3: Define the view on solution choice: Data warehouses and SOA both provide ways to achieve a single view of customers. Depending on the number of source systems, data volumes, and integration complexities, an organisation needs a clear perspective on a solution that caters not only to current needs but also to business and customer growth in the foreseeable future. If the choice is to build a large data warehouse, it is important to understand how updates will also propagate to the other databases that store customer information.


In a nutshell, there is no silver bullet for these requirements. To choose the right tools and technologies for a single view of customers, organisations need to consider cost-effectiveness, flexibility, and analytical requirements. Building a single view of customers will benefit organisations well beyond regulatory compliance. Finally, it is deep understanding of the customer that differentiates organisations and propels competitive advantage.

 

March 19, 2015

Are The Clouds Over Core Moving?

Are FSIs still slow and hesitant in looking for core banking solutions on the cloud?

Operational risk is a major issue that inhibits companies from moving core banking to a cloud business model. It was once believed that FSIs would never move their core systems and applications to a public cloud infrastructure, or purchase core services under a public cloud, software-as-a-service (SaaS) model. Yet in emerging markets, IT adoption is following a familiar pattern: embracing new technologies and leapfrogging the legacy systems of developed economies.

Here are some success stories:
Microsoft and Temenos launched a pay-per-use core banking platform under a cloud-based delivery and pricing model, with 12 Mexican banks as its first customers. PNC Bank is modernizing its legacy core banking systems: it wants to rationalize and simplify its legacy core applications, reduce time to market (TTM) for innovative products and services vis-à-vis its peers, and prepare for hosted and cloud computing solutions.
It would be impossible for the banking industry to adopt cloud-based solutions without some common standards. Such standards would make the integration of different services to and from the cloud interoperable. Cloud solutions will open up multiple options, and big, enterprise-wide solutions will slowly become a thing of the past. (By cloud here, we mean private, internally hosted cloud services, not public cloud offerings like those from Google, Amazon and Rackspace.)

The Banking Industry Architecture Network (BIAN) is not going to help banks make or manage their private clouds or their SaaS applications. Yet BIAN could be the best catalyst to help the entire banking industry become cloud-ready.
There are still many hurdles to financial-services-specific SaaS deployments and services. Issues such as privacy and safeguarding business secrets, coupled with the larger problem of the non-availability of appropriate financial-services SaaS offerings, are preventing banks from taking the cloud adoption route. BIAN has multiple components to help banks create value in the SaaS space, and its idea of cloud-enabling the banking industry will become a reality once these hurdles are removed.

When looking to the cloud as a solution, banks and financial institutions place transparency, robust auditing controls and better data encryption mechanisms at the top of their expectations from cloud service providers. When we examine why banks are moving to the cloud, flexible infrastructure capacity and reduced provisioning time are the top objectives. Customer relationship management and application development are the services banks are most commonly moving to the cloud.

So, the clouds over core banking are clearing slowly, and banks need to gear up and get ready.






March 13, 2015

Overcoming the cloudy concerns: Recommendations for banks

-Anjani Kumar

In my previous blog, I had posited that banks should not let their concerns deter them from leveraging and reaping cloud computing's immense benefits. That said, banks need to adopt a structured approach towards their cloud implementation. In my view, there are four essential ingredients of a well thought-out cloud adoption approach:

1. Choose the cloud model judiciously: No single cloud model (public or private) can meet all of a bank's requirements. While choosing cloud models, banks should therefore consider the regulatory, security, cost-efficiency, operational-agility, and scalability aspects of each. In the initial implementation stages, banks can plan a federated ecosystem comprising a mix of cloud-based and on-premise applications. Such a federated ecosystem allows banks to implement myriad cloud models (private, public and hybrid) with flexible capacity for incremental adoption.

Depending on their business needs, banks can opt for a large-scale hybrid cloud model combining public and private cloud features, in which computing resources and capabilities are owned by both the bank and the cloud service provider. This lets banks reap the optimization benefits of the cloud while ensuring a high level of data confidentiality and security. The public cloud capabilities of the hybrid model can be used for general computing, while sensitive data and functions are enabled in the private cloud. Similarly, private cloud can be leveraged for core banking, and for cases where regulations prohibit processing and storing customer data outside the country. An example of private cloud adoption is Westpac New Zealand, which recently opted for IBM's private cloud technology to become the country's leading digital bank. The bank will migrate some of its business-critical applications to IBM's Auckland-based datacentre.

2. Avoid the big-bang approach: Banks should develop a business case for cloud adoption and take an evolutionary approach, with a mid- to long-term migration roadmap. Starting with small, less mission-critical legacy applications that have already been architected to meet external integration and security challenges is the way to go. The relative importance of the data, vis-à-vis regulatory requirements on data privacy and residency, should govern adoption prioritization. The migration strategy should also consider systems' integration (batch, real-time, etc.) and performance requirements. Business-domain-wise, lower-risk projects such as ECM, CRM, collaboration and workspace are good candidates to begin with. Payments and corporate banking functions such as credit risk simulations, payment settlement, corporate actions, etc. are also well suited to the cloud. In collaboration and workspace, cloud (public or hybrid) can be leveraged for back-office and horizontal processes such as email, internal collaboration, and knowledge sharing. UBS has leveraged Oracle's cloud-based Fusion HCM to support its HR function. Similarly, BBVA's entire workforce is enabled through a cloud email and collaboration suite (Google Apps). In content management, Barclays' private cloud-based service named "Cloud It" provides a document management system for customers to store their personal documents.

3. Focus on security: Banks should clearly understand and comply with cloud-related data confidentiality and regulatory requirements. For instance, regulators such as FINRA may want to audit the bank's cloud architecture. Depending on local regulatory needs, many banks may have to keep sensitive data (e.g., customer details) within firewalls and in a private cloud. Amazon Web Services has launched AWS GovCloud to let U.S. government agencies and contractors move sensitive workloads into the cloud while meeting their specific compliance and regulatory requirements. IT teams should thoroughly test all systems to be enabled on the cloud for data and application security, performance, regulatory, business continuity, disaster recovery and risk management aspects. Cloud security should integrate well with the bank's existing security processes and platforms. A secure, sophisticated and easy-to-use remote access management solution that supports all operating systems is desirable.

4. Engage in partnerships: Banks should engage a leading cloud solution provider to gain expertise and ensure compliance. Cloud service providers can also be engaged to educate regulators on cloud capabilities concerning data security, residency and privacy. Chosen providers should have a clearly defined strategy, demonstrable ROI and proven capabilities. Banks should get all key information from providers upfront, including the costs and other implications of migrating existing infrastructure and applications to the cloud. Banks should also examine providers' external security and audit certifications before engaging them, and closely scrutinize their performance on transaction volumes, reliability, availability and service quality. Stringent SLAs with guarantees and remedies / penalties should be enforced. Banks should have the service provider work with their risk, security, and legal teams to develop the cloud migration plan. Where multiple providers are engaged, it is important to ensure that applications and data can be moved across cloud environments as appropriate. A good example of a third-party cloud solution is Infosys' Cloud Ecosystem Hub, a first-of-its-kind solution that helps enterprises build and manage a unified hybrid cloud environment, and rapidly create, adopt, and govern cloud services across their ecosystem.

In your view, what are the other key aspects banks should consider during their cloud deployment? I am interested in knowing your views.




March 12, 2015

Core Banking Modernization

Core banking modernization - A myth or a reality? 

Modernization of core banking platforms has been discussed for decades, but is it a myth or a reality? Do banks need to undergo this change, and if so, how can they approach it successfully?

Why do banks need to modernize their core systems?

Whether small, medium or large, traditional or non-traditional, local, regional or global, all banks face competition. Their survival and success depend on balancing conflicting priorities: reducing costs, increasing revenues, and increasing capital. Added to this are the complexities of regulatory change, compliance, and competition from banks and non-banking entities. It ultimately comes down to how banks will grow deposits, control customer attrition, and roll out innovative customer-facing strategies.

  • Banks need flexible core banking systems to support these strategies - whether increasing the customer base, growing deposits and servicing them better, originating and servicing loans, launching products and services quickly, or assessing credit risk.
  • Ageing legacy systems are becoming costlier to maintain and difficult to support from technology, vendor-support and resourcing perspectives.
  • A lot of time and effort is needed to adapt these legacy systems to new functionality, open them up to new channels, and support new products and services quickly.

 

[Figure: core_bank.png (*non-exhaustive)]

Before banks take on this challenge of core transformation, they need to find answers to the following questions.

1. Will the core transformation reduce operational costs, and over what period?
2. Will the core platform change support the bank's future growth and vision, and if so, how?
3. How will this change create a better customer experience?
4. Will the change help banks get a better handle on managing risks and the changing regulatory environment?
5. What is the plan to skill the bank's employees to embrace and adapt to this change?

Banks need a business case that details the value they will derive from modernizing their core systems. This change is not solely a technology mandate; it needs to be a strategic business decision. Detailed plans require an analysis of people, process, and technology costs, considered in parallel with business value, business process improvements, and customer experience.

How can banks make this core transformation a smooth journey and come out successful?

  • Committed transformation with responsibility and accountability from the highest levels of the bank
  • Communication, governance, and stakeholder management
  • Transformation roadmap should be created by the business and technology teams together aligning with the bank's business vision and technology path
  • Professional program and project management, robust delivery capabilities backed by a scalable engagement model
  • Requirements and scope management through the change
  • Evaluation of current business process to decide on re-engineering to align with the new product
  • Understand the implementation methodology, impact to the bank and customers, and how they will be handled
  • Infrastructural support that leverages the latest architecture and modernized applications
  • Organizational change management with involvement of the top management across impacted lines of business 

What have banks who have already taken this bold decision said?

Zions Bancorporation (assets > US$50 billion) made this decision due to factors such as risk concerns and systems with limited support. It also wanted to achieve common standards, more centralization, and straight-through processing (STP) across the enterprise.

Deutsche Bank (assets > US$2,250 billion) made this decision to standardize processes, improve the flexibility of its IT infrastructure, and build modular functions through a service-oriented architecture approach, aimed at boosting efficiency and profitability and accelerating time to market for new products and services.

Commonwealth Bank of Australia (assets US$725 billion) envisioned building a customer-centric organization through this change, driving growth through simplicity. Its objectives were real-time banking, customer-centric processing, increased customer offerings, industrialization and multi-entity enablement. Customers experienced benefits right away: transactions visible quickly, accounts opened instantly, account switching done on the spot, simpler transaction descriptions, and increased sales interaction.



 

January 27, 2015

CRS - Establishing the new world order

This blog is co-authored by Jay Chandrakant Joshi (jaychandrakant_j@infosys.com)

 

Introduction to CRS
By now, most non-US financial institutions have put in place a framework for FATCA implementation, and, as per the implementation schedule, are well on their way to meeting reporting requirements for the first reporting period. At the same time, a new regulation called Common Reporting Standard (CRS) looms large on the horizon.

While a group of early-adopter countries has already endorsed its commitment to CRS (a joint statement can be found here), for most developed and developing nations, participation in CRS is a question of 'when' rather than 'if'. Countries like Australia, though not among the early adopters, have already declared their intention to participate, and by 2018 most countries are expected to do so.

 

CRS vs. FATCA
The Common Reporting Standard, also popularly known as the Global Account Tax Compliance Act (GATCA), is aimed at preventing global tax evasion. It aims to counter black money and tax evasion at a much larger scale than individual countries trying to fight their own battles. As a result, CRS promises to have a much wider appeal and a bigger incentive for participation than FATCA. While FATCA focuses solely on identifying US tax residents, CRS takes a more global perspective and, as a result, the requirements become more complex.

Broadly outlined, the requirements for CRS are:

  • Identify all customers who are residents of any foreign country for tax purposes (as opposed to identifying only US customers for FATCA)
  • Obtain relevant reporting information from customers (for instance, obtaining the date of birth for all individual customers and tax identification number for each country of tax residence)
  • Report data to respective tax authorities (as opposed to just the American IRS)
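To make these three requirements concrete, here is a minimal, purely illustrative sketch in Python; the customer record layout and every field name are assumptions for illustration, not any real compliance API:

```python
# Illustrative sketch of the three CRS requirements above; the record
# layout and field names are hypothetical, not a real compliance system.

def crs_reportable(customer):
    """Requirement 1: identify customers tax-resident in any foreign country."""
    return len(customer.get("tax_residences", [])) > 0

def validate_reporting_info(customer):
    """Requirement 2: check the CRS data points (date of birth, per-country TIN)."""
    issues = []
    if not customer.get("date_of_birth"):
        issues.append("missing date of birth")
    for residence in customer.get("tax_residences", []):
        if not residence.get("tin"):
            issues.append(f"missing TIN for {residence['country']}")
    return issues

def route_to_authorities(customer):
    """Requirement 3: group the reportable data by destination tax authority."""
    return {r["country"]: {"name": customer["name"], "tin": r["tin"]}
            for r in customer.get("tax_residences", [])}

customer = {"name": "A. Smith", "date_of_birth": "1980-01-01",
            "tax_residences": [{"country": "DE", "tin": "123"},
                               {"country": "FR", "tin": None}]}
print(crs_reportable(customer))           # True
print(validate_reporting_info(customer))  # ['missing TIN for FR']
```

Note how the reporting step fans out per country of tax residence, rather than targeting a single authority as under FATCA.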

Early analysis of FATCA-reportable client volumes in many financial institutions raises questions about the extent of technology investments and the choice between an automated or a manual approach. Based on minimal FATCA reporting volumes, many financial institutions have opted for a manual approach, treating FATCA as a standalone obligation without integrating it with existing processes. Others have opted for a full-fledged automated FATCA solution that integrates completely with current on-boarding and customer contact management processes. The debate about the merits of implementing a strategic solution as opposed to a tactical solution for FATCA will likely linger on for a while.

However, for CRS, this volume is expected to shoot up drastically. A comprehensive change will be required across the board, from on-boarding processes to backend processes.

 

Implementing CRS
In order to build a sustainable framework for CRS, the first step should be impact assessment. A high-level analysis of citizenships or addresses of customers will provide an early indication of projected volumes of CRS customers. It will also provide an indication of the maximum number of reporting countries a customer is likely to have. This information is vital in designing customer on-boarding forms as well as back-end IT infrastructure. Although there can be any number of reporting countries for a customer, it may prove practically infeasible to populate the number of reporting countries dynamically. It may thus be a good idea to fix the maximum number of reportable countries based on early data assessment, in consultation with the legal department.
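This kind of impact assessment amounts to a simple data-profiling pass over existing customer data. A hedged sketch, with an entirely hypothetical record layout:

```python
# Hypothetical impact-assessment sketch: estimate CRS-reportable volumes and
# the maximum number of reporting countries per customer from existing data.
# The record layout is an illustrative assumption.

from collections import Counter

customers = [
    {"id": 1, "tax_countries": ["DE"]},
    {"id": 2, "tax_countries": ["DE", "FR", "ES"]},
    {"id": 3, "tax_countries": []},   # domestic only, not CRS-reportable
    {"id": 4, "tax_countries": ["GB"]},
]

reportable = [c for c in customers if c["tax_countries"]]
max_countries = max(len(c["tax_countries"]) for c in reportable)
distribution = Counter(len(c["tax_countries"]) for c in reportable)

print(f"CRS-reportable customers: {len(reportable)} of {len(customers)}")
print(f"Maximum reporting countries observed: {max_countries}")
print(f"Distribution (countries -> customers): {dict(distribution)}")
```

The observed maximum (plus a safety margin agreed with the legal department) could then drive the fixed number of country slots on on-boarding forms and in the database schema.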

Once the impact assessment is done, the focus should then be on leveraging the existing FATCA and AML framework. For instance, the existing FATCA framework for the US can be extended to include other countries, and the AML framework can be extended to identify 'Beneficial Owners' and 'Date of Birth'. Wherever possible, the current FATCA rules should be extended for CRS and, all the while, the focus should be on building a flexible, scalable solution so that the financial institution is well placed to tackle any such future regulation. Financial institutions can choose to train existing operations teams to handle additional responsibilities for CRS or build a separate team to handle CRS and FATCA. Such decisions can be taken after conducting impact assessment studies.
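One way to picture extending FATCA rules for CRS is to parameterize the country in the indicia check instead of hardcoding the US. A minimal sketch, assuming illustrative field names rather than any real compliance API:

```python
# Illustrative sketch of extending a FATCA-style indicia check to CRS by
# parameterizing the country instead of hardcoding 'US'. Field and function
# names are assumptions for illustration.

def has_indicia(customer, country):
    """Return True if the customer shows tax-residence indicia for `country`."""
    return (country in customer.get("citizenships", [])
            or country in customer.get("address_countries", [])
            or country in customer.get("phone_countries", []))

def fatca_check(customer):
    # Legacy, US-only rule.
    return has_indicia(customer, "US")

def crs_check(customer, participating_countries):
    # CRS: the same rule applied across every participating jurisdiction.
    return sorted(c for c in participating_countries if has_indicia(customer, c))

customer = {"citizenships": ["FR"], "address_countries": ["DE"],
            "phone_countries": []}
print(fatca_check(customer))                    # False
print(crs_check(customer, {"DE", "FR", "GB"}))  # ['DE', 'FR']
```

The design point is that the single-country rule becomes a reusable building block, so adding a newly participating jurisdiction is configuration rather than code change.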

However, with increasing global focus on data reporting, the CRS implementation opportunity should definitely be used to strengthen existing data capture/reporting capabilities and upgrade legacy systems. This will place the financial institution in pole position to cater to all existing reporting obligations (like AML and CRS) and enable it to report data to multiple regulatory authorities while ensuring readiness in facing future obligations.

January 19, 2015

Banks - Don't Let Your Concerns Cloud Your Cloud Vision!

 - Anjani Kumar

Today, most banks endeavor to leverage IT for delivering bespoke, anytime, anywhere services and products underpinned by real-time analytical insights. Unsurprisingly then, at an average ~15% of total costs, banks' IT spending is the highest among all industries.

Understandably, banks are looking for impactful ways to bring down this cost.  Cloud Computing is a superlative means to achieve this goal, and can play a key role in helping banks transform their operating models. The Cloud enables secure deployment options, effective collaboration, new customer experiences, and shorter time-to-market for new launches. Cost savings and flexibility (through minimal capital investment and a pay-per-use billing model), enhanced business agility and continuity (through robust upkeep of the Cloud environment), and Green IT (by reducing both carbon footprint and energy consumption) are just some of the benefits of Cloud Computing.

It is estimated that by 2020, around 40% of the information in the digital universe will be Cloud-enabled. Further, research estimates that in about 18 months, more than 60% of banks worldwide will process high volumes of transactions over the Cloud.


Following are some of the functions that proactive banks have successfully migrated to the Cloud:

Mortgage/Lending Origination: Private / Community Cloud-based integrated, collaborative lending platform - enabling customers to apply for loans and complete processes electronically

Channels:
• Enhancement of customer relationships through consistent cross-channel experience
• Channel management (kiosk, ATM, online, mobile, call center) and content management using Private Cloud

Payments: Modernization and standardization of transaction processing

Micro Banking: Micro banking business execution

Analytics: Customer data integration across banking platforms for providing near real-time insights

Desktop Management: Centralized management of employees' desktops for greater remote management flexibility

Collaboration:
• Access to bank's systems for branch employees via a secure Cloud
• Enablement of all customer engagement dimensions

Enhanced Business Services: Enablement of third-party services to extend banking ecosystem

New Service R&D: Research and development of new services

 

In spite of Cloud Computing's huge potential, many banks, beset with the following concerns, are reluctant to leverage it:

Security: This remains a key concern. Many banks believe that the security and confidentiality of personal and commercial data is at risk in the Cloud.

Regulatory requirements: Many countries require banks to maintain their financial and customer data within national boundaries. Consequently, banks are concerned about the exact location of their data in the Cloud.

Operating control dilution and higher risk: Banks are concerned about an increase in operational risk, and its potential adverse impact on business and reputation should services over the Cloud be hampered.

Uncertainty over long-term cost impact: Banks are concerned that a major strategic decision, such as switching to a Cloud-based model, is not easily reversed. They worry about potential lock-in with the Cloud service provider, are unsure of how to bring services back in-house later if needed, and fear the huge cost and risk implications of doing so. They are also uncertain of the long-term cost implications of moving to the Cloud.

Concern over Cloud service providers: Banks are worried about a relative lack of standards for integrating Cloud service providers' services with their own servicing needs. They are also concerned that the solutions of many Cloud providers (across functional, operational, technical and commercial models) lack maturity. Many Cloud service providers lack proven credentials and a successful track record. Banks are also concerned that a major outage at the Cloud service provider can have huge adverse implications for them. Not even big Cloud service providers like Amazon Web Services have been totally immune to major outages.

That being said, success stories of banks' migration to the Cloud abound. Here are just two examples. Bankinter, the 6th largest bank in Spain, has slashed the time taken for credit risk simulation from 23 hours to just 20 minutes using the Amazon Cloud. Commonwealth Bank of Australia cut its costs by half by moving its storage to the Cloud, and also achieved huge cost savings in app development and testing.

So what should banks, unsure of how to go about Cloud migration, do to reap these business benefits? My next blog will provide actionable recommendations. Stay tuned...

December 16, 2014

Harnessing Big Data in Banking

- Anjani Kumar

Proactive banks understand that Big Data can be harnessed for risk-based, real-time pricing, unified customer view, product differentiation, compliance and risk management, fraud detection and prevention of false positives, product and service development, customer segmentation and targeting, customer retention and loyalty enhancement, cross-selling and optimized offers, and much more. 

Many banks have implemented Big Data solutions and are leveraging technologies like Hadoop for integrating heterogeneous reference data sources and distributing their data across the bank in real-time. Others are using Big Data solutions for globally integrating assorted banking solutions for better business decisions.
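As a toy illustration of the kind of reference-data consolidation such platforms perform at massive scale, consider merging one customer's records from several source systems into a unified view; the source layouts and keys here are purely illustrative assumptions:

```python
# Minimal sketch of integrating heterogeneous reference data into a single
# customer view - the consolidation a Hadoop-style pipeline performs at scale.
# Source layouts and keys are illustrative assumptions.

core_banking = {"C1": {"name": "A. Smith", "accounts": ["ACC-1"]}}
crm          = {"C1": {"email": "a.smith@example.com"}}
cards        = {"C1": {"cards": ["****1234"]}}

def unified_view(customer_id, *sources):
    """Merge the customer's records from every source into one dictionary."""
    view = {"customer_id": customer_id}
    for source in sources:
        view.update(source.get(customer_id, {}))
    return view

print(unified_view("C1", core_banking, crm, cards))
```

In production the same join runs across millions of keys and dozens of systems, which is precisely where distributed frameworks earn their keep.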

But away from the success stories, a large number of banks are still struggling with the why and how of Big Data, unable to capitalize on its opportunities.

Here are a few ideas on what banks should do to improve the effectiveness of their Big Data solutions:


1. Strategic Planning: Banks must include Big Data in overall strategic planning. Instead of focusing only on the internal data in a few business areas, they must consider Big Data holistically, including data enabling a single customer view, as well as that related to product, service, regulatory compliance and risk. It is also necessary to focus on external data - not just credit scores or market data feeds but also data from social and streaming media and more. In the early stages of Big Data implementation, banks could fully leverage in-house transactional data before turning to external data sources. Technology teams must identify and prioritize the areas of high business impact, such as customer-facing processes like sales generation and lead enhancement, to be targeted first. Banks must evaluate off-the-shelf solutions to see if they suffice or whether a deeper bespoke solution is needed. While Big Data implementation could start as a small standalone piece, it is important to integrate it with existing systems and applications. Maintaining balance between cost and function, and technical requirements and privacy considerations is crucial. As far as possible, only one copy of the data should be maintained to ensure reliability. 

2. Robust governance and operating model: The Big Data and Analytics operating model and governance policies should be clearly defined. It is important to define how analytics would be embedded into the business, and the roles and responsibilities of all concerned. An executive champion must be empowered to enforce data discipline and governance across the organization. Drivers, objectives and success metrics must be clearly defined. Senior leaders must help to clear stumbling blocks. Predictive modeling must be used to run 'what if' scenarios and their associated cost/benefit analyses. A Big Data and Analytics innovation lab could facilitate quick idea generation and experimentation.

3. Information architecture: Banks must evaluate the robustness of their information architecture to ensure it can support the increasing complexity, volume and velocity of data. Most banks have complex base information architecture, spanning numerous products across myriad lines of business, geographies and channels. The new information architecture should enable an integrated, detailed view of enterprise-wide data and relevant external data, and facilitate data consistency, accuracy and auditability; it must also be agile, flexible and extendable. Formal data controls and governance will protect data integrity. To attain enterprise-wide data integration, banks should define their enterprise data architecture and roadmap, enable cross-functional data integration projects, enforce measurement of data quality, and institute processes for addressing data quality issues.

4. Privacy and security: Banks must leverage Big Data approaches to supplement fraud and risk management systems to bolster security and privacy. This will improve customer confidence and experience and also enhance the transparency of security processes.
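As a simple illustration of the kind of statistical screen such fraud systems build on, here is a basic z-score outlier check on transaction amounts; the threshold and data are purely illustrative, not a production model:

```python
# Hedged sketch of a statistical fraud screen: flag transactions whose amount
# deviates strongly from a customer's history using a z-score. The threshold
# and sample data are illustrative, not a production model.

import statistics

def flag_anomalies(amounts, threshold=3.0):
    """Return indices of amounts more than `threshold` std devs from the mean."""
    mean = statistics.mean(amounts)
    stdev = statistics.stdev(amounts)
    return [i for i, a in enumerate(amounts)
            if stdev > 0 and abs(a - mean) / stdev > threshold]

history = [120, 95, 110, 130, 105, 98, 5000]  # one obvious outlier
print(flag_anomalies(history, threshold=2.0))
```

Real Big Data fraud platforms layer far richer features (merchant, geography, velocity) on top, but the principle of scoring each event against the customer's own history is the same.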

Besides implementing the above, banks can learn from success stories like the following:


IBM has been helping Mexico's Banorte Bank map out a new banking model, using Big Data, marketing automation and innovations in analytics to create more personalized interactions and keener insights into consumer banking behavior. The Bank has redesigned its systems to advise staff on the products best suited to individual customer needs.

In 2014, Bank of North Carolina showed the way for community banks by enhancing investment in Big Data visualization. Using SAS Visual Analytics, it standardized reporting, improved validation and report control, and also enhanced speed and usability. The Bank can now aggregate data better for portfolio reporting, as well as implement detailed reporting across personnel and business lines.

October 10, 2014

Emerging person to person payments business landscape

A WSJ report states that in the US, person-to-person payments total roughly $900 billion each year, including payments by cash, check, online, and mobile. Online and mobile money transfers are standard features of most banks and payment providers like PayPal. In addition, money transfer operators like MoneyGram and Western Union are major players in both domestic and international person-to-person payments. The incumbents - banks and money transfer operators - have well-established networks and systems in place to cater to a large global customer base. However, the person-to-person payments market is seeing more and more disruptive new players. Convenience and lower fees are the two important levers these new players use to take on the incumbents. The incumbents are under threat from three emerging groups of players in the person-to-person payments market.

Continue reading "Emerging person to person payments business landscape" »