The Infosys Labs research blog tracks trends in technology with a focus on applied research in Information and Communication Technology (ICT)


May 14, 2012

Infosys @ Nvidia GPU Technology Conference 2012

Hi There, 

I am super excited to tell you that I will be presenting some of the work that the High Performance Computing team @ Infosys has been doing using GPUs at the annual Nvidia GPU Technology Conference at the McEnery Convention Center in San Jose, CA. While the conference itself kicks off in a few hours from now, the Infosys talks are scheduled for the 16th, i.e. Wednesday.

The first talk is titled "Fast Adaptive Sampling Technique For Multi-Dimensional Integral Estimation Using GPUs". It is happening in Marriott Ballroom 3 at 2:30 PM.

The second talk is titled "GPU Based Stacking Sequence Optimization For Composite Skins Using GA". This talk is happening in Room K at 3 PM.

The subject of the first talk is an algorithm called VEGAS. VEGAS is a variance reduction technique that accelerates the convergence of Monte Carlo integration. The algorithm has wide applications, from computational finance to high-energy physics.
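To give a flavor of the idea (this is a toy one-dimensional sketch, not the implementation from the talk, and all parameter values are illustrative): VEGAS samples from an adaptive piecewise-uniform density and, after each iteration, redistributes the grid so that regions where the integrand is large get finer bins, which reduces the variance of the estimate.

```python
import random

def vegas_1d(f, a, b, bins=50, iters=5, samples=2000, seed=0):
    """Toy VEGAS-style adaptive importance sampling on [a, b].

    Keeps a piecewise-uniform sampling grid; after each iteration the
    bin edges are moved so each new bin carries roughly equal importance,
    concentrating samples where |f| contributes most.
    """
    rng = random.Random(seed)
    edges = [a + (b - a) * i / bins for i in range(bins + 1)]
    estimate = 0.0
    for _ in range(iters):
        weights = [0.0] * bins
        total = 0.0
        for _ in range(samples):
            k = rng.randrange(bins)            # pick a bin uniformly
            lo, hi = edges[k], edges[k + 1]
            x = rng.uniform(lo, hi)            # sample within that bin
            p = 1.0 / (bins * (hi - lo))       # density of this sampler
            fx = f(x)
            total += fx / p                    # unbiased: E[f(x)/p(x)] = integral
            weights[k] += abs(fx) * (hi - lo)  # importance carried by bin k
        estimate = total / samples
        # Rebuild the grid: place new edges at equal cumulative importance.
        cum = [0.0]
        for w in weights:
            cum.append(cum[-1] + w)
        new_edges = [a]
        for i in range(1, bins):
            target = cum[-1] * i / bins
            j = 0
            while cum[j + 1] < target:
                j += 1
            frac = (target - cum[j]) / ((cum[j + 1] - cum[j]) or 1.0)
            new_edges.append(edges[j] + frac * (edges[j + 1] - edges[j]))
        new_edges.append(b)
        edges = new_edges
    return estimate
```

For instance, `vegas_1d(lambda x: x * x, 0.0, 1.0)` should land close to the exact value 1/3.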

The subject of the second talk is a genetic algorithm that sits at the heart of aircraft wing manufacturing. Modern aircraft wings are built from composite materials. Sheets of these materials have to be overlaid on top of one another so that the wing's ability to sustain high in-flight stress is maximized, while at the same time minimizing violations of the constraints that dictate which orderings of the materials are admissible.
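To make that concrete, here is a purely illustrative sketch (not the optimizer from the talk; the fitness function, penalty weight and all parameters are made up): a genetic algorithm over ply orderings with a permutation encoding, a penalty for forbidden adjacent pairs, tournament selection, order crossover and swap mutation.

```python
import random

def fitness(seq, strength, forbidden):
    """Toy surrogate: outer plies weigh more, minus a penalty
    for every forbidden adjacent pair in the stacking sequence."""
    score = sum(strength[p] * (len(seq) - i) for i, p in enumerate(seq))
    penalty = sum(100 for a, b in zip(seq, seq[1:]) if (a, b) in forbidden)
    return score - penalty

def ga_stacking(strength, forbidden, pop_size=40, gens=100, seed=1):
    """Evolve a ply ordering that maximizes the surrogate fitness."""
    rng = random.Random(seed)
    n = len(strength)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            # Tournament selection of two parents.
            p1 = max(rng.sample(pop, 3), key=lambda s: fitness(s, strength, forbidden))
            p2 = max(rng.sample(pop, 3), key=lambda s: fitness(s, strength, forbidden))
            # Order crossover: keep a slice of p1, fill the rest in p2's order.
            i, j = sorted(rng.sample(range(n), 2))
            child = [None] * n
            child[i:j] = p1[i:j]
            rest = [g for g in p2 if g not in child]
            k = 0
            for idx in range(n):
                if child[idx] is None:
                    child[idx] = rest[k]
                    k += 1
            # Swap mutation.
            if rng.random() < 0.2:
                a, b = rng.sample(range(n), 2)
                child[a], child[b] = child[b], child[a]
            nxt.append(child)
        pop = nxt
    return max(pop, key=lambda s: fitness(s, strength, forbidden))
```

The permutation encoding guarantees every candidate is a valid reordering of the plies; the constraint handling here is a soft penalty, one of several standard choices.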

I will elaborate on these two talks in subsequent blog posts over the next couple of days.

If you are going to be at GTC, do try to attend these talks. I will be glad to meet you and tell you about the work we have been doing in the area of GPU computing in our labs, and I would be equally excited to hear about some of the coolest ways in which you are using GPUs. Otherwise, leave us a comment here on the blog; I will get back to you and we can engage in some geekery.


February 17, 2012

Is Parallel Computing Rocket Science or Esoteric? Part 1

My association with the field of High Performance Computing has been an intriguing journey of revelations, in which I have tried to understand the intricacies of a subject that has long stayed under the hood. It seems only recently that it has been rewarded with its much-awaited glory.

Here I make a humble attempt to bring you my understanding of this so-called "dark science", considered by many an esoteric craft. I bring you a three-part series describing the past, present and future of Parallel Computing through the eyes and experiences of a commoner.

You pour yourself a cup of hot coffee and settle into a chair, sipping it while reading your early-morning dose of news. A common daily routine for each of us, so what's so special? We seldom appreciate how much the trifles that surround us shape our broader picture of life. Coffee with the newspaper is a classic example of multi-tasking, of doing things in parallel. And this is exactly what we do in Parallel Computing, aka Parallel Processing: we go about doing, or trying to do, computation simultaneously.

From ancient artifacts, dating to perhaps 100 BC, that gave us tablets and abacuses capable of computing in parallel, to today's cutting-edge many-core parallel computer architectures, the journey has been intriguing and full of realization. Each milestone along the way involved borrowing something from simple, everyday life and turning it into a breakthrough in the technology world. Take, for example, Prof. Dave Patterson's laundry example, which outlines the principles of pipelining in parallel computer architecture. Which goes to say: we all know Parallel Processing, aka Parallel Computing; it's just that we never realized we were doing it.

Though the IBM 704, with Gene Amdahl as its principal architect, has been regarded as the first commercial breakthrough in building a machine with floating-point hardware in 1955, Wikipedia traces the true origins of Parallel Computing (aka MIMD parallelism) back to Luigi Federico, Conte Menabrea and his "Sketch of the Analytical Engine Invented by Charles Babbage" of 1842. This work by Menabrea can be regarded as the first treatise describing many aspects of computer architecture and programming.

To be continued...


July 1, 2011

Is online banking going to be easy for banks to manage?

Banking has changed a lot with the advent of Internet banking as a channel for reaching customers. Customers seek more convenience in accessing banking facilities and expect their banks to deliver value-added services through channels like Internet banking. With customers expecting more return on their investments, many banks offer pure Internet banking as a solution that gives customers higher earnings. The overall banking landscape is changing as banks opt to provide pure Internet banking facilities as part of their strategy to meet customer demands, alongside the regular Internet banking facility tied to their other channels: branches, mobile banking and phone banking. How can banks make Internet banking (pure and regular) more profitable and competitively attractive?


Not stopping there, retail banking over the net has changed with the mushrooming of online banking applications that help customers manage multiple bank accounts, aggregate accounts, manage loans, categorize transactions, plan spending and earning, analyze financial patterns, switch banks and, in a nutshell, manage their finances effectively. A few well-known online banking applications include Xero, Wesabe, Mint, Money Manager, Expensr, and Smarty Pig. How can banks leverage these online applications to reach customers effectively?


The answer to these questions lies in how banking firms bank upon these online applications for online accounting and financial management services that add value to customers; in short, in how to leverage cloud computing for successful Internet banking. Internet banking is no longer simply funds transfer, bill payment and other vanilla services, yet building an all-encompassing Internet banking application of their own is out of scope for banks. This makes it necessary for banks to consume the services of online applications through cloud computing.


To stay competitively ahead in online banking, it will become essential to understand two things: 1) how to classify the available cloud-based online banking applications and leverage them to provide customer-focused functionality, and 2) whether a 'composite' banking application service provider is feasible, one that delivers basic banking functionality as well as value-added functionality through the cloud and becomes the Internet banking face for all competing banks.


Banking is going to be easy for customers, but not for banks, service providers and application developers, unless they leverage technology and adapt to become superior value providers!

March 31, 2011

An Introduction: The Markov Model

In the field of computing, 'pattern recognition' is a multi-faceted instrument for achieving distinct goals. Be it gesture, speech and handwriting recognition, statistical software testing or software reliability prediction, it has consistently proved to be the most concrete method for empowering the field of Artificial Intelligence.

The Markov model is one of the most widely used techniques for pattern recognition. Its core principle, "the future is independent of the past, given the present", captures its basic character. A Markov model of a system describes the relationships among all possible states of that system. These relationships are depicted through transition paths, the rates of transition between states, and the probability of the system being in each state. With this relational model, the next state of the system can be predicted.

For example, if a system XYZ can be in either an ON or an OFF state, the Markov model will be:

[Figure: markov.png, the ON/OFF transition diagram.] Here λ denotes the rate of transition from the ON state to the OFF state. The probabilities of the system being in the ON and OFF states can be denoted P_ON and P_OFF respectively, and these probabilities drive the prediction of future states.
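As a small worked sketch of this two-state model (note: the post only names λ; the return rate μ from OFF to ON, and the numeric values used below, are assumptions added for illustration), the steady-state probabilities follow from the balance condition λ·P_ON = μ·P_OFF together with P_ON + P_OFF = 1:

```python
def two_state_steady_state(lam, mu):
    """Steady-state probabilities of a two-state Markov model.

    lam: transition rate ON -> OFF (the post's lambda)
    mu:  transition rate OFF -> ON (assumed for illustration)

    In steady state the probability flow balances, lam * p_on == mu * p_off,
    and the probabilities sum to one, giving the closed-form solution below.
    """
    p_on = mu / (lam + mu)
    p_off = lam / (lam + mu)
    return p_on, p_off
```

For example, with lam = 0.2 and mu = 0.8 the system is predicted to spend 80% of the time in the ON state.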

The Markov model has already proven its utility in gesture recognition, speech recognition, handwriting recognition, statistical software testing, system reliability prediction, and numerous other applications. It is thus a promising modeling technique for enhancing existing technologies.

With the increasing complexity of systems, and the inability of the plain Markov model to capture all their aspects, more sophisticated and exhaustive models such as the Hidden Markov model, the Hierarchical Markov model and the Logical Markov model have been created. This clearly reflects the extensible nature of the Markov model.

December 23, 2010

A Step towards Cloud Interoperability: Open Data Center Alliance

With the technological revolution brought about by Cloud Computing, the race to become its leader has given birth to some new problems.

The most immediate problem is the lack of "cloud interoperability", i.e. the ability to use the same resources across a variety of clouds. Every cloud service provider has come up with its own unique way of handling inter-cloud application interaction and user interaction. This practice of creating and using isolated, divergent APIs has become a constraint for users, as it limits their choice of clouds, vendors and tools. Interconnectivity between different clouds thus becomes difficult. Moreover, it defeats the concepts of portability and integration, and will surely lead to the fragmentation of clouds.

To overcome this hurdle, some 70 major IT user groups (such as Lockheed Martin, BMW, China Life, Deutsche Bank, JPMorgan Chase, Marriott, the National Australia Bank, Shell and UBS), whose total IT spending exceeds $50 billion, have joined hands under the name "Open Data Center Alliance".

As per the official website: "The Open Data Center Alliance is an independent consortium comprised of leading global IT managers who have come together to resolve key IT challenges and fulfill cloud infrastructure needs into the future by creating an open, vendor-agnostic Usage Model Roadmap."

So the Open Data Center Alliance will bargain collectively to ensure interoperability across core network and cloud technologies. And we can expect a bright and INTEGRATED future for the clouds.

November 26, 2010

Extend Spring Security to Protect Multi-tenant SaaS Applications

Spring Security, the open source security framework maintained by SpringSource, is widely used by enterprises to address the security requirements of enterprise applications. It provides a comprehensive security solution that supports a wide range of authentication mechanisms, including HTTP authentication and authentication based on forms, LDAP, JAAS, OpenID, etc.

However, Spring Security currently does not provide out-of-the-box features to address the security requirements of SaaS applications. In this article, we present a solution to extend the JDBC- and LDAP-based implementations of Spring Security to address the multi-tenant security requirements of SaaS applications.
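The core idea behind such an extension is to scope every credential lookup by a tenant discriminator, so the same username can exist independently in different tenants. The following language-neutral sketch (in Python, with an entirely hypothetical in-memory store; it is not Spring's API) illustrates just that idea:

```python
import hashlib
import hmac

# Hypothetical in-memory user store keyed by (tenant_id, username),
# mirroring the idea of adding a tenant discriminator column to the
# users table that a JDBC-backed authentication provider would query.
USERS = {
    ("tenantA", "alice"): hashlib.sha256(b"s3cret").hexdigest(),
    ("tenantB", "alice"): hashlib.sha256(b"other").hexdigest(),
}

def authenticate(tenant_id, username, password):
    """Authenticate a user within a tenant's namespace.

    The same username may exist in several tenants; scoping the lookup
    by tenant_id is what makes the credential store multi-tenant.
    """
    stored = USERS.get((tenant_id, username))
    if stored is None:
        return False
    supplied = hashlib.sha256(password.encode()).hexdigest()
    # Constant-time comparison to avoid leaking information via timing.
    return hmac.compare_digest(stored, supplied)
```

In a real Spring Security extension the tenant id would typically be derived from the request (subdomain, header or login form field) before the user lookup runs; the sketch above only shows the tenant-scoped lookup itself.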

Cloud Application Migration

The business value of a cloud model does not require any special emphasis to enterprises. Infrastructure elasticity promises the flexibility that allows businesses to penetrate new and emerging markets without the risk of significant capital expenditure. The free cash flow thus made available allows businesses to increase spending on R&D or other strategic initiatives. For small and medium businesses, Cloud Computing offers an ecosystem that allows them to co-exist, if not compete, with large businesses. For large businesses, it's a natural progression of how IT could better optimize its data centers and deliver more value for the business.

What are the real challenges that make organizations take a cautious, wait-and-watch approach to cloud adoption? The reasons are many. For one, concerns about security and regulatory compliance overshadow the other potential ways in which one could benefit from the cloud. Uncovering such possibilities is necessary for businesses to evaluate the potential of cloud computing. Put another way, we will explore the questions most business leaders ponder: Is cloud computing the logical next step for me to successfully execute my business strategy? If so, what should my cloud strategy be? Which applications are best suited to run on the cloud?

These are the questions we discuss, attempt to answer, and where required, make suitable recommendations in this article written by Ashok and me.


November 23, 2009

Business Intelligence and Analytics - Challenges and Trends

In this post, I plan to address the business issues and customer challenges in today's dynamic environment of Business Intelligence and Analytics. I will also cover the trends that are emerging and reshaping the way corporations and organizations do their business analytics. As a consequence of the recession and downturn of 2009, the industry has seen that the differentiators between organizations will be the strategies they adopt, especially along the lines of alignment between IT and business.


August 12, 2009

DMaaS on the Cloud: Requirement

Cloud computing is an emerging computing paradigm for deploying cost-effective solutions over the Internet. Companies such as Google, IBM, Amazon, Yahoo and Intel have already started providing computing infrastructure for this purpose. Organizations, in turn, have started porting their data and applications to the cloud in order to reap the benefits of this new paradigm. This necessitates a well-defined Data Migration as a Service (DMaaS) offering over the cloud. Moreover, organizations have to rely on the cloud to handle security and privacy issues for their data assets. Some of these concerns can be addressed by building a private cloud; and given a cloud holding a certain amount of data, we need solutions to migrate that data to another cloud.

Thus, various scenarios can be visualized for migrating data within and around the cloud: (i) enterprise-to-cloud data migration and vice versa, and (ii) inter-cloud data migration.

Please share your views on DMaaS.
