How Can Companies Realize Business Intelligence ROI Faster?
If you ask any group of business executives whether they need analytics to make better business decisions, they will be nearly unanimous: yes. Ask the same executives whether the actual ROI from large business intelligence initiatives meets or exceeds the hype, however, and you will usually get a very different answer. Unfortunately, many business users view business intelligence as a "money pit" they can't afford not to invest in...
Obviously, there can be many reasons why a business intelligence program either fails or isn't perceived to be as successful as originally promised, just like any other IT-enabled business transformation initiative: poor project management, bad requirements, weak scope control, inaccurate linkage of requirements to business value, and so on. Above and beyond these usual causes, however, many business intelligence projects fail for reasons specific to the discipline itself. To understand this point, it helps to understand the approach most IT organizations take when building a business intelligence solution.
Following approval and funding to move forward, IT organizations typically spend an inordinate amount of time designing and building the application logic that integrates data from multiple sources into the data warehouse (the ETL, or 'Extract, Transform, Load' layer). Unfortunately, until this phase is complete and the data is consolidated, only minimal business value can be realized, because the most powerful analytics are frequently those that require data integrated from a multitude of sources. So in essence, the traditional approach mandates that IT goes away for 6-12 months (or sometimes longer) to build the ETL; only then can reports and visualizations add value.
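To make the pattern concrete, here is a toy sketch of the traditional ETL flow described above. The source names, schemas, and helper functions are all hypothetical illustrations, not any vendor's API: rows are extracted from two imagined systems (a CRM and a billing system), transformed into one consolidated schema, and loaded into a "warehouse" (a plain list here).

```python
# Toy ETL sketch: extract from two hypothetical sources, transform into a
# single schema, and load into a consolidated "warehouse" table.

def extract_crm():
    # Hypothetical CRM feed: customer id and name.
    return [{"cust_id": 1, "name": "Acme"}, {"cust_id": 2, "name": "Globex"}]

def extract_billing():
    # Hypothetical billing feed: customer id and total spend.
    return [{"cust_id": 1, "spend": 1200.0}, {"cust_id": 2, "spend": 450.0}]

def transform(crm_rows, billing_rows):
    # Integrate the two feeds on cust_id into one warehouse schema.
    spend_by_id = {r["cust_id"]: r["spend"] for r in billing_rows}
    return [
        {"cust_id": r["cust_id"], "name": r["name"],
         "total_spend": spend_by_id.get(r["cust_id"], 0.0)}
        for r in crm_rows
    ]

def load(warehouse, rows):
    # Append the transformed rows to the warehouse table.
    warehouse.extend(rows)
    return warehouse

warehouse = load([], transform(extract_crm(), extract_billing()))
```

The point of the sketch is that no report on combined customer spend is possible until the full extract-transform-load pipeline exists, which is exactly why business value is delayed until this phase completes.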
Why is this bad? First, because business conditions and priorities change in a flash in today's global marketplace, forcing companies to change strategic direction much faster than ever before. Transitioning from 'product-focused' to 'customer-focused', adapting to technology innovation, expanding across a value chain, or completing an acquisition radically changes the kinds of analytics a company needs for better business decision-making. Second, even without changes in business strategy, changes in management and shifting org structures introduce new decision-makers into the project, and new decision-makers frequently have different priorities than their predecessors.
So in essence, just like in many sports, 'speed kills'; or perhaps more appropriately in this case, 'the lack of speed kills'. To solve this problem, many advocate an iterative approach such as an Agile methodology. At Infosys, we agree that an iterative methodology is part of the solution, but it cannot be the entire solution; this is not a problem that can be solved by methodology alone. This is where 'data virtualization' comes into play.
Leading vendors such as Informatica, IBM, Composite, and Microsoft have built very robust data virtualization platforms. If you work in the business intelligence space and aren't familiar with data virtualization, you really should look into these types of tools. While they do not eliminate the need for traditional approaches, our experience is that the amount of ETL can frequently be reduced by 50% or more. For example, we helped a major financial services client cut the development effort for data integration (ETL) roughly in half versus traditional methods.
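To illustrate the underlying idea (this is a toy sketch, not the API of any of the vendors named above), data virtualization exposes a "virtual view" that federates the underlying sources at query time, rather than physically copying everything into a warehouse first. The sources and view below are hypothetical:

```python
# Toy data-virtualization sketch: a virtual view joins live sources on
# demand, so analytics can start before any warehouse consolidation exists.

def crm_source():
    # Hypothetical live CRM source.
    return [{"cust_id": 1, "name": "Acme"}, {"cust_id": 2, "name": "Globex"}]

def billing_source():
    # Hypothetical live billing source.
    return [{"cust_id": 1, "spend": 1200.0}, {"cust_id": 2, "spend": 450.0}]

def virtual_customer_view(predicate=lambda row: True):
    # Federate the sources at query time; nothing is persisted, so the
    # view always reflects the current state of each source.
    spend = {r["cust_id"]: r["spend"] for r in billing_source()}
    for r in crm_source():
        row = {**r, "total_spend": spend.get(r["cust_id"], 0.0)}
        if predicate(row):
            yield row

big_spenders = list(virtual_customer_view(lambda r: r["total_spend"] > 500))
```

The design trade-off is the one noted above: queries hit the sources directly, which is why virtualization complements rather than replaces traditional ETL, but it lets the business see integrated results months earlier.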
The end result is that the length of time required to deliver business value is dramatically reduced, giving business and IT teams more time to deliver cutting-edge data visualizations - which is where the value gets realized!
What do you think? We'd love to hear your comments - good or bad!