

How will in-memory solutions revolutionize the way BI solutions are designed and built?

Guest Post by

Nitin Vilas Kulkarni, Principal Consultant, Retail, CPG, Logistics and Life Sciences, Infosys

With memory prices coming down steeply, in-memory solutions are becoming more affordable. Gartner expects that by 2014, 30% of analytic applications will use in-memory functions to add scale and computational speed. We are already witnessing solutions and appliances like SAP HANA generating significant interest. As in-memory solutions become mainstream, I think we need to revisit the fundamentals of data warehouse design and development to leverage the power of these solutions.

Today, most data warehouses are built using a layered, scalable architecture with distinct layers for staging, integration and reporting. There is a lot of data duplication in the reporting layer because the layers are optimized for reporting performance. I think the usage, design and development of data warehouses will change in the following areas. Here is my top 10 list (in no particular order); a few illustrative sketches follow the list:

  1. Embedded analytics enablement - In-memory solutions will allow fast analysis of large data sets and thus support embedded analytics within business processes for better decision support. Warehouse designs will need to accommodate this to take full advantage of the technological change.
  2. Data integration at run time from multiple systems - Most data warehouses require extracting data and storing it in the warehouse before reports are built. While tools such as Business Objects offer functionality for combining data from multiple systems in the semantic layer, real-life usage has been limited by performance challenges. In-memory technology will support this design better and reduce the physical movement of data across systems (see the federation sketch after this list).
  3. Creation of standard predictive and statistical models - While most companies have wanted to build predictive and statistical models for data mining, implementation was constrained by computational power. In-memory technology combined with parallel processing will make such models mainstream (see the regression sketch after this list).
  4. Tightly integrated consolidation and simulation applications - Planning and consolidation applications have traditionally been built separately, outside the data warehouse. In-memory solutions will let you build these applications in the warehouse itself, with calculation engines running in the same system. This will reduce latency and data redundancy.
  5. More flexible hierarchies and master data - User-defined hierarchies and master data can be supported better. With the power of in-memory solutions, results need not be pre-aggregated, so such designs become practical (see the hierarchy sketch after this list).
  6. Elimination of layers - The layered architecture that leads to data redundancy can largely be eliminated, with reporting views computed on the fly (see the first sketch after this list).
  7. More scale leading to a mindset change about data - Most data warehouses were built with data volume limitations in mind. Those limitations are going away: you can now retain data longer and bring in external and unstructured data for analytical needs. Functions such as text analysis and pattern recognition can be supported, and designs will incorporate them (see the text-analysis sketch after this list).
  8. Consolidation of data warehouses - Many organizations adopted a strategy of building multiple landscapes (regional, global) to overcome data volume challenges and address performance issues. Such warehouses can now be combined, thanks to in-memory data compression.
  9. Self-service oriented design - The design must enable self-service for the business. IT must make the data available and let business users take control of analytics. While this concept is not new, in-memory solutions will deliver good performance without the IT department having to build architected data marts.
  10. Iterative development methodology - In-memory solutions will make iterative development practical, since models can be built and changed easily. The business will be able to 'experiment' on the data, so an iterative development methodology will yield the best results.
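
To make the layering point (and item 6) concrete, here is a minimal sketch in Python with pandas. The data and column names are made up for illustration; the idea is that an in-memory engine can compute a reporting view on demand instead of persisting an aggregated copy of the staging data.

```python
import pandas as pd

# "Staging" layer: raw transactional records loaded once (illustrative data).
staging = pd.DataFrame({
    "region":  ["EMEA", "EMEA", "APAC", "APAC"],
    "product": ["A", "B", "A", "B"],
    "revenue": [120.0, 80.0, 200.0, 150.0],
})

# Classic layered design: persist an aggregated copy for reporting speed.
reporting_copy = staging.groupby("region", as_index=False)["revenue"].sum()

# In-memory design: the same result is cheap to compute at query time,
# so no redundant physical reporting layer is needed.
def reporting_view(by):
    return staging.groupby(by, as_index=False)["revenue"].sum()

print(reporting_view("region"))   # matches the materialized copy
print(reporting_view("product"))  # a new slice with no new ETL
```

The second call shows the payoff: a new view of the data requires no new ETL job or redundant table.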
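The federation sketch for item 2, again with hypothetical sources and keys: records from two "systems" are joined in memory at query time rather than being extracted into the warehouse first.

```python
import pandas as pd

# Pretend these arrive from two live systems, e.g. an ERP and a CRM.
erp_orders = pd.DataFrame({"customer_id": [1, 2, 1],
                           "order_value": [500.0, 300.0, 250.0]})
crm_accounts = pd.DataFrame({"customer_id": [1, 2],
                             "segment": ["Gold", "Silver"]})

# The semantic-layer join happens on demand, not via batch extract-and-load.
combined = erp_orders.merge(crm_accounts, on="customer_id")
print(combined.groupby("segment")["order_value"].sum())
```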
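The regression sketch for item 3: a standard predictive model fitted directly over (made-up) warehouse data with an ordinary least-squares solve. In-memory computation makes this kind of model cheap enough to run routinely.

```python
import numpy as np

# Hypothetical history: weekly promotion spend vs. units sold.
spend = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
units = np.array([12.0, 19.0, 33.0, 41.0, 50.0])

# Fit units ~ a * spend + b via least squares.
X = np.column_stack([spend, np.ones_like(spend)])
(a, b), *_ = np.linalg.lstsq(X, units, rcond=None)

print(f"forecast at spend=6: {a * 6 + b:.1f} units")
```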
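The hierarchy sketch for item 5: a user-defined grouping (a hypothetical mapping) is resolved at query time straight from the detail rows, with no stored pre-aggregates.

```python
import pandas as pd

sales = pd.DataFrame({
    "store":  ["S1", "S2", "S3", "S4"],
    "amount": [10.0, 20.0, 30.0, 40.0],
})

# A user redefines the grouping ad hoc, e.g. pilot vs. control stores.
user_hierarchy = {"S1": "pilot", "S3": "pilot", "S2": "control", "S4": "control"}

# The aggregate is computed directly from detail rows in memory.
print(sales.groupby(sales["store"].map(user_hierarchy))["amount"].sum())
```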
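Finally, the text-analysis sketch for item 7: a toy keyword count over made-up feedback records stands in for the unstructured-data analysis an in-memory engine could run at scale.

```python
import re
from collections import Counter

comments = [
    "Delivery was late and the package was damaged",
    "Great product, fast delivery",
    "Late delivery again, very disappointed",
]

# Tokenize and count terms across all feedback records.
tokens = Counter(
    word for text in comments for word in re.findall(r"[a-z]+", text.lower())
)
print(tokens.most_common(3))  # recurring themes, e.g. "delivery", "late"
```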

About the Author: Nitin is a Principal Consultant in the Retail, CPG, Logistics and Life Sciences unit at Infosys. His areas of expertise include Business Intelligence, with a special focus on in-memory solutions and BI architecture.

 
