
Is Traditional Data Management Still Relevant?

Noah Consulting blog series: the "When Oil & Gas Prices Recover" storyline


Data management plays a key role in helping organizations create value from their data assets by enabling advanced analytics, reporting and traditional transactions. With lower oil and gas prices putting significant pressure on operators and oilfield service companies, gaining new insight from data to improve efficiency is no longer a luxury. But this comes at a cost: as the "information intensity" of oil and gas operations increases, new demands are placed on the data management infrastructure, creating a data dilemma.

Historically, oil & gas exploration and production companies have been organized in disciplinary and operational silos aligned with the asset life-cycle. As a project "moves" through its life-cycle, it is transferred from department to department, and it is often assumed that the associated data moves with it - seems logical. These departments (exploration & appraisal, drilling & completions, production, etc.), their associated sub-groups (geophysics, reservoir management, facilities, production engineering, operations and maintenance) and the corporate departmental silos (finance, procurement, land/lease, etc.) result in a virtual duck soup!

Each group has its own set of tools - a fit-for-purpose application portfolio. Corporate departments tend towards a single large platform (e.g., SAP for ERP), but the other departments use diverse applications from a variety of vendors, making data integration a challenge. While the digital oilfield encompasses a great deal more data, actionable insights into specific elements of the value chain are more easily gained inside a silo than across disciplines.

The ultimate goal should be to create an asset-centric focus by integrating silos of functional and operational information. When operators begin tying work processes together to take advantage of the information derived at each stage, they gain a holistic perspective on which to base larger efficiency initiatives and improve long-term profitability.

As the industry moves towards this information integration, it is also adding unprecedented volume: some producers are already collecting up to 1 terabyte of data per well, per day. Large organizations have data collections that add up to hundreds of terabytes or even tens of petabytes. Even organizations that collect smaller amounts are struggling to analyze much of the data they gather.
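To put that volume in perspective, here is a quick back-of-envelope calculation; the well count and retention period are illustrative assumptions, and only the 1 terabyte per well, per day figure comes from the paragraph above:

```python
# Back-of-envelope storage estimate. Only the 1 TB/well/day figure is
# from the text; the well count and retention window are hypothetical.
TB_PER_WELL_PER_DAY = 1.0
wells = 100    # hypothetical number of heavily instrumented wells
days = 365     # one year of retention

total_tb = TB_PER_WELL_PER_DAY * wells * days
print(f"{total_tb:,.0f} TB is about {total_tb / 1024:.1f} PB per year")
# -> 36,500 TB is about 35.6 PB per year
```

Even at a fraction of that instrumentation, the totals reach petabyte scale quickly - exactly the scale described above.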

IT departments normally have mature processes for managing data which, at a minimum, include data retention and security policies for protecting the most critical data. Other elements of a comprehensive plan include defined systems of record for each function (a home for most data), a master data management system holding the "gold version" of each critical information object, ETL routines for transferring data from source systems to the enterprise data warehouse, and standard reporting tools for routine reports. Additionally, there should be separate data management solutions for documents, transactions, models and sensor data.
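As a minimal sketch of the ETL and mastering steps just described - the table names, columns and "latest update wins" survivorship rule are hypothetical stand-ins, not any particular vendor's design - the flow from a source system to a mastered "gold version" might look like:

```python
import sqlite3

# In-memory databases stand in for a source application and the warehouse.
src = sqlite3.connect(":memory:")
dwh = sqlite3.connect(":memory:")

src.execute("CREATE TABLE wells (well_id TEXT, name TEXT, updated_at TEXT)")
src.executemany("INSERT INTO wells VALUES (?, ?, ?)", [
    ("W-001", "Eagle Ford 1",  "2015-08-01"),
    ("W-001", "Eagle Ford 1A", "2015-09-15"),  # later correction
])

dwh.execute("""CREATE TABLE mdm_wells
               (well_id TEXT PRIMARY KEY, name TEXT, updated_at TEXT)""")

# Extract from the system of record, then merge into the master table,
# keeping the most recently updated version as the "gold record".
rows = src.execute("SELECT well_id, name, updated_at FROM wells").fetchall()
dwh.executemany("""
    INSERT INTO mdm_wells VALUES (?, ?, ?)
    ON CONFLICT(well_id) DO UPDATE SET
        name = excluded.name, updated_at = excluded.updated_at
    WHERE excluded.updated_at > mdm_wells.updated_at
""", rows)

print(dwh.execute("SELECT * FROM mdm_wells").fetchall())
# -> [('W-001', 'Eagle Ford 1A', '2015-09-15')]
```

In practice these steps run in dedicated integration tooling, but the shape - extract, stage, merge to a gold record - is the same.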

This environment was built to solve the business challenges of yesterday, but how is it coping with today's world? With the Big Data three Vs (volume, variety and velocity) changing the playing field, can IT respond to higher expectations with a traditional architecture? How fast can internal IT respond to data loading, data mastering, data quality checks and data modeling when the volume is rising, the variety is growing, the pace is accelerating and trusted data, integrated into a more holistic view, is now the minimum requirement?

Sadly, I think traditional solutions are coming up short. That is not to say they don't still have an important role to play, but even as internal IT services improve, the gap is growing as the three Vs become a powerful force in business. Many of the new demands are coming from operations - traditionally not a large consumer of IT resources - as a result of the integration of sensors and the industrial Internet of Things (IoT). When IT cannot keep up, it is not invited to the planning meetings where important business projects are discussed, and it is even left out of vendor presentations for new technology alternatives.

The business requirements that drove the traditional solutions are still there, especially from the traditional drivers of IT demand: finance, HR and procurement. However, if a business reaches the point where it has given up on the expectation that IT can move at its accelerated pace, there are consequences - not just for IT but for the success of data initiatives. Despite the perceived slow pace of internal IT, it does maintain important controls for data quality, data governance, information protection and master (and meta) data management. Leave IT behind and, most likely, these processes are also left behind. Does data quality no longer matter when you have enough statistics to compensate? In fact, for some activities relevant patterns can be identified even with poor-quality outliers in your data, but for others your data needs to be right in order for you to make the correct business decisions.
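To make that last distinction concrete - the sensor readings below are invented - a robust statistic such as the median can still reveal the underlying pattern when poor-quality outliers are present, while anything that depends on the raw mean is badly skewed:

```python
from statistics import mean, median

# Hypothetical hourly flow-rate readings; -999 is a typical
# "bad reading" sentinel that was never cleaned out.
readings = [512, 508, 515, -999, 510, 509, -999, 514]

print(f"mean:   {mean(readings):6.1f}")    # 133.8 - wrecked by the outliers
print(f"median: {median(readings):6.1f}")  # 509.5 - the real level survives
```

The pattern (a steady rate around 510) survives the bad data, but any decision that needs the true average, a count or a sum requires cleansing first.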

In a perfect world, the right way to solve this dilemma would be for IT to speed up, businesses to listen to IT's concerns, vendors to partner and consultants to facilitate alliances. But will the urgent bypass the important? Will the temptation to jump outweigh the need for caution? Will the marketing hype leap over the proven?

There is no substitute for good data management, and it is everyone's job to manage data. That doesn't mean everyone is a DBA or a programmer, but an effective governance process assigns roles and responsibilities, and the management structure rewards each employee for doing their job. Analysts (and almost everyone is an analyst of some sort) will be able to trust, access and use standard sets of tools to perform their jobs accurately and efficiently. And where possible, companies should establish a center of analytics excellence where data scientists fish in their data lakes for innovative insights, supported by an architecture designed to keep the hybrid ecosystem performing.

So, to answer the question "Is traditional data management still relevant?" our answer is yes, but... Yes, the basic premise of good data management practices has not changed. Data governance, data quality management, and meta and master data management processes are as relevant in the new world as in the old. But as changing business requirements present new challenges to the traditional data management team, emerging technologies are opening up opportunities to manage different types of data and greater volumes of data, and to bring near-real-time operational data to engineers' and analysts' desktops. Data management organizations cannot stand still: bringing their traditional expertise and disciplines, and integrating them into new processes and technologies, will solidify the value and relevance data management teams provide in evolving data-driven enterprises.

Here are a few thoughts to consider in evaluating data management relevance:

1) Keep the Existing Data Environment: Don't throw it away, but ensure it provides a flexible architecture that maintains the traditional solutions while growing new capabilities.

2) Focus on Data Governance: Who owns the data? Include a shift towards business ownership of critical data, which will result in greater value from data investments and change decision-making responsibilities, accountability, data flows and processes. Inevitably, it will change the way your business works - for the better.

3) Data Quality Matters: Bad data leads to bad decisions (though not every decision requires perfect data). Don't pollute your data lake or your data warehouse with poor-quality data; a minimal validation gate is sketched after this list. Information protection, covering both individual privacy and intellectual property, is essential.

4) Experiment with New Technology: Advances are realized through experiments, but remember that the company relies on consistent, accurate and compliant data on which to base best practices, innovations and good decisions.

5) It's the People: Good data management requires experienced and valued data professionals. Invest in the team that will take care of one of your most valuable assets - data.
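Expanding on point 3 above, here is a minimal sketch of the kind of validation gate that keeps poor-quality records out of the lake or warehouse; the field names and rules are hypothetical examples, not a complete framework:

```python
# Hypothetical quality rules: each field maps to a predicate it must pass.
RULES = {
    "well_id":    lambda v: isinstance(v, str) and v.strip() != "",
    "api_number": lambda v: isinstance(v, str) and len(v) in (10, 14),
    "oil_rate":   lambda v: isinstance(v, (int, float)) and 0 <= v < 100_000,
}

def validate(record):
    """Return the names of failed checks; an empty list means clean."""
    return [field for field, ok in RULES.items() if not ok(record.get(field))]

record = {"well_id": "W-001", "api_number": "4212345678", "oil_rate": -50}
problems = validate(record)
if problems:
    print("Quarantine for review - failed checks:", problems)  # ['oil_rate']
else:
    print("Clean - load into the lake/warehouse")
```

The gate doesn't need to be elaborate; what matters is that bad records fail loudly and get routed for correction rather than silently polluting downstream analytics.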

