
Oil and Gas Meets Big Data

Oil and gas companies are familiar with the concept of Big Data, but they have not adequately addressed the volume, variety and velocity of the data being generated, nor capitalized on the value it could add to their organizations. Significant competitive advantage will be found by companies that learn to use their daily petabytes of data efficiently to identify trends and anomalies for timely, accurate and enterprise-wide decision making. Companies that harness the power of their data will discover a level of efficiency previously unobtainable. The key is to manage Big Data to interpret, react and predict the best course of action. It is not the company with the most data that wins; it is the company that bases its decisions on that data.

The oil and gas industry has been severely challenged by the prolonged steep drop in market prices, which led to the streamlining of most operational processes. How quickly and accurately an enterprise makes course corrections is a large determining factor in its overall viability and long-term value. The perception is that the next level of profitability, without additional large expenditures and headcount, will be obtained by optimizing incoming data.

A well-managed Big Data program will also have company-wide benefits, including increased productivity: teams will know where their data is stored, how to access it efficiently, and won't waste time looking for or recreating data. There will be less delay between the field and the C-suite, resulting in better alignment and understanding of the current state. Big Data's true value is realized when organizations use new statistical models and patterns identified by machine learning algorithms and take action on the results. Or, said another way - analytics.

We've collected it - now how do we use it?

A substantial amount of the data being generated is never utilized because companies don't have a comprehensive Big Data management plan and infrastructure. One estimate suggests that corporations process only 20% of the data they collect. Another survey found that an operator uses only 5% of the data collected on an offshore drilling rig due to data storage, transmission and commercial constraints. Infrastructure goes beyond data storage; it is the capacity to warehouse, search and model data.

There are both technological and organizational issues to consider when moving to operationalize analytics. Technology is evolving to include embedding and integrating analytics into dashboards, databases, devices, systems, applications, processes and more. Organizational implications, such as who owns the data, will need to be addressed as well. The success or failure of any Big Data and Analytics project rests on user adoption: if the platform makes the data available, trusted and easy to use, the program will be successful.

Big Data requirements also depend on focus - be it by company or department. For example, an exploration group may find an advantage in evaluating new plays and understanding how its experience will transfer from field to field. A production group might focus on maximizing output and reducing risk while tracking and optimizing scheduled maintenance to decrease downtime or non-productive time.


While it might sound intimidating, a Big Data infrastructure stack will be needed, with an architectural design that introduces the analytics platform into the computing and processing ecosystem. Keep in mind, however, that this technology doesn't replace legacy data management and processing environments; it complements them. Once analytics are running and real-time data starts streaming in, incoming data should be evaluated against historical data to determine whether it's within anticipated bounds.
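The bounds check described above - comparing each incoming real-time value against what historical data says is normal - can be sketched very simply. This is a minimal illustration, not a production design; the sensor values, window size and 3-sigma threshold are all hypothetical choices:

```python
from collections import deque
import statistics

def make_bounds_checker(window=100, k=3.0):
    """Return a function that flags readings falling outside k standard
    deviations of a rolling window of historical values."""
    history = deque(maxlen=window)  # rolling "historical data"

    def check(value):
        anomalous = False
        if len(history) >= 10:  # need a baseline before judging new values
            mean = statistics.mean(history)
            stdev = statistics.pstdev(history)
            if stdev > 0 and abs(value - mean) > k * stdev:
                anomalous = True
        history.append(value)
        return anomalous

    return check

# Hypothetical stream: steady pressure-like readings, then a sudden spike
check = make_bounds_checker()
readings = [1000 + i % 5 for i in range(50)] + [1500]
flags = [check(r) for r in readings]  # only the spike should be flagged
```

In a real deployment the "historical data" would come from the historian or data warehouse rather than an in-memory window, but the principle - compare each new value against an expected band derived from history - is the same.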

Emerging developments in Big Data technologies are proving essential to capture, store and process data that will provide a foundation for the real work. There are a number of ways to evaluate requirements to manage this data, but one approach is to categorize it into three attributes: volume, velocity and variety.

Volume

The volume increase isn't surprising; the amount, however, is staggering. Looking just at seismic and well log data, significant growth is due to larger surveys, more channels in acquisition methods and closer sample intervals (in time and space). Data gathered from actual drilling and logging activity can also be measured in smaller increments thanks to tools now available (including fiber optic cables), boosting the overall amount of data. The amount of data gathered from production activity has also increased due to the placement of sensors and telecommunications networks that relay information to the operator in real time. Multiply the variety of sensors being deployed (including downhole pressure, temperature and vibration gauges, acoustic and electromagnetic sensors, and flowmeters) by the number of wells - and well, there's your Big Data.

There are existing solutions for data volume challenges, ranging from data appliances, in-memory devices and unified storage grids to high-performance computing grids (Linux-based, using GPUs instead of CPUs, with high-speed interconnects and distributed computing techniques).

Velocity

The new Big Data infrastructures are capable of ingesting large volumes of sensor data at high velocity. Information architects will have to be able to combine master data and metadata with tag names from historians, in line with the governance in place. As more data is collected in the field through process control networks (including SCADA or DCS systems), more real-time data is available for operators and engineers. Collecting and analyzing that data to make better decisions, reduce downtime, increase production and optimize operations is most effective with a strong strategic plan executed as a collaboration between operations and IT staff.
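Combining master data and metadata with historian tag names, as described above, amounts to a lookup-and-enrich step on each reading. The sketch below assumes hypothetical tag names, fields and values; real historian schemas will differ:

```python
# Hypothetical master/metadata keyed by historian tag name
tag_metadata = {
    "WELL-042.DHP": {"well": "Well 42", "sensor": "downhole pressure", "units": "psi"},
    "WELL-042.TMP": {"well": "Well 42", "sensor": "temperature", "units": "degF"},
}

# Raw readings as they might arrive from a SCADA/DCS historian feed
readings = [
    {"tag": "WELL-042.DHP", "ts": "2016-05-01T00:00:00Z", "value": 4350.2},
    {"tag": "WELL-042.TMP", "ts": "2016-05-01T00:00:00Z", "value": 212.7},
    {"tag": "WELL-099.XYZ", "ts": "2016-05-01T00:00:00Z", "value": 1.0},  # no metadata
]

def enrich(readings, tag_metadata):
    """Attach master/metadata to each reading; route unknown tags to a
    quarantine list so governance can resolve them rather than drop them."""
    enriched, quarantined = [], []
    for r in readings:
        meta = tag_metadata.get(r["tag"])
        if meta is None:
            quarantined.append(r)
        else:
            enriched.append({**r, **meta})
    return enriched, quarantined

enriched, quarantined = enrich(readings, tag_metadata)
```

The quarantine list is one way to express "in line with the governance in place": readings with no registered tag aren't silently discarded or silently passed through, they're surfaced for review.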

Variety

Traditional Business Intelligence solutions were developed for reporting and tracking pre-defined metrics from a single type of structured data (e.g., management scorecards). The future holds new insights from exploring all the relevant data types: structured, semi-structured, documents, transactions and field measurements. Add in new types of data from email, text, video and social media, and the variety challenge becomes more apparent. The data volume is growing, but you can't afford to get lost in the datasets.


Here's a snapshot of a possible future: operators and service companies sharing data to create a holistic view of an entire operation. Analysis will begin with individual wells, then move to development fields and entire portfolios - all information will be integrated and will provide near real-time information to the organization. Decisions will be based on the most current statistics, without gaps or blind spots. Resource allocation, projections, supply chain, finance - every essential function in the organization will be both a user and a contributor to the same data picture. As analytics becomes an integral part of a business' processes, more people will touch the analytics until everyone becomes a data scientist on some level. Yes, Big Data can do all that!
