
Battle for the Digital Core

While no one is popping champagne corks yet, things are looking a little better in the oil patch these days. Oil prices have recovered to a range between the mid $40s and the low $50s. Drilling and production costs in some basins have fallen to the point where $50 oil can mean a positive cash flow again. The Permian Basin looks healthy, with lease prices in the best trends near $60K per acre, and increased drilling activity has added hundreds of drill rigs to the fleet over the past few months. OPEC has announced a production cut of over a million barrels of oil per day, and crude oil inventories have fallen a little. A few major capital projects have been approved, including:

- Chevron's Tengiz Future Growth Project (Caspian Basin)

- Statoil's Johan Sverdrup (Norwegian North Sea)

- BP's Mad Dog 2 (Gulf of Mexico)

- Eni's Zohr (offshore Egypt)

This appears to be a sign we have reached the bottom of the commodity cycle and can start planning for better times.

I am not brave enough to try to predict future oil or natural gas prices, or even global demand for fossil fuels, but I do want to talk about some of the interesting new developments in the digitization world that apply to our industry. I want to introduce a concept I call "the battle for the digital core." If the industry is preparing for new investments, then investing in data as an enterprise asset, and in the integration capabilities that help each employee become a more productive and better data analyst, can have profound, long-term returns.

The challenge of the digital core is all about integration: the variety of data required, the dozens of technologies and vendors, and the people involved in a complex work process. In a recent SPE paper, "How Do We Accelerate Uptake and Fulfill the Value Potential of Intelligent Energy?" (SPE 181091, by H. Gilman, T. Lilleng, E. Nordveldt and T. Unneland), the authors describe four types of integration needed to enable the digital oilfield:

1) transaction-oriented

2) linking cross-disciplinary workflows

3) collaboration between different locations (onshore and offshore, and between similar assets in different basins)

4) bringing together different time frames (matching history and real time)

Who are the competitors in the battle for the digital core? What does winning look like? And most importantly, how are you going to manage the challenge of integration, the type we are talking about here? It is one thing to link one application to a specific data source with an API or a bit of ETL code, but the digital core is much more complex. Most programmers think functionality first and integration later, if at all. Commercial applications often create proprietary data models instead of trying to use industry, or even company, standards. Well-understood, common definitions of critical information objects are often lacking, even between functions in the same company. Technology companies want to bring something new and cool to the market, and therefore avoid reusing proven solutions or legacy investments. Upgrade churn and missing backwards compatibility seem like a cruel joke on technology users. Integration is an unnatural act in the digital oilfield, but it is a critical missing element. When the opportunity for collaboration comes, existing solutions often present more barriers than open doors. But I see that things are changing.
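To make the contrast concrete, here is a toy sketch of the kind of point-to-point ETL link described above. All field names, schemas and conversion factors here are invented for illustration; the point is that each such link is a hand-written, one-off mapping between two proprietary data models, and every new source/target pair needs another one, which is exactly how the spaghetti grows.

```python
# A toy point-to-point ETL link between two hypothetical systems.
# Each link hard-codes one source schema and one target schema;
# repeating this for every pair of applications creates the spaghetti.

def extract(source_rows):
    """'Extract' step: pull rows in the source system's own schema."""
    return [r for r in source_rows if r.get("status") == "active"]

def transform(row):
    """Hand-coded, one-off field mapping between two proprietary models."""
    return {
        "WELL_ID": row["well_name"],                 # different naming convention
        "OIL_RATE_BOPD": row["oil_m3_d"] * 6.2898,   # m3/day -> barrels/day
    }

def load(target, rows):
    """'Load' step: append the transformed rows into the target store."""
    target.extend(transform(r) for r in rows)

source = [
    {"well_name": "A-1", "oil_m3_d": 100.0, "status": "active"},
    {"well_name": "A-2", "oil_m3_d": 50.0, "status": "shut-in"},
]
warehouse = []
load(warehouse, extract(source))
print(warehouse)  # one active well, renamed and unit-converted
```

The code works, and for one link it is cheap. The trouble is that nothing in it is reusable: the mapping lives in the code, not in a shared model, so the next integration starts from zero.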

I see five different sets of technology suppliers trying to do a better job of integrating their own products and applications, but they are not stopping there. They all see the need, and the opportunity, to be the first to provide the full asset-lifecycle integration platform, or digital core, for the next generation of cross-functional, cross-asset, holistic enterprise solutions. The platform will connect people, processes and technology, with the data foundation needed to become a data-driven, analytics-based organization. I will briefly describe what each of the five players (IT, OT or operations technology, ET or engineering technology, oilfield services vendors, and back-office ERP vendors) is trying to do.

First, let's take a look at IT. IT is the enterprise steward of shared computing and communications infrastructure and of the database technology used across the enterprise. Its view of the digital core is a very technology-centric one. IT has been responsible for structured databases, document management systems, email and messaging platforms, the enterprise data warehouse and now the data lakes. These are the programmers who write the point-to-point application links: the ones who have succeeded in building bridges between traditional technologies, and the ones who have unintentionally created the complex spaghetti of links, without a central design, that reinforces the information silos we have today. IT is trying to leverage enterprise architecture design methods to belatedly develop an enterprise view of all digital assets and improve search and data discovery.

A new term is surfacing that describes this approach: the "data fabric." The collection of technologies that makes up the data fabric enables the IT department to integrate, secure and govern various data sources through automation, simplification and self-service capabilities. The data fabric follows key work processes to create smoother information pathways for frequently used, high-value requirements, building a "get-my-data" portal for easier and more intuitive access to data.
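As a rough illustration of the "get-my-data" idea (every name and dataset below is hypothetical), the core of such a portal is a catalog that sits in front of the individual stores, so a user asks for a dataset by its business name rather than needing to know which system physically holds it:

```python
# A minimal sketch of a self-service data catalog: business-friendly
# dataset names map to whichever backing store actually serves them.
# All dataset names and sample records are invented for illustration.

class DataCatalog:
    def __init__(self):
        self._sources = {}  # dataset name -> (description, fetch function)

    def register(self, name, description, fetch):
        """A data steward registers each source once, behind the scenes."""
        self._sources[name] = (description, fetch)

    def search(self, keyword):
        """Self-service discovery: find datasets by keyword in the description."""
        return [n for n, (desc, _) in self._sources.items()
                if keyword.lower() in desc.lower()]

    def get(self, name):
        """One call, regardless of which silo the data lives in."""
        return self._sources[name][1]()

catalog = DataCatalog()
catalog.register("daily_production", "daily oil production by well",
                 lambda: [{"well": "A-1", "bopd": 629}])
catalog.register("well_headers", "well header and location master data",
                 lambda: [{"well": "A-1", "lat": 31.9, "lon": -102.1}])

print(catalog.search("production"))   # -> ['daily_production']
print(catalog.get("daily_production"))
```

The `fetch` functions here are stand-ins for real connectors (a historian query, a warehouse view, an application API); the user-facing contract of search-then-get is the part the data fabric promises to standardize.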

The need to make faster decisions requires that organizations incorporate real-time data into the process. Operations Technology (OT) includes process control, SCADA, historians and operational data stores, equipment health monitoring, and predictive maintenance applications. For many years, this class of technology was the responsibility of electrical engineers and automation specialists and never crossed IT's path. But the proprietary technology has migrated to commercial IT platforms for lower costs, and the capability to bring field data into the home office now exists, as does the ability to augment this data collection with mobile data capture methods. The process automation vendors see the digital core opportunity, as do original equipment manufacturers (OEMs): the products they sell are getting more information-rich, and services to optimize equipment performance bring higher profits than just selling the equipment these days.
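A toy sketch of what "bringing the field data into the home office" enables: once historian samples are available for analysis, even the simplest equipment-health check, flagging readings that drift from a moving average, becomes a few lines of code. The tag, timestamps, window and threshold below are all invented for illustration, not drawn from any real historian API.

```python
# Flag time-stamped sensor samples that deviate from the moving average
# of the preceding readings -- the simplest form of equipment health
# monitoring on historian data. Tag names and thresholds are invented.
from collections import deque

def moving_average_alarms(samples, window=3, threshold=0.2):
    """Return (timestamp, value) pairs deviating more than `threshold`
    (as a fraction) from the moving average of the prior `window` samples."""
    recent = deque(maxlen=window)  # sliding window of recent values
    alarms = []
    for ts, value in samples:
        if len(recent) == window:
            avg = sum(recent) / window
            if abs(value - avg) / avg > threshold:
                alarms.append((ts, value))
        recent.append(value)
    return alarms

# Hypothetical pump discharge pressure, sampled once a minute:
samples = [("09:00", 100), ("09:01", 101), ("09:02", 99),
           ("09:03", 100), ("09:04", 60)]   # sudden drop at 09:04
print(moving_average_alarms(samples))  # -> [('09:04', 60)]
```

Real predictive maintenance applications are of course far more sophisticated, but they rest on exactly this foundation: OT time-series data made accessible to analysis outside the control room.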

The surprise player in this space for me is Engineering Technology (ET). The facilities engineers have been diligently creating digital solutions for facilities design (3D CAD/CAM), construction and commissioning tasks often called the "digital plant." Vendors can talk about a "digital twin" of the actual physical facility and are proposing that this platform can be of value to operations and maintenance throughout the asset lifecycle. Here is where lessons learned from manufacturing can be leveraged by exploration and production. Document-centric work processes can be related to a dynamic 3D model of the topside processes and training can be enhanced by gaming technology and virtual (or augmented) reality visualization. They want to be part of the digital core as well.

Due to my geoscience background, I have a bias toward the familiar subsurface platforms offered by oilfield services (OFS) software vendors. OFS solutions bring the subsurface characterization, reservoir simulation, and drilling design and surveillance suites of tools that are already widely adopted in the industry. A few tweaks to their existing platforms and they join the playing field in the digital core competition.

Finally, we can't count out the back-office transaction-oriented ERP platforms. This isn't an academic or geoscience or engineering world, it is a commercial world after all. Any asset manager has to bring in the information that financial transactions, procurement orders, regulatory permits, land contracts and HR records provide. From a financial perspective, all this activity is brought into the ERP system for final profit and loss reporting and review anyway, so the ERP platform has a role to play.

Designing and developing a digital core will not be an easy task, from the technical elements through to the commercial adoption challenge. The industry not only has the task of evaluating different vendors in each space but also the potential of one, or several, platforms working together to become the digital core. This may be a task for which many companies are too small, have limited capabilities, or simply prefer to leverage others, so that they can concentrate on their core business of producing oil and gas.

When I look under the technology covers to envision what the digital core might look like, I can see one of three scenarios playing out.

Facebook scenario: In this scenario, one company dominates with a cloud computing infrastructure, offering the digital core integration capability as a service. You get assurance that your data is secure, a web portal where you can discover your data, and a workspace and tools to build your models. Off you go, with only a monthly usage-based charge to pay. Upgrades of software functionality are taken care of automatically; infrastructure additions happen behind the curtains and not on your capital budget. You can focus on your business with the benefits of a holistic perspective on your asset performance and asset lifecycle. Sounds tempting, doesn't it?

Duplo scenario: It is possible that the digital core platform can be developed with a standards-based, interoperable approach. Developing and adopting standards for data and integration will allow an operator to mix and match vendors and applications. Operators can assemble, rather than program, the parts they need, host their version of the digital core inside their firewalls, and still communicate with external stakeholders seamlessly. This sounds tempting as well, but the industry is going to have to put in some hard work in standards groups to make this viable.

Silos win: This scenario is the bad news. If it proves too difficult to agree on the interoperability needed, open integration cannot be achieved: proprietary solutions will dominate, with too many vendors competing on differentiated functionality and no common core. Barriers to integration still stand, from commercial, cultural and technical perspectives. In this scenario, the digital core remains a recommendation of consultants (like me) on their PowerPoint decks, with little ability to help the industry.

So, what will the future bring? I wish I could predict, just as I wish I could predict the oil price in five years' time; I could make a fortune on the options market if my crystal ball were working well enough. I think the promise of the digital core is substantial, and the efforts from the five technology markets tend to support that viewpoint. The Facebook scenario has many challenges: unless everyone agrees to buy this vision-as-a-service and let one company take the risks, a lot of collaboration will be required. Operators might want to pay the subscription fees only when the commercial digital-core-as-a-service is ready enough for them, and until that time wait patiently on the sidelines. There are technology companies working hard to win their business. The Duplo scenario means we will all have to work together to bring it to fruition. In the Silo scenario, we keep doing what we have always done, including complaining that we cannot find our data or create value from all the data we collect, and ultimately we suffer the consequences of limited insight and lower productivity.

 
