
February 23, 2017

Operational Technology (OT) vs. Information Technology (IT)

One of the many themes coming out of new digital technologies is the concept of operational technology, or OT. This theme is more than just a new suite of technologies, sensors and smart equipment; it is a different paradigm, coming more from the world of control systems and field automation than from corporate IT. How will these new developments, and the new data coming from field instrumentation, fit into the world of digital data defined by IT and structured data management practices?

First, some definitions. I would like to define the two (maybe three) worlds of digital technology in terms of their traditional areas of responsibility.

First, corporate Information Technology, which has responsibility for the data center, networks (WAN and LAN), desktop support, enterprise application platforms (Finance, HR, Supply Chain, Marketing) and cybersecurity. This community is usually led by a formal departmental or functional executive, the CIO.

Then comes the new (but really not so new) world of Operational Technology, which has stewardship of engineering applications, operations, and field automation (SCADA) systems. This area is rarely a corporate department. COOs in the oil and gas industry are usually assigned to business units or assets in a local geographic area, so the growth of OT is happening "ground up," so to speak. Some companies have field automation standards, but with legacy properties and many mergers and asset acquisitions there is a complex diversity of solutions in the field. This community is usually driven by local champions and operational teams.

Their connection to corporate IT has traditionally not been very formal or visible. But the two groups share a number of common issues: telecommunications, protocols, data access, architecture, mobility and cybersecurity. Often they are struggling to find common solutions for patch management, upgrades and version changes, and ways to bring data to engineering teams.

Making this convergence more difficult is the existence of a third community, the world of Shadow IT. This is not a formal community at all but an informal collection of highly digitally literate engineers and operators armed with desktop productivity tools: Excel, "personal" databases, Access-based solutions, shared drives and Visual Basic macros. This "innovation on the edge" approach often competes quietly with the "standardization from the center" initiatives of corporate IT.

The Digital Oilfield is forcing these communities to find ways to collaborate. This comes from two trends: Digital Intensity (the increase in the number and variety of sensors, field automation and smart equipment, in the volume of documents, and in the size of seismic surveys and reservoir models) and Interconnected Devices (remote decision support centers, remote control of processes, the decline of proprietary networks and growth of the internet, plus connected supply chains).

There is no question that more of the information collected by operations needs to show up on higher-level management dashboards. Remote surveillance and monitoring require a view of current conditions in near real time, but in the context of the past (and the plan), to help operations become more predictive and proactive. With more data being collected and more interconnections being made between equipment, processes and humans, a secure, faster response cycle is called for, as well as a full asset-lifecycle perspective, if optimization algorithms are going to have an impact.
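To make that "current conditions in the context of the past and the plan" idea concrete, here is a minimal sketch in Python of one way to judge a live reading against its own recent history and against the plan. The tag, window size, thresholds and planned value are illustrative assumptions on my part, not any particular SCADA or historian vendor's API.

```python
from collections import deque
from statistics import mean, stdev

class TagMonitor:
    """Rolling context for one sensor tag; all names and limits are illustrative."""

    def __init__(self, planned_value: float, window: int = 60):
        self.planned_value = planned_value    # the plan, e.g. a target pressure
        self.history = deque(maxlen=window)   # the past: the last N readings

    def check(self, reading: float) -> str:
        if len(self.history) >= 10:           # enough history to give context
            baseline, spread = mean(self.history), stdev(self.history)
            self.history.append(reading)
            if spread and abs(reading - baseline) > 3 * spread:
                return "anomaly vs. recent history"   # flag before the hard alarm trips
            if abs(reading - self.planned_value) > 0.1 * self.planned_value:
                return "drifting from plan"
            return "normal"
        self.history.append(reading)
        return "warming up"

monitor = TagMonitor(planned_value=2500.0)    # say, planned tubing pressure in psi
readings = [2500.0 + i * 0.5 for i in range(20)] + [2890.0]  # slow drift, then a spike
for psi in readings:
    status = monitor.check(psi)
print(status)   # the spike stands out against the rolling baseline
```

The point of the sketch is the design, not the thresholds: the current value only becomes "predictive and proactive" information when it is compared against both the recent past and the plan.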

Many advances in the Industrial Internet of Things favor OT over IT, but all the data needs to ride on a common ICT backbone. With the increasing number of interconnections, a total security solution is needed. If sensor and machine data mix with transactions, documents and structured data, then data management solutions must mature. The current tensions, and often outright separation, between OT and IT have to evolve into converged approaches.

Additionally, using the internet to transmit the data has exposed a new and large external risk. The recent large-scale distributed denial-of-service (DDoS) attack on Dyn, a New Hampshire-based company that monitors and routes internet traffic, knocked out many major commercial websites for several hours. It brought to the forefront the risk of IoT and IIoT devices and how they can be accessed and used in unanticipated ways, including potential terrorism. So this raises the stakes on the question: who is responsible for security? IT, OT, or both?

Where does that leave us? OT versus IT, or OT plus IT? Are they competitors or collaborators? All these advances are going to make life interesting for all the communities involved. It is time to make sure there is only one team, to enable the digital oilfield.

February 1, 2017

In Defense of Silos

The Oil and Gas industry is frequently criticized as a slow adopter of new technology and of innovations proven in related industries. We are also criticized as an industry that works in functional and geographic silos and is reluctant to share data. This laggard behavior (we don't like that term and suggest we are "fast followers") and our parochial behavior (we don't like this term either and would replace it with "functional excellence") around data and technology often create barriers to information sharing. Data integration and even information protection are often late add-ons, creating a complex architecture behind our firewalls.

But rather than taking the easy way out of just criticizing the industry, let's take a moment to try to understand why we work in silos. We are not incompetent or lazy people, so there must be a very good reason why we work the way we do. An understanding of the basic drivers might just give us the insight we need to do a better job of data access, data integration and holistic data analysis.

My start in this industry came at university. I got my degree in geophysics, spending most of the last two years of my B.Sc., and all of my M.Sc., in that department. Yes, I took some geology classes and several engineering classes, but my "silo" training started pretty early. When I joined a major oil company, I was assigned to a functional (geophysics) unit and worked with other earth scientists. While there wasn't a total ban on communication between departments, we usually did have to go up the organizational chain to get permission to talk to someone from an allied function (another silo) or to get data from them. In those days not much of it was digital, so it was usually a matter of checking out a report or a well file from the central library (remember when companies had those?).

Our data was stored and managed in fit-for-purpose geophysical databases. Our primary data, seismic, was unique, and the way we collected, processed, interpreted and stored that data was unique as well. Our enterprise information strategy was purely functional: for us the enterprise was the geophysics department, and we were comfortable in our silo.

In the 1990s, organizations figured out that an integrated view of the subsurface, focused on the producing reservoir, would be a better way of working. The concept of the asset team was developed, and a real organizational change moved quickly through the industry (at least through the geophysics, geology and reservoir engineering departments). We reorganized into these new asset teams (usually with a PE supervisor in charge) and began working in a more cross-functional way. Remember that this new concept didn't include drilling, operations and maintenance, facility engineering, land or finance. Those silos were still intact.

Data management and technology, as well as workflows, had to change too. We needed more transparency between the seismic perspective, the geologic perspective (usually derived from well logs and analog patterns based on surface geology studies) and the growing field of reservoir simulation (fluid flow through porous media). Our technology and information practices evolved. Soon we could bring well logs into the seismic interpretation (remember synthetic logs converted to time?). While it was still a chore for the data managers, we could bring the common earth model into the reservoir simulation application and complete the loop of seismic-to-simulation. This still took a lot of time and effort, but when the business driver changed, technology changed and helped us provide better answers.
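For anyone who never had to build that tie, here is a minimal sketch of the arithmetic behind converting a depth-indexed log to two-way travel time so it can sit alongside seismic. The depths and interval velocities are toy values I made up, not a real checkshot or velocity survey.

```python
def depth_to_twt(depths_m, interval_vels_mps):
    """Cumulative two-way time (seconds) at each depth sample.

    depths_m          -- depth at the top of each interval, in metres
    interval_vels_mps -- velocity within each interval, in m/s (one per interval)
    """
    twt = [0.0]
    for i in range(1, len(depths_m)):
        dz = depths_m[i] - depths_m[i - 1]                          # interval thickness
        twt.append(twt[-1] + 2.0 * dz / interval_vels_mps[i - 1])   # down and back up
    return twt

# A toy three-interval velocity profile:
depths = [0.0, 500.0, 1200.0, 2000.0]   # metres
vels = [1800.0, 2400.0, 3200.0]         # m/s
print(depth_to_twt(depths, vels))       # ≈ [0.0, 0.556, 1.139, 1.639] seconds
```

A real tie adds a wavelet and reflectivity to build the synthetic trace, but the depth-to-time stretch above is the step that lets the well log and the seismic share an axis.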

I hope that my personal story helps set the context for why we tend to work in silos. Our functional problems are still pretty sophisticated. While there are simpler and more complex versions of software for this integrated subsurface characterization workflow, there are advantages to doing parts of it within a function and then handing off to the next stage of the value chain. With asset teams, there has been a trend toward more generalists and fewer specialists, but getting the detailed work right is still important. Someone has to understand the details. Someone has to be the expert.

This story is repeated in every function in the E&P value chain. Drillers need to get the design and construction of the wellbore right, even though today they often use a seismic cube to help guide a complex well path. Facilities engineers need to design the right platforms and get the process flows right, even though today they often use the historical performance of equipment coming from operations and maintenance groups. The same goes for production engineers, reservoir engineers and field operators working to get the highest level of production from the field.

We are recognizing the importance of cross-functional workflows. We understand that we need to enrich our functional models with data from related disciplines. But to criticize our silos is to criticize the way we have worked (and usually still work) and the way we have been taught. It criticizes our pride in functional excellence and our professionalism.

Of course, the consequence of our silos is a set of productivity barriers that block a more holistic view of asset performance. We spend too much time finding, fixing and reformatting data, and because we recognize that, we are trying to build better data bridges and data portals so that sharing data is easier. We are beginning to understand and develop a common data vocabulary for critical information objects (like PPDM's "What is a Well" and PODS' next-generation pipeline data model) when we speak to each other. We understand we need standard data exchange technology (like Energistics' XML data protocols) to make it easier to work across the boundaries of discipline, geography and, often, companies in the supply chain.
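To show what a standards-based exchange buys in practice, here is a minimal sketch of a consumer reading a well log delivered as XML. The element names below are simplified stand-ins of my own, not the actual Energistics schema; the point is that when both sides agree on the vocabulary, the "finding, fixing and reformatting" step largely disappears.

```python
import xml.etree.ElementTree as ET

# A made-up, simplified payload; a real exchange would follow the agreed standard schema.
doc = """
<well uid="W-001">
  <name>Example Well 1</name>
  <log mnemonic="GR" unit="gAPI">
    <sample depth="1500.0">45.2</sample>
    <sample depth="1500.5">47.8</sample>
  </log>
</well>
"""

root = ET.fromstring(doc)
print(root.get("uid"), root.findtext("name"))
for log in root.iter("log"):
    print(log.get("mnemonic"), log.get("unit"))
    for s in log.iter("sample"):
        print(float(s.get("depth")), float(s.text))   # straight into native types
```

Because the tags, units and identifiers are agreed in advance, any party in the supply chain can write this kind of consumer once and use it against every partner's data.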

Our silos are the home for our expertise, the foundation on which we add value. Our silos contain the tools designed for the work we do. Our silo behavior is not the reason for slow adoption of the latest and greatest technology out there. Starting with an understanding of the benefits of the silo culture, and with an eye to the future, technology companies can work with us to develop efficient, effective and easy-to-use data exchanges.