A digital reality for oil and gas
July 4, 2019
by Andy Nelson, Senior Software Engineer
Published in Energy Voice, July 2019
Good data analysis can have a direct impact on operational efficiency, reducing operating costs while improving production and hydrocarbon recovery. From edge computing devices to artificial intelligence and machine learning, a wide variety of data and analysis technologies is now available, yet their application is still relatively limited.
The desire to fast-track the industry towards the adoption and implementation of a digital oilfield has never been greater. However, deploying a solution to extract the economic benefits of digital data can come with an abundance of technical and logistical challenges.
Simply moving this volume of data from the point of generation to a location where analysis can be performed is the real data challenge. Whether the well sits on an offshore platform or at a remote land-based site, the hard-wired infrastructure to move these massive volumes of data either does not exist or is available at only a minimal number of facilities. Evolving wireless technology standards are helping to improve data transmission speeds, but the growth of data generation significantly outpaces the ability to transmit that data.
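One common way to ease that transmission bottleneck is to process data at the edge, collapsing high-frequency sensor readings into summary statistics before sending them onward. The sketch below is purely illustrative: the function name `summarise_window`, the window size, and the synthetic pressure readings are assumptions, not a description of any specific vendor's edge device.

```python
import statistics

def summarise_window(readings, window=60):
    """Collapse each window of raw sensor readings into summary
    statistics, shrinking the payload before transmission.
    Illustrative only; real edge devices have their own pipelines."""
    summaries = []
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        summaries.append({
            "min": min(chunk),
            "max": max(chunk),
            "mean": statistics.fmean(chunk),
        })
    return summaries

# 3,600 one-second readings become 60 per-minute summaries:
raw = [100.0 + (i % 7) * 0.1 for i in range(3600)]
print(len(summarise_window(raw)))  # 60
```

Even this simple reduction cuts the transmitted volume sixty-fold, at the cost of discarding detail that some analyses may later need, which is exactly the kind of trade-off an edge strategy has to make explicit.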
Despite the best-laid plans and processes, deployment issues only serve to deter uptake and field acceptance. Even wells in favourable geographical locations, or with reliable, high-speed access, can face significant technical hurdles. Regardless of industry efforts towards standardised protocols, integrating diverse digital systems from a myriad of suppliers, whilst trying to sustain legacy data, continues to be problematic.
Implementing the digital oilfield does not need to be an all-encompassing effort that integrates every aspect of the workflow on the first pass. Success can be incrementally achieved by focusing on smaller projects and deploying them in a progressively integrated way. At the same time, each individual project phase should feed into the larger digital oilfield strategy.
Careful selection of vendors that conform to industry standards will aid systems integration and allow 'best-of-breed' products to be used in the final solution. Testing each integration point, and the entire workflow, should be standard practice, both to ensure that data is being used correctly and to provide assurance that errors are not filtering into the data set.
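A test at an integration point can be as simple as a sanity check on each record as it crosses a system boundary. The sketch below is a hypothetical example: the field names (`well_id`, `timestamp`, `pressure_kpa`) and the plausibility range are invented for illustration and would be replaced by whatever schema and units the connected systems actually agree on.

```python
def validate_record(record):
    """Sanity-check one record at an integration point.
    Returns a list of error strings; empty means the record passed.
    Field names and limits are illustrative assumptions."""
    errors = []
    for field in ("well_id", "timestamp", "pressure_kpa"):
        if field not in record:
            errors.append(f"missing field: {field}")
    pressure = record.get("pressure_kpa")
    if pressure is not None and not (0 <= pressure <= 150_000):
        errors.append(f"pressure out of range: {pressure}")
    return errors

# A well-formed record passes; a malformed one is flagged, not silently ingested.
print(validate_record({"well_id": "A-1", "timestamp": 0, "pressure_kpa": 5000.0}))
print(validate_record({"timestamp": 0, "pressure_kpa": -5.0}))
```

Running checks like this at every boundary, rather than only at the end of the workflow, is what stops a single bad feed from quietly contaminating the whole data set.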
Inevitably, the selection of certain products, the deployment location and technological limitations will mean that creative integration strategies will need to be adopted to complete certain phases. Having a planned obsolescence strategy will also ensure that each component of the digital oilfield does not become an isolated legacy technology problem, and that future capabilities can be introduced as they become available.
Too often, the very complex nature of the digital oilfield itself can hinder its introduction and therefore slow the pace, performance and profits it can ultimately bring. Through good planning, creative thinking and disciplined implementation, the digital oilfield is achievable.
None of this precludes the need for human intervention and oversight. Computers aren’t good at solving complex problems, but humans are.
Where computers shine is in crunching through masses of data, examining historical actions, the outcomes they produced, and then making data-driven recommendations. Humans can become overwhelmed by large volumes of data relatively quickly, but computers, given the right circumstances, can make the most of that data and leave the workforce free to solve the problems and make the critical decisions that the data presents.
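That division of labour can be sketched in a few lines: the computer tallies historical actions against their recorded outcomes and surfaces a recommendation, while the decision itself stays with a person. The action names and outcome scores below are entirely made up for illustration; a real system would draw on far richer operational history.

```python
from collections import defaultdict

def recommend(history):
    """Given (action, outcome_score) pairs from past interventions,
    return the action with the best average outcome.
    A deliberately minimal stand-in for real analytics."""
    totals = defaultdict(lambda: [0.0, 0])
    for action, score in history:
        totals[action][0] += score
        totals[action][1] += 1
    return max(totals, key=lambda a: totals[a][0] / totals[a][1])

# Hypothetical intervention history: action name, normalised outcome score.
history = [("increase_choke", 0.8), ("acid_wash", 0.6),
           ("increase_choke", 0.9), ("acid_wash", 0.5)]
print(recommend(history))  # increase_choke
```

The point is not the arithmetic, which is trivial, but the pattern: the machine condenses a history too large for a person to scan, and the engineer weighs the recommendation against context the data does not capture.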