Many oil and gas companies are rushing to layer in new digital technologies such as cloud, machine learning, and predictive analytics before building their data and analytics foundation. Pipeline companies must first enact data management and analytics strategies to prepare the foundation for future technology success, says Stuart Parker at AVEVA.
The global pipeline market picture is more nuanced and unpredictable than at any time in history. The industry is presented with complex new challenges, as well as vast opportunities.
An intelligent pipeline strategy enables companies to create new capabilities, develop new business models, and innovate ahead of the competition. By deploying information management systems, powerful analytics, and workflow automation, and by driving behavioural change in the workforce, oil and gas companies can evolve how work is performed and build a sustainable, profitable organisation.
Companies must strive to reduce operational costs and their carbon footprints across complex value chains, including a growing ecosystem of business partners.
While SCADA and production measurement software provide features to identify measurement issues, there are further opportunities to perform analytics on measurement and raw data to surface problems earlier.
Forward-looking oil and gas enterprises are installing intelligent pipeline frameworks to turn massive amounts of data into wisdom that generates business value. By using existing operational data as well as new data sources, companies can take a model-focused approach that puts them on the path to operational excellence.
Data with business context
Oil and gas pipeline companies were collecting huge amounts of operational data long before the term Industrial Internet of Things (IIoT) was coined. However, turning vast amounts of raw data from SCADA, pipeline applications, ERP systems, and more into contextualised information around equipment and processes is often challenging. Contextualising this data ultimately enables operational improvement.
Not only does a wealth of raw data, devoid of context, structure, or quality, rarely pay dividends, but those tasked with utilising that data often find it difficult and cumbersome to extract insights. If users are too slow to develop and implement sustainable solutions, the company will accrue significant lost opportunity costs.
When unstructured operational data builds up in data lakes, traditional IT technologies can create more problems than they solve, as businesses must spend more time wrangling data than using it to deliver business value.
Unfortunately, many companies are rushing to layer in new technologies and solutions such as cloud, machine learning, edge, IIoT, and predictive analytics before building the right data and analytics foundation.
Adopting these new solutions can potentially deliver new and valuable insights, but pipeline companies must first enact solid data management and analytics strategies. Deploying an enterprise-level, real-time data management platform lays the foundation for future technology success.
Generating value from data
To produce actionable intelligence, data must be structured and accessible to those who can best use it, particularly subject matter experts who have the knowledge and experience to put data insights into action.
Digital transformation success hinges on having a single source of truth. Operations data must first be standardised and contextualised before it can be analysed and visualised. Comprehensive data management systems can lay the foundation for operations data integration, data validation, and analytics.
A centralised operations data management platform uses standardised, templatised tag-naming conventions, with assets catalogued in a flexible hierarchy. This platform becomes an operational system of record, creating the foundation to democratise insights across any pipeline business model.
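As a minimal sketch of what templatised tag naming and a flexible asset hierarchy can look like, the snippet below builds a small asset tree and generates fully qualified tags from a naming template. The site, asset, and signal names are hypothetical, not a real platform's conventions.

```python
# Sketch of a templatised tag-naming convention over an asset hierarchy.
# Names such as "TX-EAST" and "PUMP-01" are illustrative only.
from dataclasses import dataclass, field

TAG_TEMPLATE = "{site}/{asset}/{signal}"  # e.g. "TX-EAST/PUMP-01/FlowRate"

@dataclass
class Asset:
    name: str
    children: list = field(default_factory=list)
    signals: list = field(default_factory=list)

    def tags(self, site):
        """Yield fully qualified tag names for this asset and its children."""
        for signal in self.signals:
            yield TAG_TEMPLATE.format(site=site, asset=self.name, signal=signal)
        for child in self.children:
            yield from child.tags(site)

pump = Asset("PUMP-01", signals=["DischargePressure", "FlowRate"])
station = Asset("STATION-A", children=[pump], signals=["InletPressure"])
all_tags = list(station.tags("TX-EAST"))
```

Because every tag is generated from one template, downstream analytics can parse any tag back into its site, asset, and signal, which is what makes the catalogue searchable across the whole fleet.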
Using the data model, companies can accelerate digital transformation by combining operational data into a digital replica of physical assets. This can be enabled by developing a digital twin of the entire system using information such as drawings, 3D models, materials, engineering analysis, dimensional analysis, real-time pipeline data, and operational history.
During the operational life cycle, the digital twin is updated automatically, in real time, with current data, work records, and engineering information, to optimise maintenance and operational activities. Engineers and operators can easily search the asset tags to access critical up-to-date engineering and work information in order to diagnose the health of a particular asset.
Previously, such tasks would take considerable time and effort, and issues were often missed, leading to failures or pipeline outages. With the digital twin, operational and asset issues are flagged and addressed early on, and the workflow becomes proactive instead of reactive. Pipeline companies can easily benchmark operational performance, such as pipeline throughput and energy consumption, to uncover gaps and improve pipeline efficiencies.
Use cases for pipeline data
Advanced simulation and analytics tools can be used to model and predict fluid flows in pipelines. This not only enables product and batch tracking and line pack management, but also helps uncover throughput improvements in existing assets. This improved visibility into operations enables pipeline operators to optimise throughput. What’s more, they can plan for future infrastructure expansion to increase efficiencies and throughput to improve competitive advantage.
Advanced simulation tools can be used to model gas flow behaviours and predict loads for current and future gas days in near real time. These insights enable pipeline operators to better balance supply and demand, optimise capacity, and better adhere to gas contracts.
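The balancing idea above can be reduced to a toy calculation: given a demand forecast for the gas day and a target end-of-day linepack, the required supply nomination follows from a simple balance. Real balancing relies on full hydraulic simulation; the figures and function below are invented for illustration.

```python
# Toy gas-day balance: supply - demand = change in stored linepack.
# All quantities are in the same energy or volume units; values invented.

def required_supply(forecast_demand, current_linepack, target_linepack):
    """Supply nomination needed so end-of-day linepack hits its target."""
    return forecast_demand + (target_linepack - current_linepack)

supply = required_supply(
    forecast_demand=120.0, current_linepack=30.0, target_linepack=32.0
)
```

Running the near-real-time forecast through this balance repeatedly during the day is what lets operators adjust nominations before a contract imbalance materialises.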
Quickly identifying pipeline leaks is key to minimising risks and preventing major spill overs, and many leaks go undetected by a single solution. There is no one-size-fits-all solution, and multiple detection technologies are often needed to detect different types of leaks that exhibit dissimilar flow patterns. Analytics can simulate liquids and gas flow and detect any subtle changes that could point to a pipeline leak.
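One of the simplest analytics mentioned above is a volume-balance check: if inflow minus outflow minus the change in linepack stays positive over time, material is leaving the segment unaccounted for. The sketch below is a deliberately simplified illustration; real leak detection layers several complementary methods, and the threshold here is an assumption.

```python
# Hedged sketch of a volume-balance leak indicator. Production systems
# combine multiple detection technologies; the 0.5-unit tolerance is
# illustrative, not an engineering recommendation.

def volume_balance_alarm(inlet_flows, outlet_flows, linepack_change,
                         threshold=0.5):
    """Flag a possible leak when inflow - outflow - d(linepack) exceeds
    a tolerance (all values in the same volumetric units)."""
    imbalance = sum(inlet_flows) - sum(outlet_flows) - linepack_change
    return imbalance > threshold, imbalance

# Balanced segment: inflow matches outflow plus packed volume.
ok_alarm, _ = volume_balance_alarm([100.0], [98.0], 1.9)

# Sustained imbalance: volume is unaccounted for.
leak_alarm, leak_imbalance = volume_balance_alarm([100.0], [95.0], 1.0)
```

In practice this check runs over rolling windows at several timescales, since a small leak only becomes visible once the imbalance persists above meter noise.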
Training controllers is critical for ensuring operational safety and integrity while making sure all operations adhere to a pipeline operator’s safety and compliance program. The pipeline training simulator is an operator training system that allows pipeline controllers to train on normal and abnormal operating scenarios in a safe and realistic environment.
Operators can navigate actual pipeline operations and receive certifications before they assume the roles and responsibilities as a controller in a live operating environment. Major oil and gas operators have significantly reduced training costs and time to proficiency by using simulators as part of enterprise-wide training programs.
The accurate measurement of volumes is extremely important to ensure correct and timely accounting, both internally and at custody transfer points, and pipeline companies must use available data to improve accuracy.
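One way data can improve measurement accuracy is by comparing paired custody-transfer meters over a rolling window, surfacing drift earlier than a single-ticket check would. The sketch below is illustrative; the window contents and the 0.25% tolerance are assumptions, not contractual or standards values.

```python
# Illustrative meter-drift check across paired custody-transfer meters.
# Volumes and the tolerance are invented for the example.

def meter_drift(shipper_volumes, receiver_volumes, tolerance_pct=0.25):
    """Mean percentage difference between two meter runs, and whether
    it exceeds the agreed tolerance."""
    diffs = [
        100.0 * (s - r) / s
        for s, r in zip(shipper_volumes, receiver_volumes)
        if s > 0
    ]
    mean_diff = sum(diffs) / len(diffs)
    return mean_diff, abs(mean_diff) > tolerance_pct

mean_diff, out_of_tolerance = meter_drift(
    [1000.0, 1010.0, 990.0], [996.0, 1006.2, 986.0]
)
```

A persistent one-sided difference like this points to meter drift or a proving issue rather than random noise, which is exactly the kind of early indication raw ticket data alone tends to hide.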
Through analytics, expert systems, and artificial intelligence (AI), future operational systems will become even more automated, from scheduling through to product delivery. This level of autonomous control will be enabled by better data for operator decision making, followed by recommendations and what-if scenarios, and will gradually result in automated control during normal, and eventually abnormal, operations.
Pipeline companies are starting to make use of action sequences, which allow canned sets of commands to be chained together with pre- and post-action verification to ensure the commands are executed safely and correctly. Utilising pipeline data analysis to drive actions using a rule-based approach can enable pipeline companies to improve outcomes when performing operations where the timing of actions is critical.
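The pre- and post-action verification pattern described above can be sketched as a small sequence runner: each step carries a precondition, a command, and a postcondition, and the sequence halts safely the moment any verification fails. The checks and commands here are hypothetical stand-ins for real SCADA operations.

```python
# Sketch of a rule-based action sequence with pre- and post-action
# verification. The valve/pressure step is a hypothetical example.

def run_sequence(steps, state):
    """Execute (name, precondition, command, postcondition) steps in
    order, stopping safely if any verification fails."""
    for name, pre, command, post in steps:
        if not pre(state):
            return f"aborted before '{name}': precondition failed"
        command(state)
        if not post(state):
            return f"aborted after '{name}': postcondition failed"
    return "sequence completed"

steps = [
    ("open valve",
     lambda s: s["pressure"] < 50.0,        # safe to open?
     lambda s: s.update(valve_open=True),   # issue the command
     lambda s: s["valve_open"]),            # confirm the field change
]

state = {"valve_open": False, "pressure": 40.0}
result = run_sequence(steps, state)
```

Keeping the verification logic in data rather than hard-coding it is what lets the same runner enforce timing-critical operations consistently across many different canned sequences.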
The oil and gas pipeline industries are under tremendous pressure to adhere to constantly changing regulatory requirements, increase agility to succeed in a rapidly changing business climate, address cybersecurity concerns, and meet global sustainability goals.