How big data from pipeline pigs can root out gas maintenance risks

Not all gas pipelines can be serviced by maintenance pigs, but those that can are providing mountains of vital sensor data.
Written by Stig Øyvann, Contributor

Only a minority of the world's pipelines can be cleaned with pigs like this one.

Image: Getty Images/iStockphoto

Around one-third of the world's gas pipelines can be inspected using 'pipeline pigs'. These cylindrical devices have figured in three James Bond movies, in one case being modified to secretly transport a person across borders.

Back in the real world, the problem is that the other two-thirds of the world's gas pipelines can't be inspected using pigs, so their maintenance has to be based on the theoretical, computed lifespan of the pipe's components.

Originally, pigs carried simple steel brushes for cleaning the inside of pipelines. But these days they can also be crammed with sensors to measure the state and condition of the pipeline. Among other things, they can measure the thickness of the pipe wall, to determine when the pipeline needs maintenance due to corrosion.

Without those sensors, pipeline operators have to make very conservative calculations to keep the structures safe and operational. This approach results in unnecessarily frequent maintenance and hence unnecessarily high costs for the operator.
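As a rough illustration of that gap, consider a remaining-life estimate based on an assumed corrosion rate versus one based on actual pig measurements. All figures below are hypothetical and not from DNV GL; this is only a sketch of the general principle:

```python
# Hypothetical remaining-life estimate for a pipeline wall:
# compare a conservative, assumed corrosion rate with a rate
# derived from actual pig measurements of wall thickness.
def remaining_life_years(current_mm, minimum_mm, rate_mm_per_year):
    """Years until the wall corrodes down to the minimum allowed thickness."""
    return (current_mm - minimum_mm) / rate_mm_per_year

nominal_mm, minimum_mm, age_years = 12.0, 8.0, 10.0

# Conservative design assumption: 0.25 mm/year wall loss since installation.
conservative = remaining_life_years(nominal_mm - 0.25 * age_years,
                                    minimum_mm, 0.25)

# Pig measurement shows only 1.0 mm actually lost in 10 years (0.1 mm/year).
measured_mm = nominal_mm - 1.0
measured = remaining_life_years(measured_mm, minimum_mm, 1.0 / age_years)

print(conservative, measured)  # prints 6.0 30.0
```

With the measured corrosion rate, the estimated remaining life is several times longer, which is why operators without sensor data end up maintaining far more often than may be necessary.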

A better way may be to use predictive maintenance programs using machine learning and big data gathered from the pipelines that can be measured with pipeline pigs. That's the view of specialists at Norwegian firm DNV GL Software, the software business unit that emerged from the Det Norske Veritas-Germanischer Lloyd merger in 2013.

A pipeline pig with instruments can generate vast amounts of data. "The pigs that measure pipeline wall thickness come in two varieties: one creeps through the pipe and uses ultrasound to measure at regular intervals; the other is based on magnetism and measures more continuously. This generates large amounts of data. We're talking terabytes here," Jo Øvstaas, head of design and engineering at DNV GL Software, tells ZDNet.

To model corrosion, the wall thickness of the pipeline itself is a crucial data point, but there are also several other relevant data sources.

"We're combining the data from the pigs with soil data, among other things, such as the pH values of the ground where the pipeline lies, because this is relevant for corrosion. We've got data from weather stations nearby, and other information we know is tied to corrosion in some way," Øvstaas says.
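The article doesn't describe DNV GL's data model, but joining such sources by pipeline segment might look something like the following sketch. All field names and values are hypothetical:

```python
# Hypothetical join of pig measurements with environmental data per
# pipeline segment, producing one feature row per segment for modelling.
pig_runs = {  # segment id -> measured wall thickness (mm) from the pig
    "seg-001": 11.2, "seg-002": 10.4, "seg-003": 11.8,
}
soil = {  # segment id -> soil pH at the segment's location
    "seg-001": 6.8, "seg-002": 5.1, "seg-003": 7.2,
}
weather = {  # segment id -> mean annual rainfall (mm) from nearby stations
    "seg-001": 820, "seg-002": 1140, "seg-003": 760,
}

features = [
    {"segment": s, "wall_mm": pig_runs[s], "ph": soil[s], "rain_mm": weather[s]}
    for s in sorted(pig_runs)
]
# Each row can now feed a corrosion model, for example to predict wall loss rate.
```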

The specialists at DNV GL Software saw the potential in their available pipeline data, and started a study on how to put it to practical use. Predictive maintenance based on machine learning soon emerged as an attractive opportunity, according to Øvstaas.

"Just a few years ago, machine learning put very high demands on data-analysis competence and dedicated hardware, as well as requiring deep knowledge of statistics and probability calculations. Today, the technologies have matured, and you can use specialized tools for this. We use Microsoft Azure Machine Learning Studio, together with the Azure ML algorithm cheat sheet," he says.
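DNV GL's actual models aren't described in the article, but the basic idea, fitting measured wall loss against an environmental variable and using the fit to predict corrosion on other segments, can be sketched with a one-variable least-squares fit. All numbers here are hypothetical:

```python
# Minimal one-feature linear regression (ordinary least squares) relating
# soil pH to observed corrosion rate; purely illustrative numbers.
def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

ph = [5.0, 5.5, 6.0, 6.5, 7.0]         # soil pH per inspected segment
loss = [0.30, 0.25, 0.20, 0.15, 0.10]  # measured wall loss rate (mm/year)

slope, intercept = fit_line(ph, loss)

def predict(x):
    return slope * x + intercept

# A segment with soil pH 5.8 would be predicted to corrode at
# predict(5.8) mm/year, i.e. more acidic soil means faster wall loss.
```

In practice a real model would use many more features (coating condition, temperature, moisture, pipe age) and a more capable learner, which is where hosted tooling like Azure Machine Learning Studio comes in.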

DNV GL ran a five-day hackathon with these tools and data, and was surprised at how good the available software has become. "We've come this far because we have a very good algorithm whose predictions match true, measured data very well. It has a high hit rate," Øvstaas says.

Work remains to commercialize this technology. So far, DNV GL has focused on the technical side; the business value, such as the cost savings for pipeline operators, hasn't been quantified yet. However, the company has no doubts about the technology's potential.

DNV GL director of digital transformation Tormod Svensen says the firm is already working with certain customers on the new product. "We'll commercialize this technology before the end of this year," he says.
