Insight hub: Pipeline data assessment and analysis

W. ALLANBROOK, Dynamic Risk, Alberta, Canada

The Pipeline Safety, Regulatory Certainty and Job Creation Act of 2011 (2011 Pipeline Safety Act) was enacted by the U.S. Congress on January 3, 2012.1 Section 23 of the 2011 Pipeline Safety Act requires the verification of records for pipes in Class 3 and Class 4 locations and high consequence areas (HCAs) in Class 1 and Class 2 locations. To address these and other requirements specified in the 2011 Pipeline Safety Act, the U.S. Department of Transportation (DOT) Pipeline and Hazardous Materials Safety Administration (PHMSA) published a final rule titled “Pipeline Safety: Safety of Gas Transmission Pipelines: Maximum Allowable Operating Pressure (MAOP) Reconfirmation, Expansion of Assessment Requirements and Other Related Amendments” with an effective date of July 1, 2020.2

Within this final rule, PHMSA developed and published regulations under §192.607, Verification of pipeline material properties and attributes: Onshore steel transmission pipelines, and §192.624, Maximum allowable operating pressure reconfirmation: Onshore steel transmission pipelines. Additionally, certain sections within Subpart M (Maintenance) and Subpart O (Gas Transmission Pipeline Integrity Management) were revised or added.

The requirements under §192.607, §192.624, Subpart M and Subpart O (gas transmission pipeline integrity management) rely on pipeline operators having reliable and accurate data related to pipeline construction, operations and inspection attributes. This article describes how pipeline material and attribute data may be ingested into a multi-operator data warehouse for the purpose of: 

  • Assessing the quality and reliability of an operator’s pipeline material and attribute data 
  • Analyzing an operator’s data against combined industry data 
  • Evaluating: 
    • The level of confidence in the operator’s data 
    • Specific opportunities for improving data quality and accuracy 
    • Specific types of pipeline integrity analysis that the data will support 
    • Specific data fields where the operator should focus data improvement resources to support additional types of pipeline integrity analysis. 

The data challenge. Effective integrity management and compliance with newly released regulatory requirements rely on accurate and reliable data. Many operators have data management systems (DMSs) that contain lengthy datasets representing their pipeline systems. These DMSs may include geographic information systems, maintenance management systems, other corporate data systems and publicly available information. In many cases, these datasets have evolved over decades originating with the conversion of hardcopy drawings and records. These datasets may have been augmented through additional record analysis [particularly identifying which data is supported by traceable, verifiable and complete (TVC) records], additional inspection results [e.g., in-line inspection (ILI) and direct assessment], repair data, aerial and satellite imagery (to update pipeline alignment) and other sources.   

In addition to data management system content, the processes and systems used to create, modify, establish TVC status, perform quality assurance and quality control (QA/QC), and organize transmission pipeline data for use have evolved. Some pipeline systems, and their associated data, may have been acquired through mergers and acquisitions and carry uncertainties related to the provenance of the data and its quality and reliability.

A unique challenge is understanding the current level of quality and reliability of these evolving datasets and how the processes and analyses that rely on these data may be impacted. Depending on the pipeline integrity or regulatory compliance analysis being conducted, the quality and reliability requirements for specific data elements can vary. For example, confirming the MAOP as required in §192.624 requires TVC records for certain material properties and attributes. While operators have spent time and effort evaluating records and determining which material properties are supported by TVC records, questions related to the efficacy of the TVC process and the reliability of specific attributes or groups of attributes may remain. Other integrity analyses (e.g., external corrosion direct assessment, internal corrosion direct assessment, dent strain analysis and pressure cycle fatigue analysis, among many others) require material properties and other data fields to be completed. Due to the complexity of these datasets, it can be difficult to determine how to efficiently and effectively deploy resources focused on enhancing data quality.

Assessing data quality and reliability. Data quality and reliability are measured for every field in an operator’s pipeline material and integrity data. Automated processes dissect each data field, identify values that conform to industry norms and detect outliers. Each field is scored for data quality based on the consistency of the data. For example, yield strength may be expressed in various formats (TABLE 1).

The American Petroleum Institute (API) Specification 5L Section 10.3.5 provides specifics on the format used to express pipe grade: the symbol is an “X” followed by the first two digits of the specified minimum yield strength in U.S. customary units (e.g., X42, X-42, API 5L X-42, API X-42).3 While these examples are understood to be congruent when viewed by the human eye, automated processes or systems that require data from this field to perform analyses may falter when confronted with such data inconsistencies. The consistency—or lack thereof—in the use of a specific value is an indicator of the outcome of QC performed during data loading and maintenance processes.
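The grade-normalization step described above can be sketched in a few lines. This is a minimal illustration, not Insight Hub's actual implementation: it assumes the hypothetical variant strings shown and maps them to the canonical API 5L form.

```python
import re

def normalize_grade(raw):
    """Map common pipe-grade spellings (e.g., 'X42', 'X-42', 'API 5L X-42',
    'API X-42') to the canonical API 5L form 'X42'.
    Returns None when no grade pattern is found."""
    match = re.search(r"X\s*-?\s*(\d{2,3})", raw.upper())
    return f"X{match.group(1)}" if match else None

# The four congruent spellings from the text all normalize to "X42":
for value in ["X42", "X-42", "API 5L X-42", "API X-42"]:
    print(normalize_grade(value))  # → X42 each time
```

A normalization pass like this is typically run before any scoring, so that downstream comparisons operate on canonical values rather than on formatting variants.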

Insight Hub contains more than 144,000 km of transmission pipeline data from more than 20 operators.  Data quality is measured for every data field through an automated process of comparing the unique values within an operator’s data to industry norms. While pipeline systems may have unique characteristics, knowing where material and attribute data fall outside industry norms provides actionable indications of potential data inconsistencies or errors. 
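One way to picture the automated comparison of an operator's unique values against industry norms is a per-field conformance score. The scoring rule below (share of entries matching a reference set, plus a list of outliers) is an illustrative assumption; the actual Insight Hub methodology is not published.

```python
from collections import Counter

def score_field(values, known_values):
    """Score a data field 0-1 by the share of non-empty entries whose
    cleaned value appears in a reference set of industry-normal values.
    Also returns the outlier values for review (illustrative rule only)."""
    counts = Counter(v.strip().upper() for v in values if v)
    total = sum(counts.values())
    conforming = sum(n for v, n in counts.items() if v in known_values)
    outliers = sorted(v for v in counts if v not in known_values)
    return (conforming / total if total else 0.0), outliers

# Hypothetical grade field with one formatting variant and one bad entry:
score, outliers = score_field(
    ["X42", "X-42", "X52", "UNKNOWN", "X42"],
    {"X42", "X52", "X60", "X65"},
)
# score → 0.6; outliers → ["UNKNOWN", "X-42"]
```

Fields scoring well below their industry peers are exactly the "actionable indications of potential data inconsistencies or errors" the article describes.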

Data validation. Data is further validated by comparing combinations of material properties and attributes to trusted sources.4 Comparing pipeline populations as defined under §192.607(e)(1), or other combinations of material and attribute data, with trusted sources and identifying unlikely or impossible combinations provides additional indicators of the quality and reliability of material and integrity data.
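A combination check of this kind can be sketched as a plausibility filter, for example flagging a recorded pipe grade against an installation year that predates the grade's commercial availability. The introduction years below are rough placeholders for illustration only, not authoritative values from the cited line-pipe history.

```python
# Placeholder earliest-availability years for selected API 5L grades
# (illustrative values only -- consult a trusted source for real analysis).
EARLIEST_YEAR = {"X42": 1948, "X52": 1953, "X60": 1962, "X70": 1970}

def implausible_combinations(segments):
    """Yield (segment_id, grade, year) tuples where the recorded pipe grade
    was not yet available in the recorded installation year."""
    for seg_id, grade, year in segments:
        earliest = EARLIEST_YEAR.get(grade)
        if earliest is not None and year < earliest:
            yield seg_id, grade, year

# Hypothetical segment records: (id, grade, installation year)
flags = list(implausible_combinations([
    ("A-001", "X52", 1949),  # flagged: X52 predates its placeholder year
    ("A-002", "X42", 1960),  # plausible combination, not flagged
]))
```

The same pattern extends to other attribute pairs (e.g., diameter versus seam type, or manufacturer versus vintage) whenever a trusted source defines which combinations could actually have been produced.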

Evaluation. Combining strong data with rigorous risk models supports actionable risk mitigation strategies. Synthesizing the results of data quality and reliability assessments with the validation of data against industry norms identifies the key strengths and uncertainties associated with an operator’s pipeline system data. Quantifying these results provides operators with a basis to understand the reliability of analysis based on this data and where results may contain uncertainty. This is particularly important when performing engineering critical assessments to reconfirm the MAOP under §192.624(c)(3), among other pipeline integrity analyses. 

Characterizing uncertainties provides direction for the development of action plans targeting the enhancement of data quality for specific subsets of data. This leads to measurable changes in the reliability of results. Uncertainties within material and attribute data also indicate certain integrity analyses may not be possible; for example, if coating data is unreliable or unavailable, the resulting predictive analysis requiring coating data as an input contains uncertainties or the analysis may not be possible. This knowledge—in addition to an understanding of the challenges operators face associated with a pipeline—leads to the identification and prioritization of data fields that will support additional types of integrity predictive analysis and lead to improvements to pipeline safety. 

Takeaway. Addressing the data challenge by quantifying an operator’s current level of data quality and reliability, and the impact on the processes and analyses that rely on these data, yields key insights. These insights provide a basis for planning enhancements that will measurably improve data quality and increase the reliability of integrity analysis processes.

LITERATURE CITED 

1 Authenticated U.S. Government Information, “Pipeline Safety, Regulatory Certainty and Job Creation Act of 2011,” 2012, online: https://www.govinfo.gov/content/pkg/PLAW-112publ90/pdf/PLAW-112publ90.pdf 

2 Federal Register, “Pipeline safety: Safety of gas transmission pipelines: MAOP reconfirmation, expansion of assessment requirements, and other related amendments: Response to a joint petition for reconsideration,” July 6, 2020, online: https://www.federalregister.gov/documents/2020/07/06/2020-14403/pipeline-safety-safety-of-gas-transmission-pipelines-maop-reconfirmation-expansion-of-assessment#:~:text=On%20October%201%2C%202019%2C%20%2884%20FR%2052180%29%20PHMSA,improve%20the%20safety%20of%20onshore%20gas%20transmission%20pipelines 

3 API, “API Specification 5L,” 46th Ed., online: https://www.api.org/products-and-services/standards/important-standards-announcements/standard-5l 

4 ASME, “History of line pipe manufacturing in North America,” 1996, online: https://www.asme.org/publications-submissions/books/find-book/history-line-pipe-manufacturing-north-america/1996/print-on-demand-books 
