Industry Insights

In uniform

Oil and gas industry adopts standard model to ensure data integrity.

Over the past ten years, the oil and gas industry has turned to geographic information system (GIS) technologies as the backbone for efficiently constructing and maintaining pipeline systems. The effectiveness of these solutions depends on the detail, reliability and relevance of the data describing the condition and functioning of the unified gas supply system. Acquiring, systematizing and updating GIS data, then unifying and classifying that data, should be performed within the framework of an integrated corporate spatial data infrastructure.

For this collaborative environment to be useful, the key is unifying the data's meaning along with the shared technologies needed to acquire and access it. Because each department and business unit is the sole custodian of its data, each is also responsible for maintaining it: formulating its own data acquisition and analysis procedures, performing classification tasks, developing the necessary software and resolving format compatibility problems. The business unit must also integrate and generalize the data and determine its spatial referencing. These challenges necessitated an integrated spatial data infrastructure capable of more effectively acquiring, processing, storing and leveraging the spatial data from oil and gas production and transportation facilities. Such integrated data sets or databases enable optimum management of the pipeline companies' assets, as well as the development and integrity of the pipeline systems.

Settling on a model

The oil and gas industry made several attempts to implement relatively inexpensive solutions aimed at leveraging existing data and software. Eventually the industry settled upon a standard data model to collaboratively acquire, integrate and analyze raw data from service companies, software developers, suppliers and system integrators. With this collective data, any number of departments can access strategically important information for various purposes. For instance, the pipeline data model provides direct access to the location, suppliers, warranties, work done and operation history for each pipeline component. With this detail, operators can plan resource use; develop work schedules; more accurately determine requirements for specialists, equipment, materials and tools; and prepare documents in accordance with industry and federal rules and standards.

These capabilities help improve service continuity and gas supply stability by combining preventive scheduled maintenance and on-condition maintenance efficiently. In the end, using database information increases pipeline reliability and throughput, and minimizes accident risk, service interruptions due to failures, and operating costs.

Furthermore, by updating the database with information obtained during direct and indirect instrumental and visual inspections of the gas line, operators can determine when and where an excavation is needed to inspect the pipe in greater detail.

Teradata becomes a member of the PODS Association

As a way to allow major oil and gas pipeline operators to quickly migrate from paper and legacy databases into a standards-based data model, the PODS Association created the Pipeline Open Data Standard (PODS) Data Model. The most widely implemented pipeline data model in the industry, PODS defines the data format that uniformly and precisely visualizes the design and technical condition of various pipeline systems.

A unified structure of cross-country pipeline data, PODS creates a basis for joining efforts of service companies, software developers and suppliers, and system integrators. This collaborative endeavor helps establish effective information systems on various lines of activity of oil and gas companies.

PODS is implemented as a relational database in Oracle, Microsoft SQL Server, and Teradata Database. The model can also be spatially enabled, providing tight integration with leading geographic information system software platforms. The PODS Association is a nonprofit pipeline trade association whose members include pipeline operators, service providers, data providers and governmental agencies that work together to develop and manage data standards for the pipeline industry. Teradata has been a member since January 2009.


It’s in the details

A standard oil and gas pipeline database contains various sets of data used to evaluate possible trouble sources:

  • Pipeline elements, such as diameter, wall thickness, pipe manufacturer and connection types
  • “As-built” data, with information on installation date, welding procedure, backfill thickness and soil, coating, hydraulic test data and inspection reports, etc.
  • Operation data, including cathodic protection condition, pipe wall temperature, corrosion data, leaks and repair reports, gas composition, operating pressure and flow rate
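To make these data sets concrete, the following sketch shows what a heavily simplified pipeline database of this kind might look like. The table and column names here are illustrative assumptions, not the actual PODS schema, which is far more extensive.

```python
import sqlite3

# Illustrative sketch only: table and column names are simplified
# assumptions, not the actual PODS data model.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE pipe_segment (
    segment_id      INTEGER PRIMARY KEY,
    diameter_mm     REAL,
    wall_mm         REAL,
    manufacturer    TEXT,
    connection_type TEXT,
    install_date    TEXT,   -- "as-built" data
    coating         TEXT
);
CREATE TABLE inspection (
    segment_id   INTEGER REFERENCES pipe_segment(segment_id),
    inspected_on TEXT,
    wall_loss_mm REAL,      -- corrosion measured during inspection
    leak_found   INTEGER    -- 0 = no leak, 1 = leak
);
""")

cur.execute("INSERT INTO pipe_segment VALUES "
            "(1, 720.0, 10.0, 'Acme Pipe', 'welded', '1998-06-12', 'epoxy')")
cur.execute("INSERT INTO inspection VALUES (1, '2010-04-02', 1.8, 0)")

# Any department can pull design data and operating history in one query.
row = cur.execute("""
    SELECT p.manufacturer, p.install_date, i.wall_loss_mm
    FROM pipe_segment p JOIN inspection i USING (segment_id)
""").fetchone()
print(row)  # ('Acme Pipe', '1998-06-12', 1.8)
```

The point of the shared model is that pipeline elements, "as-built" records and operation data live in one relational structure, so a single query can combine them.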

Using the information gleaned from the database, the operator implements a method called Managing Pipeline Integrity. Based on the adopted data model, the method enables the operator to evaluate pipe metal loss using techniques such as in-line flaw detection and hydraulic testing. The evaluation results are integrated with data on other devices and with gas supply schedule information, then stored in the same database for further use.
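A minimal sketch of such an evaluation step follows. It flags segments whose measured wall loss exceeds a threshold fraction of nominal wall thickness, marking them for detailed excavation inspection; the 20 percent threshold and the kilometer-post locations are assumed example values, not an industry standard.

```python
# Illustrative sketch of a pipeline-integrity screening step.
# The 20% wall-loss threshold is an assumed example value.

def needs_excavation(nominal_wall_mm, measured_loss_mm, threshold=0.20):
    """Return True if metal loss exceeds the assumed threshold fraction."""
    return measured_loss_mm / nominal_wall_mm > threshold

# Hypothetical inspection records: (location, nominal wall mm, measured loss mm)
segments = [
    ("KP 12+300", 10.0, 1.2),
    ("KP 45+050", 10.0, 2.6),
    ("KP 78+900", 12.0, 0.4),
]

# Segments exceeding the threshold are candidates for excavation.
flagged = [loc for loc, wall, loss in segments if needs_excavation(wall, loss)]
print(flagged)  # ['KP 45+050']
```

Because the flagged results go back into the same database, the next inspection cycle can compare them against prior measurements rather than starting from scratch.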

For example, when the designer issues to the contractor the usual detailed design documents and project database, the database tables list the structural elements, material types and grades, types of equipment, and their linear and geodetic coordinates. The contractor can use this data to prepare the work statement, purchase materials and equipment, identify necessary resources, specialists and tools, and compile the work schedule.
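As a hedged illustration of that handoff, the snippet below aggregates a hypothetical extract of such a project-database table into per-material quantities, the kind of summary a contractor might use to draft a purchase list. Element kinds, grades and lengths are invented for the example.

```python
from collections import defaultdict

# Hypothetical extract of a project-database table: each row lists a
# structural element, its material grade and its length in metres.
elements = [
    ("pipe", "X70", 11.6),
    ("pipe", "X70", 11.6),
    ("bend", "X70", 1.2),
    ("pipe", "X65", 11.6),
]

# Aggregate quantities per (element, grade) to draft a purchase list.
totals = defaultdict(float)
for kind, grade, length_m in elements:
    totals[(kind, grade)] += length_m

for (kind, grade), total in sorted(totals.items()):
    print(f"{kind} {grade}: {total:.1f} m")
# bend X70: 1.2 m
# pipe X65: 11.6 m
# pipe X70: 23.2 m
```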

During construction, the contractor or subcontractor enters into the database the quantities of materials used, the work performed, and the types and geodetic coordinates of installed parts, structures and other equipment. The subcontractors submit the findings of hydraulic, insulation and electrochemical protection system tests, in-line flaw detection data and stress-test results. As a result, once the pipeline construction is finished, a complete database reflecting its condition and the history of its creation will be prepared, turned over to the operator and used for pipeline integrity management during further maintenance.

Better information

Pipeline system databases integrated with computing, information-analysis software and electronic workflow tools will ensure data management throughout the pipeline life cycle. From the data acquired during field investigations for a new project, through storage of the "as-built" survey, to the data gathered during condition inspections, operational control of gas supply, risk assessment, and development of the equipment maintenance and replacement strategy, this integration will provide operators with better information for critical decision making.
