Integrity management is based on the ability of the pipeline operator to predict the growth of defects detected in inspection programs on an operating pipeline system. Accurate predictions allow targeted interventions to be scheduled in a cost-effective and timely fashion for those defects that pose a high potential risk. In this paper, two distinct theories are described for predicting the development of corrosion pits on an operating pipeline. The first theory corresponds to the traditional approach, in which the past growth behaviour of each defect is used to predict the rate of its future development. In this theory, each defect is assumed to have its own unique corrosion environment in which only a very limited range of corrosion rates will be seen. In the second approach, this assumption is not made. Instead, any corrosion defect is allowed to grow at any likely rate over any time interval. In this approach, a random selection of corrosion rates, drawn from the overall profile of past rates seen for all defects, is applied to each defect over time. Predicted distributions derived by computer simulation of the initiation and growth of corrosion defects according to each theory have been compared to an actual defect depth distribution derived by in-line inspection (ILI) of an operating pipeline. The success of the two models is compared and implications for pipeline integrity management are discussed.
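The contrast between the two theories can be illustrated with a minimal Monte Carlo sketch. This is not the authors' simulation; the rate pool, defect count, and time horizon below are hypothetical placeholders. Model 1 draws one rate per defect and keeps it for the defect's whole life (defect-specific environment); Model 2 re-draws a rate from the pooled rate profile at every time step (any defect may grow at any likely rate in any interval).

```python
import random

random.seed(42)  # reproducible illustration

def simulate_fixed_rate(n_defects, n_years, rate_pool):
    """Model 1: each defect keeps its own corrosion rate for life."""
    depths = []
    for _ in range(n_defects):
        rate = random.choice(rate_pool)      # rate fixed once per defect
        depths.append(rate * n_years)
    return depths

def simulate_random_rate(n_defects, n_years, rate_pool):
    """Model 2: a fresh rate is drawn from the pooled profile each year."""
    depths = []
    for _ in range(n_defects):
        depth = sum(random.choice(rate_pool) for _ in range(n_years))
        depths.append(depth)
    return depths

# Hypothetical pooled rate profile (mm/yr) -- not real ILI data
rates = [0.05, 0.1, 0.1, 0.2, 0.4]

fixed = simulate_fixed_rate(1000, 10, rates)
mixed = simulate_random_rate(1000, 10, rates)
```

Both models produce the same mean depth, but Model 1 yields a wider depth distribution: a defect that draws the highest rate keeps it for all ten years, whereas under Model 2 yearly re-draws average out. Comparing each predicted distribution against an ILI-measured depth distribution is then what discriminates between the theories.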
