Deterministic design and assessment methods are by definition conservative. Although no claim is made regarding the actual reliability level achieved by deterministic (i.e., safety-factor-based) approaches, the safety factors have been selected so that, in general, sufficient conservatism is maintained. Reliability-based methods aim to explicitly quantify the aggregated conservatism in terms of failure probabilities or risk. Accurate reliability estimates are not possible without accurate computational prediction models for the limit states and adequate quantification of the uncertainties in both the inputs and the model assumptions. Although this statement may seem self-evident, it should not be taken lightly. In fact, nearly every analysis step in pipeline integrity assessment procedures contains an inherent, yet unquantified, level of conservatism. One such example is the application of a “maximum” corrosion growth rate that is constant in time.
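To make the “maximum rate” approach concrete, the following is a minimal sketch (not taken from the paper; all names and values are illustrative assumptions) of how a measured defect depth is typically projected forward with a single, time-constant upper-bound growth rate:

    # Illustrative sketch of the deterministic "maximum corrosion growth rate" projection.
    # A measured defect depth is grown linearly at one constant rate chosen near the
    # upper tail of observed growth rates; names and numbers are assumptions.

    def projected_depth(d_measured_mm: float, rate_max_mm_per_yr: float, years: float) -> float:
        """Grow a measured defect depth linearly at a constant 'maximum' rate."""
        return d_measured_mm + rate_max_mm_per_yr * years

    # Example: a 3.2 mm defect grown for 10 years at an assumed 0.3 mm/yr upper-bound rate.
    print(projected_depth(3.2, 0.3, 10.0))  # 6.2 mm, regardless of the actual growth history

The conservatism lies in applying one upper-bound rate to every defect over the entire interval, irrespective of how each defect has actually grown.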

A reliability-based framework holds the promise of a more consistent and explicitly quantified safety level, which ultimately leads to higher safety efficiency for an entire pipeline system than safety-factor-based approaches can achieve. An accurate prediction of the true likelihood of an adverse event is impossible without significant research into determining and understanding the (usually conservative) bias in the engineering models currently employed in the pipeline integrity state of the practice. This paper highlights some of the challenges associated with porting the “maximum corrosion rate” approach used in deterministic assessments to a reliability-based paradigm. Issues associated with both defect-matching and segment-matching approaches will be highlighted, and a better corrosion growth model form will be proposed.
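For contrast with the deterministic sketch above, the following is a generic Monte Carlo illustration of the reliability-based paradigm, in which the measured depth and the growth rate are treated as random variables and a probability of exceeding a depth limit is estimated. The distributions and parameters are illustrative assumptions only and are not the corrosion growth model form proposed in the paper:

    # Generic Monte Carlo sketch: uncertain measured depth and growth rate are sampled
    # and the probability of exceeding a depth limit after a given interval is estimated.
    # Distributions and parameters are illustrative assumptions, not the paper's model.
    import random

    def exceedance_probability(n_samples: int = 100_000,
                               depth_limit_mm: float = 8.0,
                               years: float = 10.0) -> float:
        exceed = 0
        for _ in range(n_samples):
            d0 = random.gauss(3.2, 0.5)              # measured depth with ILI sizing uncertainty (mm)
            rate = random.lognormvariate(-2.0, 0.5)  # uncertain growth rate (mm/yr), lognormal
            if d0 + rate * years > depth_limit_mm:
                exceed += 1
        return exceed / n_samples

    print(f"P(depth > limit) = {exceedance_probability():.4f}")

Even this simple formulation exposes the questions the paper addresses: how the rate distribution should be inferred from matched inspection data, and whether a rate that is constant in time is an adequate model form.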
