Abstract
Criteria to assess the severity of metal-loss defects began to develop in the early 1970s, with work that about a decade later became the first release of ASME B31G. Motivated by the desire to reduce the conservatism embedded in B31G, Modified B31G was released in the late 1980s, with the same report introducing RSTRENG, which quantified “riverbottom” effects. The desire to avoid excessive conservatism when applied to higher-strength Grades gave rise to alternative criteria: PCORRC appeared in 1997, with early versions of the DNV RP-F101 and British Gas LPC-1 criteria following shortly thereafter. It has since become evident for isolated smooth-bottomed features that, in addition to a feature’s length and depth, its width can be a factor, as can its planar shape and through-thickness profile.
This paper builds on insight gained from that prior work, presenting and validating a Level 1 failure criterion for isolated metal-loss features. The defect-free term of this Level 1 criterion relies on the Zhu-Leis criterion for defect-free pipe failure. That criterion is coupled to a recalibrated defect term analogous to PCORRC, whose extension to include the effects of width is considered. The resulting Level 1 criterion is validated against full-scale tests of pipe with metal loss, comprising a mix of real corrosion and flat-bottomed machined features. These tests span Grades from Gr B to X100 and a wide range of diameters and thicknesses, and in many cases quantify the effect of width. Finite element results are used to illustrate the role of width. Benchmarked against almost 80 full-scale tests, this new approach is shown to effect a reduction in conservatism. At the same time, it provides clear benefits in reduced predictive scatter, reduced required maintenance, and a narrower scope of features that must be considered in field digs.
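As context for the structure of such a criterion, the published building blocks it draws on can be sketched as follows. This is a minimal illustration using the literature forms of the Zhu-Leis defect-free burst equation and the PCORRC defect term, not the recalibrated coefficients or width extension developed in this paper; the function names and inputs are illustrative.

```python
import math

def burst_pressure_zhu_leis(D, t, sigma_uts, n):
    """Defect-free burst pressure per the published Zhu-Leis (average
    shear stress yield) criterion for a thin-walled pipe.

    D: outside diameter, t: wall thickness, sigma_uts: ultimate tensile
    strength, n: strain-hardening exponent. Consistent units assumed.
    """
    # The Zhu-Leis multiplier is the average of the Tresca (1/2) and
    # von Mises (1/sqrt(3)) bases, raised to the (n + 1) power.
    base = (0.5 + 1.0 / math.sqrt(3.0)) / 2.0
    return (4.0 * t / D) * sigma_uts * base ** (n + 1.0)

def failure_pressure_pcorrc(D, t, sigma_uts, d, L):
    """Failure pressure per the published PCORRC form for an isolated
    smooth metal-loss feature of depth d (< t) and axial length L.
    """
    R = D / 2.0
    # Length-dependent term: deep/long features approach d/t reduction,
    # short features recover the defect-free strength.
    M = 1.0 - math.exp(-0.157 * L / math.sqrt(R * (t - d)))
    return (2.0 * t * sigma_uts / D) * (1.0 - (d / t) * M)
```

A Level 1 criterion of the kind described couples a defect-free term like the first function to a defect (knockdown) term like the bracketed factor in the second; the paper's contribution is a recalibration of that coupling and its extension to feature width.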