The consequences of a dynamic fracture in a gas-transmission pipeline require that such pipelines be designed to avoid these incidents with great certainty. Because of the complexity of the fracture process, the only certain approach to determining fracture-arrest conditions involved full-scale experiments. Over time, empirically calibrated balance equations relating the crack-driving conditions to the line-pipe steel's crack-arrest capability were developed. Such models worked well until the introduction of high-toughness line pipe, for which the full-scale test predictions were non-conservative, and increasingly so as toughness increased. Problems with early CVN-based models led to the development of alternative schemes.

This paper presents results of experiments done to evaluate plausible alternatives to the CVN practice that rely on an impact test identical to, or adapted from, the drop-weight tear test (DWTT). Because this practice is comparable to the CVN practice except for its use of an up-scaled specimen geometry, results are presented and contrasted for these test methods, for pipe grades from B to X70 and toughness from less than 10 J to in excess of 300 J. Data are analyzed to reveal trends not typically reported for such testing. It is shown that there is no essential difference between data developed from the CVN and DWTT practices, provided the results are compared at similar levels of impact-machine excess-energy capacity. Further, it is shown that non-conservative predictions of full-scale test behavior for higher-toughness steels can be traced to using the early CVN-based models at toughness levels well outside the range of their empirical calibration.
