In-line inspection (ILI) tools, including Magnetic Flux Leakage (MFL) and Ultrasonic (UT) technologies, are commonly used to detect and size potential anomalies in oil and gas pipelines. Some ILI-reported anomalies are selected for excavation and validated through field non-destructive examination (NDE) techniques. Both ILI and NDE readings are contaminated with measurement error, which typically originates from inherent tool limitations and capabilities, measurement technique, and/or human factors. The intent of this paper is to calibrate corrosion ILI data against NDE measurements, given estimated statistical errors from both tools. ILI and field measurements are commonly compared graphically in a unity plot. Herein, a linear relationship between ILI and NDE measurements is assumed; this in turn implies a linear relationship between the ILI measurement and the true value, and likewise between the NDE measurement and the true value. An advanced statistical approach based on linear regression and maximum likelihood is used to determine the uncertainty of both the ILI and NDE measurement errors. The method first quantifies the uncertainty of the ILI and field measurements and then calibrates the ILI data against the field data using the estimated tool errors. The tool errors are estimated by minimizing the relative error between the ILI and field measurements. The calibration methodology applies advanced statistics to improve both the accuracy and the precision of the measurement data. The proposed process is validated against results from successive ILI programs. The calibration can be readily implemented in a ubiquitous spreadsheet environment and applied to both corrosion and crack measurements.
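As a rough illustration only (not the authors' implementation, which is available in the full paper), a linear errors-in-variables calibration of the kind described above can be sketched with a Deming regression: the maximum-likelihood fit of a line when both variables carry Gaussian measurement error and the ratio of the two tools' error variances is assumed known. All depths, biases, and error magnitudes below are hypothetical.

```python
import numpy as np

def deming_fit(x, y, delta):
    """Maximum-likelihood line y = a + b*x when both x and y carry
    Gaussian measurement error and delta = var(err_y)/var(err_x)
    is assumed known (Deming regression)."""
    xbar, ybar = x.mean(), y.mean()
    sxx = np.sum((x - xbar) ** 2)
    syy = np.sum((y - ybar) ** 2)
    sxy = np.sum((x - xbar) * (y - ybar))
    b = (syy - delta * sxx
         + np.sqrt((syy - delta * sxx) ** 2 + 4.0 * delta * sxy ** 2)) / (2.0 * sxy)
    a = ybar - b * xbar
    return a, b

# Hypothetical data: true defect depths (% wall thickness), an ILI tool
# with a multiplicative bias and larger scatter, and an unbiased field NDE.
rng = np.random.default_rng(0)
true_depth = rng.uniform(10, 60, 500)
ili = 0.9 * true_depth + 2.0 + rng.normal(0.0, 3.0, 500)
nde = true_depth + rng.normal(0.0, 1.5, 500)

delta = 1.5 ** 2 / 3.0 ** 2        # assumed known NDE/ILI error-variance ratio
a, b = deming_fit(ili, nde, delta)
calibrated_ili = a + b * ili       # ILI readings mapped onto the NDE scale
```

Ordinary least squares of NDE on ILI would be biased toward zero here because the regressor (ILI) itself is noisy; the errors-in-variables fit recovers the underlying slope, which is the point of treating both tools' errors explicitly.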
