In this paper we make two key contributions. First, we formalize the effectiveness of fault detection and isolation (FDI) with a metric that jointly accounts for the variance of engine-parameter estimate residuals under normal conditions, the costs of missed detections and false alarms, the costs of fault misclassification, and fault frequencies and severities. Reducing residual variance increases the signal-to-noise ratio, thereby improving the reliability and speed of fault-detection algorithms. Minimizing missed detections has major implications for operational safety, while minimizing false alarms and fault misclassifications reduces downtime, asset-management burden, cannot-duplicate events, and operating costs. The metric thus quantifies the trade-off among reducing residual variance, balancing false alarms against missed detections, and avoiding fault misclassification. As a second contribution, we embed this metric in a systematic, data-driven diagnostic optimization process that supports normative decisions on input-parameter selection for residual generation, FDI methods, and prediction/classification fusion techniques.
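The metric described above is not specified in this abstract; as a rough illustration only, a frequency- and severity-weighted expected-cost score of the kind sketched here could combine the named terms. All function and variable names, the cost structure, and the numbers are hypothetical assumptions, not the paper's formulation.

```python
import numpy as np

def fdi_expected_cost(freq, sev, p_miss, p_fa, conf_mat,
                      c_miss, c_fa, c_misclass):
    """Hypothetical expected per-operation cost of an FDI scheme.

    freq      : (K,) occurrence probabilities of each fault type
    sev       : (K,) severity weights scaling the miss cost per fault type
    p_miss    : (K,) probability of missing each fault when it is present
    p_fa      : scalar false-alarm probability in the fault-free case
    conf_mat  : (K, K) P(classified as j | detected fault k); rows sum to 1
    c_miss, c_fa, c_misclass : unit costs of the three error types
    """
    freq = np.asarray(freq, dtype=float)
    sev = np.asarray(sev, dtype=float)
    p_miss = np.asarray(p_miss, dtype=float)
    conf = np.asarray(conf_mat, dtype=float)
    # Missed detections, weighted by fault frequency and severity.
    miss_cost = np.sum(freq * p_miss * sev * c_miss)
    # False alarms, weighted by the probability of the fault-free case.
    fa_cost = (1.0 - freq.sum()) * p_fa * c_fa
    # Misclassification of detected faults (off-diagonal confusion mass).
    p_wrong = 1.0 - np.diag(conf)
    mis_cost = np.sum(freq * (1.0 - p_miss) * p_wrong * c_misclass)
    return miss_cost + fa_cost + mis_cost

# Illustrative numbers (two fault types):
cost = fdi_expected_cost(
    freq=[0.01, 0.02], sev=[2.0, 1.0],
    p_miss=[0.10, 0.05], p_fa=0.01,
    conf_mat=[[0.9, 0.1], [0.2, 0.8]],
    c_miss=100.0, c_fa=10.0, c_misclass=20.0)
```

A lower score is better; comparing this score across candidate input-parameter sets, FDI methods, and fusion techniques is one way such a metric could drive the optimization process the abstract describes.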
