While time delays typically degrade control performance and can even cause instability, previous research has shown that introducing time delays into the control of a dynamic system can, in some cases, be beneficial. This paper presents a new benefit of time-delay control for single-input, single-output linear time-invariant systems: it can be used to improve robustness, as measured by increased stability margins. The proposed method uses time delays to approximate state-derivative feedback, which, together with state feedback, reduces sensitivity and improves robustness. No additional sensors are required, since the state derivatives are approximated using available measurements and time delays. The method is introduced with a scalar example and then applied in simulation to a single degree-of-freedom mechanical vibration control problem, demonstrating excellent performance with improved stability margins.
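The core idea of delay-based state-derivative approximation can be illustrated with a scalar system. The sketch below is my own minimal construction, not the paper's formulation: the gains `k`, `kd`, the plant pole `a`, and the delay `h` are all illustrative assumptions. It replaces the derivative in a feedback law u = -k x - kd ẋ with the backward difference (x(t) - x(t - h)) / h, which needs only delayed copies of the measured state.

```python
# Minimal sketch (illustrative values, not the paper's example):
# unstable scalar plant  xdot = a*x + u,  stabilized by
# u = -k*x - kd*xdot, with xdot approximated from a delayed
# measurement as (x(t) - x(t - h)) / h.

a = 1.0           # unstable open-loop pole (assumed)
k, kd = 3.0, 0.5  # state and state-derivative gains (assumed)
h = 0.01          # artificial time delay used for the approximation
dt = 0.001        # forward-Euler step; delay spans h/dt = 10 steps
steps_delay = round(h / dt)

x = 1.0                                # initial condition
history = [x] * (steps_delay + 1)      # buffer of past states (the "delay line")

for _ in range(5000):                  # simulate 5 seconds
    x_delayed = history[0]
    xdot_approx = (x - x_delayed) / h  # delay-based derivative estimate
    u = -k * x - kd * xdot_approx      # approximate state-derivative feedback
    x = x + dt * (a * x + u)           # forward-Euler integration of the plant
    history.pop(0)
    history.append(x)

print(abs(x))  # state decays toward zero despite the unstable open loop
```

For small h, the closed loop behaves approximately like ẋ = (a - k) / (1 + kd) · x, so the derivative term slows the response but, as the paper argues, can be traded for improved stability margins; no derivative sensor is needed, only a buffered past measurement.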
