The driveline components of engine cold-test cells undergo large torsional vibrations during transient tests as system resonances are excited by various engine harmonics. These excessive torsional vibrations not only compromise the structural reliability of the system but also hinder fault detection by preventing accurate measurement of gear noise and by degrading the quality of diagnostic torque waveforms. This paper presents a model of an engine cold-test cell and a methodology for quantifying signal distortion using proposed distortion metrics based on harmonic order amplitude ratios. The simulation model is rigorously validated by comparing experimental and simulated data in terms of both torque amplitude and waveform distortion. The model is then used to identify the driveline inertia and stiffness parameters that can reduce high torsional resonant amplitudes as well as waveform distortion. The resulting design modifications were implemented in a production test cell, where they controlled the torsional vibration and diagnostic signal degradation and correspondingly increased sensitivity to faults.
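The abstract describes the distortion metrics only at a high level (ratios of harmonic order amplitudes). As a rough illustration of how such a metric might be formed, the following Python sketch extracts engine-order amplitudes from a measured torque trace and ratios the non-firing-order content against the firing-order amplitude; the function names, the 0.5-order resolution, and the specific ratio form are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def order_spectrum(torque, fs, speed_rpm, max_order=12.0, order_step=0.5):
    """Estimate torque amplitude at each engine order from a steady-speed
    segment, using a Hann-windowed FFT (assumed analysis approach)."""
    n = len(torque)
    window = np.hanning(n)
    spectrum = np.fft.rfft(torque * window)
    # Single-sided amplitude, corrected for the window gain
    amplitude = 2.0 * np.abs(spectrum) / np.sum(window)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)

    shaft_hz = speed_rpm / 60.0  # shaft rotation frequency in Hz
    orders = np.arange(order_step, max_order + order_step, order_step)
    # Pick the spectral line closest to each order frequency
    idx = np.argmin(np.abs(freqs[None, :] - (orders * shaft_hz)[:, None]), axis=1)
    return orders, amplitude[idx]

def distortion_ratio(orders, amplitudes, firing_order):
    """Hypothetical distortion metric: RMS amplitude of all non-firing
    orders relative to the firing-order amplitude."""
    at_firing = np.isclose(orders, firing_order)
    firing_amp = amplitudes[at_firing].max()
    other = amplitudes[~at_firing]
    return np.sqrt(np.sum(other ** 2)) / firing_amp

# Example usage (values assumed): a 4-cylinder 4-stroke engine motored at
# 1500 rpm has its dominant excitation at the 2nd engine order.
# orders, amps = order_spectrum(torque_trace, fs=10_000, speed_rpm=1500)
# d = distortion_ratio(orders, amps, firing_order=2.0)
```

A lower ratio would indicate a cleaner diagnostic waveform, which is consistent with the paper's stated goal of linking driveline inertia and stiffness changes to reduced waveform distortion.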
