Engine cold-test cell drivelines experience large torsional oscillations in transient tests because engine harmonics excite system resonances. These excessive torsional vibrations cause structural degradation of the driveline components and also impair the fault detection process, both by preventing accurate measurement of gear noise and by compromising the quality of diagnostic torque waveforms. In this work, a torsional vibration model of a production engine cold-test cell is developed to analyze the vibration and diagnostic signal characteristics. The test cell driveline components are modeled from first principles. Waveform distortion metrics based on harmonic order amplitude ratios are devised to quantify signal distortion levels. The simulation model is rigorously validated by comparing both torque amplitudes and waveform distortion levels between test and simulation data. Model parameters that can help suppress the torsional resonances in the operating range are identified through embedded sensitivity functions. It is shown that modifying the inertia and stiffness properties of the rubber coupling mitigates the resonant vibration problem of interest. The design modifications are implemented in a production test cell, resulting in a significant reduction in torsional amplitudes and waveform distortion levels, with a corresponding increase in sensitivity to faults.
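The abstract does not give the exact definition of the harmonic-order distortion metric, but the general idea — decompose an angle-domain torque trace into engine-order components and compare off-firing-order amplitude to the firing-order amplitude — can be sketched as follows. The function names, the specific ratio used, and the synthetic signal are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def order_spectrum(torque, revs):
    """Engine-order amplitude spectrum of a torque trace sampled
    uniformly in crank angle over an integer number of revolutions.

    FFT bin k of an n-sample record spanning `revs` revolutions
    corresponds to engine order k / revs.
    """
    n = len(torque)
    amp = np.abs(np.fft.rfft(torque)) * 2.0 / n  # single-sided amplitudes
    amp[0] /= 2.0                                # DC term is not doubled
    orders = np.arange(len(amp)) / revs
    return orders, amp

def distortion_ratio(torque, revs, firing_order):
    """Hypothetical distortion metric: total harmonic amplitude away
    from the firing order, normalized by the firing-order amplitude.
    A clean diagnostic waveform dominated by the firing order gives a
    value near zero; resonance-driven distortion raises it.
    """
    _, amp = order_spectrum(torque, revs)
    fo_bin = int(round(firing_order * revs))
    fo_amp = amp[fo_bin]
    return (amp[1:].sum() - fo_amp) / fo_amp

# Synthetic example: 2 revolutions, 360 samples/rev, a dominant
# 2nd-order (4-cylinder firing) component plus a smaller 6th-order one.
theta = np.arange(720) / 360.0                   # crank angle in revolutions
torque = 100.0 * np.cos(2 * np.pi * 2 * theta) \
         + 10.0 * np.cos(2 * np.pi * 6 * theta)
print(distortion_ratio(torque, revs=2, firing_order=2))  # ~0.1
```

Because the record spans whole revolutions, each order falls exactly on an FFT bin and no windowing is needed; for arbitrary record lengths a window and interpolation between bins would be required.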
