The concept of entropy from information theory is used to investigate the sensitivity and stability of sampled-data systems subject to random perturbations. After a brief review of the definition, practical meaning, and main properties of entropy, its relation to asymptotic insensitivity is exhibited, and new results on the sensitivity and stochastic stability of linear and nonlinear multivariable sampled-data systems are derived. A new concept of stochastic conditional asymptotic stability is introduced, which appears directly applicable to the analysis of large-scale systems, and sufficient conditions for stability are stated. This approach offers a fresh perspective on stochastic stability. In addition, because variable transformations act additively on entropy through the Jacobian determinant, the corresponding calculus is very simple.
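The additivity mentioned above is the standard differential-entropy transformation rule: for an invertible map Y = g(X), h(Y) = h(X) + E[log |det J_g(X)|]. The following sketch (not from the paper; an illustrative example using a Gaussian vector and a linear map, where the rule reduces to h(AX) = h(X) + log|det A|) checks this numerically:

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (in nats) of a zero-mean Gaussian:
    h = (n/2) log(2*pi*e) + (1/2) log det(cov)."""
    n = cov.shape[0]
    return 0.5 * n * np.log(2 * np.pi * np.e) + 0.5 * np.log(np.linalg.det(cov))

# X ~ N(0, I) in R^2; the linear transform Y = A X gives Y ~ N(0, A A^T).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
h_x = gaussian_entropy(np.eye(2))
h_y = gaussian_entropy(A @ A.T)

# The entropy shifts additively by the log of the Jacobian determinant.
shift = np.log(abs(np.linalg.det(A)))
print(np.isclose(h_y - h_x, shift))  # True
```

Because the transformation enters only as an additive log-determinant term, entropy bookkeeping across changes of variables reduces to sums, which is the computational simplicity the abstract alludes to.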
