This article discusses various causes of engineering system failures and how such failures can be avoided. It is human nature, and economically attractive, to discount the low-probability, high-severity consequences in the design and development of complex systems. Discounting or ignoring the effects of worst-case scenarios, however, can lead to a culture of complacency that heightens risks. Failure due to poor development can be traced to a lack of organizational commitment to systems thinking; one example is a lack of communication between designers and end users. Systems designed without the user in mind, and without regard to human factors for safe operation and maintenance, can have disastrous results. Overly complex user interfaces in both hardware and software systems are common points of failure. Failures due to a lack of training should be considered not the fault of the individual operator, but a blunder in foresight by management. Improved training of end users has been shown to significantly reduce system failures and improve the integrity of systems.
Complexity and Consequence
Harry Armen, a past president and Honorary Member of ASME, retired as chief technologist for the eastern region of the integrated systems sector of the Northrop Grumman Corp.
Shannon Flumerfelt is an endowed professor of Lean and director of Lean Thinking for Schools at The Pawley Lean Institute at Oakland University in Rochester, Mich.
Gary P. Halada is an associate professor in materials science and engineering at Stony Brook University in New York.
Franz-Josef Kahlen is associate professor in the Department of Mechanical Engineering at the University of Cape Town, South Africa.
Armen, H., Flumerfelt, S., Halada, G. P., and Kahlen, F. (December 1, 2011). "Complexity and Consequence." ASME. Mechanical Engineering. December 2011; 133(12): 46–49. https://doi.org/10.1115/1.2011-DEC-6