This article reviews risk analysis, which involves the estimation or calculation of possible failures and then the possible consequences of those failures. Some tools involve the study of probable results of combined abnormal environments, such as fire with crush and puncture at a time when the item of concern is in a very cold environment. The record of past experience can be both a look back at a disaster and a look forward to managing risk. With a necessarily strict approach to risk assessment as it is applied to nuclear armament, the final matrix is that of surety: safety, reliability, security, and control of human factors and access. Engineering advances will always result from the dreams and aspirations of practitioners in the field. If no one attempts the unknown, the store of knowledge and experience will forever be capped at present levels.
When a switch is pulled and nothing happens, the company gets the blame. Then the company may turn around and blame the engineer.
But nothing really constructive happens until engineers start to analyze the causes of a failure. Their investigation isn't an exercise in proving blameworthiness. Failure analysis keeps accidents from repeating themselves, and often leads to advances in technology.
Two of the most widely publicized failure analyses were the investigations into the Columbia and Challenger Space Shuttle disasters. The Challenger investigation revealed that engineers at Thiokol Corp. clearly told their company management and NASA not to "fly the bird" in cold weather because the O-ring seals in the solid rocket booster joints could fail. As it traveled a long and complex path through upper management at both the company and the agency, the warning was diluted and then ultimately ignored, with an appalling result. Years later, the Columbia failed not because the analysis of the first shuttle loss was ignored, but rather because a new set of complex actions and reactions was underestimated.
Engineering failures, such as a water main rupture or power outage, may not cost lives or even cause serious bodily harm, but they all have costs—variously in losses of time, money, services, or sleep. Failure analysis is the reason that trucks can travel over graceful-looking bridges, that water comes regularly out of a kitchen tap, and that we are all safer on highways today than we were 30 years ago.
According to Henry Petroski, who is the A.S. Vesic Professor of Civil Engineering and also a professor of history at Duke University, the history of engineering is clearly one of success and failure. "The failures may be the more useful component of the mix," he said. He is a proponent of history as a vital element in the education of engineers because of the importance of looking back to see the development of a concept and how it grew.
Petroski came to Duke from Argonne National Laboratory, where he had worked on problems caused by cracks in pipes, pressure vessels, and other steel structures. He grew to understand the behavior of materials and results that occur when cracks widen, leading to benign leaks and occasional catastrophes. He continued the work at Duke, where he delved into a wide range of literature on failure and fracture mechanics.
"Signal successes in engineering have tended to arise not out of a steady and incremental accumulation of successful experience, but rather in reaction to the failures of the past—from the minor annoyances accompanying existing artifacts to the shock of realization that the state of the art was seriously wanting," Petroski said.
In 1878, a bridge was built crossing the Firth of Tay in Scotland. Constructed of 85 spans of wrought iron lattice girders, the bridge was designed for maximum water clearance with a minimum of railroad track elevation. The bridge failed when an express train traveled over it on Dec. 28, 1879.
A Court of Inquiry found that the design, construction, and maintenance of the bridge were poor. In addition, the force of wind in the area was not taken into consideration and the bridge failed during a gale. When the Firth of Forth bridge was designed, its builders took all of those considerations to heart and the bridge continues to serve the region as it has for more than a hundred years.
Petroski believes stories of good design processes are never obsolete. Rather, he sees them as valuable windows on the nature of engineering, for both good and bad: what has worked and what has failed.
John Andersen, an ASME Fellow, operates A Flight Tech in Edgewood, N.M. His firm's services include investigation into aircraft crashes.
The world's first failure analysis of a commercial passenger jet determined why the de Havilland Comet kept coming apart in flight. By pressurizing the fuselage in a tank of water, investigators found a fatal flaw in the rivet patterning around window openings. Their work led to a far deeper understanding of metal fatigue, caused by the stresses of repeated pressurization and depressurization of the cabin.
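The damage done by repeated pressurization cycles of the kind that doomed the Comet is often estimated with Miner's rule, which sums the fraction of fatigue life consumed at each stress level. The sketch below is illustrative only; every number in it is invented, not Comet data.

```python
# A minimal sketch of cumulative fatigue damage (Miner's rule).
# All quantities below are hypothetical, chosen only to show the arithmetic.

def miner_damage(cycles_applied, cycles_to_failure):
    """Sum n_i / N_i over each stress level; failure is predicted near 1.0."""
    return sum(n / N for n, N in zip(cycles_applied, cycles_to_failure))

# Hypothetical service history, one pressurization cycle per flight:
applied = [2000, 500]       # cycles flown at normal and at high cabin loads
endurance = [10000, 1500]   # cycles to crack initiation at each load level

damage = miner_damage(applied, endurance)
print(f"Accumulated damage fraction: {damage:.2f}")  # 0.53
```

The rule ignores load-sequence effects, but it captures the central insight of the Comet investigation: a fuselage that survives any single pressurization can still fail after enough of them.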
Andersen pointed out that the Boeing 707 trailed the Comet in development and benefited by coming later. It became the world's first successful passenger jet and, he said, "The plane is still in heavy use worldwide."
Occasionally, investigators can be faced with human reactions that are outside the parameters of the study but influence an investigation. Andersen was a part of the team that looked for answers after the crash of TWA Flight 800 in 1996, a widely witnessed event. A rumor aired publicly that the plane may have been sabotaged, shot down by a missile fired from the ground.
According to Andersen, the speculation greatly inhibited the investigation. He said the idea was promoted by an FBI official who refused to heed the advice of a phalanx of engineering experts, including the bureau's own. After thousands of hours studying eyewitness testimony and forensic evidence to reconstruct the event, investigators ruled out terrorist action and focused on a likelier cause: the explosion of the center fuel tank.
Art Dickerson, professor emeritus at California Polytechnic State University, recalls that he was asked to find the cause of an aircraft system failure of another sort. Military pilots flying B-50s in Korea were experiencing repeated false radar returns at a specific point in their flight path. The false alarms occurred over an area that included an ancient burial ground that contained the grave of a revered elder, whom the local people called "Papasan." Local rumor held that he was disturbed by the noise of aircraft.
The problem was so persistent that pilots had begun to wonder if the local legend had some validity. Dickerson quizzed them on the mission profile at the critical point and found it to be at the transition from a high-altitude cruise to a mid-altitude bombing run. As an engineer with an avid interest in meteorology, he knew the area was one of high humidity. He also realized that the sudden rise in temperature and humidity as the planes came closer to the ground had an effect on their cold equipment.
When the pilots were asked if they experienced fog inside the aircraft at the time the radar malfunctioned, the answer was, "almost always." Dickerson found that the radar was not pressurized and was subject to any condensation that formed inside the plane.
The designer had provided an oscilloscope photo in the manual, which showed an excessive pulse voltage between close-spaced connections on the master pulse generator. The pulse generator sets the timing of the energy pulse, whose echo provides distance measurement to the target.
The excessive pulses caused an electrical breakdown and the generation of a false signal. The solution to the problem was a dab of silicone grease on the vulnerable spot, to neutralize the effect of the moisture. Papasan, the elder, was able to go back to sleep.
John Andersen was, for many years, part of the nation's nuclear arms protection team at Sandia National Laboratories. In that line of work, failure analysis is not an acceptable option. Instead, investigators engage in risk analysis, in which they must anticipate hazards to prevent accidents rather than investigate accidents after they happen.
According to Andersen, "Risk analysis involves the estimate or calculation of possible failures and then the possible consequences of those failures."
He goes on to point out that some tools involve the study of probable results of combined abnormal environments, such as fire with crush and puncture at a time when the item of concern is in a very cold environment. The record of past experience can be both a look back at a disaster and a look forward to managing risk. With a necessarily strict approach to risk assessment as it is applied to nuclear armament, Andersen feels the final matrix is that of surety: safety, reliability, security, and control of human factors and access.
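Andersen's definition, estimating possible failures and then their consequences, amounts in its simplest form to an expected-consequence calculation: weight each postulated failure mode's probability by its cost and sum. The failure modes and figures below are invented for illustration, not drawn from any real assessment.

```python
# A toy risk calculation in the spirit of Andersen's definition.
# Every mode and number here is hypothetical.

failure_modes = {
    # mode: (annual probability, consequence in arbitrary cost units)
    "fire with crush and puncture": (1e-6, 5_000_000),
    "cold-environment seal failure": (1e-4, 200_000),
    "unauthorized access": (1e-5, 1_000_000),
}

def expected_risk(modes):
    """Sum probability x consequence across all postulated failure modes."""
    return sum(p * c for p, c in modes.values())

print(f"Expected annual risk: {expected_risk(failure_modes):.2f} units")  # 35.00
```

A real surety analysis is far more elaborate, treating combined abnormal environments and human factors rather than independent modes, but the structure is the same: enumerate what can fail, then weigh what each failure would cost.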
Aircraft manufacturers build the first example of a new aircraft to fly. The second is pushed to destruction with static and fatigue testing to see how much it can withstand. If those results are far beyond any loads expected in flight, the craft passes the tests. The number of tests done, and of aircraft destroyed, varies from plane to plane, depending on the aircraft type and on how far the design departs from its predecessors.
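The pass criterion in such a destructive static test can be sketched as a simple margin check. The 1.5 ultimate safety factor below is the factor commonly applied to transport-category limit loads; the load figures themselves are hypothetical.

```python
# A sketch of a static-test pass criterion: the airframe must survive
# the worst in-service (limit) load times an ultimate safety factor.
# Load values are invented; the 1.5 factor is the conventional one.

LIMIT_LOAD_KN = 900.0    # worst load expected in flight (hypothetical)
SAFETY_FACTOR = 1.5      # conventional ultimate factor on limit load

def passes_static_test(failure_load_kn, limit_load_kn, factor=SAFETY_FACTOR):
    """Pass if the structure broke only beyond limit load x safety factor."""
    return failure_load_kn >= limit_load_kn * factor

print(passes_static_test(1400.0, LIMIT_LOAD_KN))  # True: 1400 >= 1350
```

Breaking the test article well beyond the required 150 percent of limit load is what lets the fleet fly with confidence at or below 100 percent.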
Engineering advances will always result from the dreams and aspirations of practitioners in the field. If no one attempts the unknown, the store of knowledge and experience will forever be capped at present levels.
However, success and failure are two sides of a creative coin. Progress will always entail some risk, even when dealing with known elements, especially when they are applied to new systems and stresses. Until all the laws of physics and their interactions are fully written out and available to engineers, there will be a need to know what went wrong, and how to make the world better.