Human cognition, bias, and error have been studied extensively over the past few decades, and the findings are applied in several fields, including reliability and safety engineering. Research has indicated that both man-machine interfaces and training are critical during human intervention. It has also been shown that humans contribute significantly to failures, and thus to downtime, a trend likely to continue as systems become more complex. Several methods, such as Human Reliability Assessment (HRA) and Probabilistic Risk Assessment (PRA), have been proposed and are used throughout industry. These methods, both qualitative and quantitative, aim to understand, and thus improve, human performance within the system. Much of this research focuses on risk reduction, for example, designing a power plant to maximize redundancy in human performance during a mishap. Human error is a complicated process in itself, closely tied to cognition, information processing, system automation, team dynamics, and biases inherent to humans. It cannot be eliminated by training and familiarity alone; system design plays a major role in susceptibility to error. The digital age has spurred many advances in processing power, sensor technology, and data capture. These advances have created situations in which very large amounts of data can be captured and presented to the user. This information must be processed with limited attention resources, which can result in human error. This contribution discusses human error and information processing, along with the role of humans in modern power plants. Finally, trends in information overload are discussed, with applications to reducing human error in power plants.
