It’s All About Performance

In nuclear, we draw on lessons learned from many industries. When it comes to safety, the automobile industry is fertile ground for learning what works. In 1998, all new cars in the U.S. were required to have both shoulder-harness seat belts and front-collision air bag systems. The development of this requirement, and subsequent evolutions in the technology such as the addition of side-impact air bags, makes an excellent case study in the importance of human performance.

The reality is that people are fallible. We all make mistakes and suffer from cognitive biases that interfere with our ability to make good decisions.

Since Ralph Nader’s early work on car safety, we have known that driving is risky behavior, so we need to put safety measures in place. The seat belt was one of the most successful safety devices ever developed; however, people don’t like the feeling of being restrained. Enter the air bag: a hidden safety feature with a passive design that takes human decision-making out of the equation. So even if we are silly enough to think we are immune from the perils of traffic, the car will automatically protect us by deploying an air bag.

Human fallibility

Experts in human cognition accept that we are all fallible and that, given the chance to act, we often act in ways that put ourselves in peril. We mean well, but because of how the mind works, we are prone to shortcuts and simplifications that lessen the burden on the brain. The result is that we become very efficient, but in doing so, we set ourselves up for failure. We may opt out of wearing a seat belt at the very moment we most need it.

Human performance attempts to take advantage of these facts about human nature to design processes that help us avoid errors. Just as air bags introduce a passive system into automobiles that clearly saves lives, we introduce systems and practices to help workers make good decisions and prevent errors. Human performance professionals work with teams to understand their work, identify potential risks, and then develop ways to “add air bags” into systems to protect people from harm.

We will never be able to prevent all errors, so when error-likely conditions are identified, they are evaluated and prioritized. Those conditions with the highest likelihood of injury or harm are then analyzed, and safety barriers are put in place. The preventive measures don’t have to be elaborate. Sometimes it is simply a matter of a self-check (STAR: Stop, Think, Act, Review), a peer check, or coaching. Regardless, experts use the SAFER model to analyze and help prevent errors:

  • Summarize critical steps
  • Anticipate error-likely situations
  • Foresee consequences
  • Evaluate barriers/defenses
  • Review operating experience
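As a loose illustration only (not an actual industry tool), the prioritization described above can be sketched in a few lines of code: each error-likely condition gets a likelihood and a severity score, and the highest-risk conditions are analyzed and given barriers first. The condition names and the 1–5 scoring scale here are hypothetical.

```python
# Hypothetical sketch: ranking error-likely conditions so those with the
# highest likelihood of harm receive barriers (self-checks, peer checks,
# coaching) first. Names and scores are illustrative, not from any real program.

conditions = [
    {"name": "time pressure during valve line-up", "likelihood": 4, "severity": 5},
    {"name": "distraction in the control room",    "likelihood": 3, "severity": 4},
    {"name": "routine log entry",                  "likelihood": 2, "severity": 1},
]

# Simple risk score: likelihood x severity (both on an assumed 1-5 scale).
for c in conditions:
    c["risk"] = c["likelihood"] * c["severity"]

# Highest-risk conditions come first in the work queue.
prioritized = sorted(conditions, key=lambda c: c["risk"], reverse=True)
for c in prioritized:
    print(f'{c["risk"]:>2}  {c["name"]}')
```

The point of even a toy model like this is that prioritization is explicit and repeatable, rather than left to whoever happens to notice a hazard first.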

We also know, based on experience and research, that certain error traps are more likely (or are more fundamental to human nature). The biggest ones are time pressure, distractions, overconfidence/complacency, poor communication, and not being fit for duty. How many times have you driven recklessly because you were late, allowed distractions from passengers or your PDA to divert your focus from the road, or driven when you were not really in the right frame of mind or physical condition?

In the nuclear industry we work to cultivate an awareness that anyone can make a mistake, and when those mistakes happen we work to understand how they happened, learn from them and continue to practice error-prevention tactics. When we can find ways to add in passive systems, similar to adding air bags in cars, we do. But much of our work is skill-based, so we rely on the best decision-making of highly committed personnel who take seriously their role in the safe operation of our plants.