2.2.16

HABIT IS WHAT GETS YOU KILLED.

That's old-head advice to new railroaders.  There's a reminiscence by a retired Grand Trunk Western engineer in the latest Trains about a foggy day in the Detroit area: a crew, relying on radio messages from what they thought was the nearest train ahead of them, operated a little more aggressively than the yard-limits-stop-in-half-the-viewing-distance rules called for.  It must have been sheer luck or divine intervention that kept them from hitting the train that truly was nearest ahead of them, which belonged to another railroad and was communicating with the yardmaster on a different radio frequency.  Everybody went home intact, that day.

There's a social-psychology term of art, normalization of deviance, that applies to habits, particularly bad habits, getting people killed.  "In laypersons’ terms, it describes a situation in which an unacceptable practice has gone on for so long without a serious problem or disaster that this deviant practice actually becomes the accepted way of doing things."  In railroading, the most common such practice involves anticipating signal aspects, such as the approach signal ahead of your train clearing before your train gets there, or the tower operator lining up your train rather than some other train, because that's the way it's always worked before.  Until it doesn't and you're out of braking distance.

Both of the fatal space shuttle crashes also involved disregarding the signals.
As far back as 1979 (two years before the first shuttle launch and seven years before the Challenger exploded), engineers warned of concerns with the O-rings.

The Rogers Commission that investigated the Challenger explosion highlighted a history of concerns with the O-rings going back to 1979, and included a copy of a 1979 Morton Thiokol memo in which an engineer wrote that the O-ring rocket design was best used for unmanned space travel, as he was concerned about the rings' failure. Burn-through and the resulting erosion of the O-rings had been documented on several past flights. But in the absence of an explosion prior to the Challenger launch, NASA actually came to accept the failure of the O-rings because no disaster had occurred.

The same social-psychology phenomenon would rear its ugly head 17 years later at NASA. When a large piece of foam insulation struck the shuttle Columbia just after its 2003 launch, several NASA engineers expressed concern that a hole could have been opened in the shuttle's wing. NASA management dismissed the concern, noting that insulation had fallen off on multiple prior launches without harming the shuttle. One engineer pleaded with his superiors to have the orbiting shuttle photographed, because he feared the foam strike at launch had seriously damaged the wing. His warnings were ignored, no picture or thermal imaging of the Columbia was obtained during the flight, and the ship disintegrated on re-entry.
Had the engineer's advice been taken, NASA would have had an opportunity to attempt a rescue (although it would have involved the kind of hurried preparation more often seen in time of war); but because the habit was to launch despite foam strikes rather than rethink the insulation on the external tanks, the rescue shuttle would also have been at risk of a foam strike.

Sometimes a psychologist can do no better than channel the Grumpy Old Road Foreman.
The normalization of deviance is one of the most dangerous aspects of human nature when it comes to preventing disasters.

If an unexpected and undesirable event is taking place in your organization, investigate and understand it thoroughly.

The absence of a disaster doesn’t mean that one won’t occur. Perhaps you’ve merely “beaten the odds” up till now, but statistics will catch up with you eventually, and the result could be tragic. If you find yourself or an employee explaining away known risks by saying, “We’ve done it this way before without problems,” the organization may be succumbing to the normalization of deviance.
The Grumpy Old Road Foreman would say, "Read the Book of Rules and be governed accordingly."  Unfortunately, that was not the management culture at NASA.  You've got to have a Grumpy Old Road Foreman, and the Grumpy Old Road Foreman has to say no, and you've got to listen.
Research any tragedy or disaster, and you’ll almost always find that someone knew about the problem beforehand. From the lead in Flint’s water to the levee collapses in Katrina, from Challenger to the Titanic, it’s a rare calamity indeed that truly strikes without warning. Sometimes, these failures occur because our technological abilities have outstripped our understanding. Often, they occur because we fail to follow our own best practices.

The most sobering lesson of Challenger is that Challenger wasn’t unique. The managers and engineers who ultimately signed off on the launch weren’t trying to deliberately gamble with the lives of the seven astronauts who died that January morning. It would be more comforting if they had. It’s easier to declare people evil than to sit and grapple with how organizational culture can lead to such catastrophic failures.

We all cut corners. We all make compromises. We all skip our own best practices, whether that means a full eight hours of sleep every night, or sticking to a healthy diet. We all lie to ourselves in little ways, and because the majority of us are tiny fish in a very large pond, we don’t see much in the way of consequences.

The biggest lie we tell ourselves is that bigger fish than us automatically make better decisions than we do. Challenger, Columbia, and the hundreds of tragedies large and small that have played out in the intervening thirty years are proof they don’t.
Result: broken spacecraft and dead people.
