"human error wasn't responsible for these accidents. The problem [was] embedded within the technological systems." #readingToday
Perrow soon learned that similar mistakes had occurred during the operation of other nuclear power plants. At a reactor in Virginia, a worker cleaning the floor got his shirt caught on the handle of a circuit breaker on the wall. He pulled the shirt off it, tripped the circuit breaker, and shut down the reactor for four days. A lightbulb slipped out of the hand of a worker at a reactor in California. The bulb hit the control panel, caused a short circuit, turned off sensors, and made the temperature of the core change so rapidly that a meltdown could have occurred. After studying a wide range of "trivial events in nontrivial systems," Perrow concluded that human error wasn't responsible for these accidents. The real problem lay deeply embedded within the technological systems, and it was impossible to solve: "Our ability to organize does not match the inherent hazards of some of our organized activities." What appeared to be the rare exception, an anomaly, a one-in-a-million accident, was actually to be expected. It was normal.
Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety by Eric Schlosser