Engineers of complex technology should be studying the
The article traces the nuclear “safety myth” to
This mindset led to a rash of irrational, if very familiar, bad decisions. The (Japanese!) utility did not invest in radiation-capable robots. At
A misleading PR offensive is one thing, but to willfully neglect to amass safety technology and tools that will be needed in the event of an emergency strikes me as a new page in the history of engineering disasters.
The “Safety Myth”, like all myths, was accepted on pure faith. Therefore, any emergency contingency plan directly challenged that faith. Since the faith stood on a shaky foundation to begin with, everyone who clung to it was overly defensive about it. Any thought of the nuclear plants not being “absolutely safe” needed to be ignored at all costs.
I can relate. I used to feel the same way about my old Subaru Legacy, especially after she passed the 225,000 mile mark. What’s that grinding sound? What’s that odd smell? Nothing, absolutely nothing!
The problem is that the accident always hits eventually. With complex interactions between components that are tightly coupled (see Figure 9.1, Normal Accidents, Charles Perrow, or below), you will have an accident within the system. Given enough operating time, it is mathematically guaranteed to happen. The accident might not blow the plant to kingdom come, but there is no guarantee that it won’t either.
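The “mathematically guaranteed” part is just the arithmetic of repeated exposure. A rough sketch, with a made-up per-year incident probability purely for illustration:

```python
# Toy illustration (hypothetical numbers): even a tiny per-year
# probability of a serious incident compounds toward certainty
# over a long enough operating life.
def prob_at_least_one(p_per_year: float, years: int) -> float:
    """P(at least one incident) = 1 - (1 - p)^years."""
    return 1 - (1 - p_per_year) ** years

# With a 0.1% annual chance, the risk over the horizon grows from
# negligible to near-inevitable:
for years in (10, 40, 100, 1000):
    print(years, round(prob_at_least_one(0.001, years), 3))
```

The point is not the specific numbers (they are invented), but the shape of the curve: “absolutely safe” and “operated for decades” cannot both be true of any system with a nonzero failure rate.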
Engineers are familiar, or should be, with the idea that redundant safety measures don’t necessarily make the system safer. They add more layers of complexity and give something else an opportunity to break down, all while lulling the human operators into a false sense of security. The Japanese appear to have skipped this step. They created their false security without paying for the safety systems. Belief is a powerful way to circumvent the laws of physics, until it’s not.
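The redundancy trade-off can be made concrete with a toy model (all numbers and the independence assumption are hypothetical, just to show the mechanism): a backup layer catches primary failures, but the backup is itself a component whose added complexity can introduce a new failure mode of its own.

```python
# Hypothetical toy model, not a real reliability analysis.
# Assumes independent failures; real systems have common causes.
def risk_without_backup(p_primary: float) -> float:
    """Accident occurs iff the primary safety system fails."""
    return p_primary

def risk_with_backup(p_primary: float, p_backup: float,
                     p_new_hazard: float) -> float:
    """Accident occurs if both layers fail, OR if the backup's own
    added complexity causes a new failure mode (p_new_hazard)."""
    both_fail = p_primary * p_backup
    return 1 - (1 - both_fail) * (1 - p_new_hazard)

print(risk_without_backup(0.01))            # baseline risk
print(risk_with_backup(0.01, 0.1, 0.0005))  # backup helps: risk drops
print(risk_with_backup(0.01, 0.1, 0.02))    # backup hurts: new hazard dominates
```

Whether the extra layer pays off depends entirely on how much new failure surface it brings along, which is exactly the point Perrow makes about complexity.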