How Complex Systems Fail

Obviously, I'm fascinated by systems.  Bruce Schneier recently posted a link to a very interesting white paper by Richard Cook of the University of Chicago, whose research interests include "the study of human error, the role of technology in human expert performance, and patient safety."

Here's his list, which has some interesting parallels to my own book chapter on "System Design Principles" (you can watch a free video for that chapter here).

  1. Complex systems are intrinsically hazardous systems.
  2. Complex systems are heavily and successfully defended against failure.
  3. Catastrophe requires multiple failures – single point failures are not enough.
  4. Complex systems contain changing mixtures of failures latent within them.
  5. Complex systems run in degraded mode.
  6. Catastrophe is always just around the corner.
  7. Post-accident attribution to a ‘root cause’ is fundamentally wrong.
  8. Hindsight biases post-accident assessments of human performance.
  9. Human operators have dual roles: as producers & as defenders against failure.
  10. All practitioner actions are gambles.
  11. Actions at the sharp end resolve all ambiguity.
  12. Human practitioners are the adaptable element of complex systems.
  13. Human expertise in complex systems is constantly changing.
  14. Change introduces new forms of failure.
  15. Views of ‘cause’ limit the effectiveness of defenses against future events.
  16. Safety is a characteristic of systems and not of their components.
  17. People continuously create safety.
  18. Failure-free operations require experience with failure.

I've often felt that systems take on a life of their own once assembled, and I found his paper, which is only a few pages long, a fascinating read.  You can download it here.
