How Complex Systems Fail
https://how.complexsystems.fail/
How Complex Systems Fail (Being a Short Treatise on the Nature of Failure; How Failure is Evaluated; How Failure is Attributed to Proximate Cause; and the Resulting New Understanding of Patient Safety), Richard I. Cook, MD
- Complex systems are intrinsically hazardous systems.
- Complex systems are heavily and successfully defended against failure.
- Catastrophe requires multiple failures; single point failures are not enough.
- Complex systems contain changing mixtures of failures latent within them.
- Complex systems run in degraded mode.
- Catastrophe is always just around the corner.
- Post-accident attribution to a "root cause" is fundamentally wrong.
- Hindsight biases post-accident assessments of human performance.
- Human operators have dual roles: as producers & as defenders against failure.
- All practitioner actions are gambles.
- Actions at the sharp end resolve all ambiguity.
- Human practitioners are the adaptable element of complex systems.
- Human expertise in complex systems is constantly changing.
- Change introduces new forms of failure.
- Views of "cause" limit the effectiveness of defenses against future events.
- Safety is a characteristic of systems and not of their components.
- People continuously create safety.
- Failure free operations require experience with failure.