Swiss Cheese and Our Healthcare
The graphic of the Swiss Cheese Model (the attached version is from AHRQ) is a good one, and one that many will remember and relate to.
However, I would like to expand on that model: more commonly, there is not a singular or linear path to failure. Typically, multiple paths of failure converge at some point in time to cause an undesirable outcome.
The reality is that we are human and therefore prone to error. Recognizing that, we will have ‘Swiss cheese’ with holes, as opposed to solid ‘American cheese’ :-). No slice of cheese will be failsafe. The best we can do is:
1. reduce the number of holes in the cheese (the number of vulnerabilities),
2. minimize the diameter of the holes that remain (the magnitude of any consequence), and
3. ensure the holes that remain do not line up to provide a pathway to failure.
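The multiplicative payoff of these three strategies can be sketched with a toy Monte Carlo simulation (my own illustration, not part of the AHRQ model): treat each defensive layer as a slice of cheese that stops a hazard unless the hazard happens to pass through a hole. The function name and hole-alignment assumptions below are hypothetical simplifications.

```python
# Toy sketch of the Swiss Cheese Model: a hazard reaches the patient
# only when it finds a hole in every layer of defense.
import random

def failure_probability(num_layers, hole_fraction, trials=100_000, seed=1):
    """Estimate the chance a hazard penetrates all layers.

    hole_fraction is the share of each layer that is 'hole', i.e. the
    chance that layer fails to stop the hazard.  Holes are assumed
    independent across layers -- a simplification, since in reality
    common causes can line holes up far more often than chance.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        # The hazard gets through only if every layer's hole is hit.
        if all(rng.random() < hole_fraction for _ in range(num_layers)):
            failures += 1
    return failures / trials

# Shrinking holes or adding layers each cut risk multiplicatively:
print(failure_probability(3, 0.10))  # roughly 0.1**3 = 0.001
print(failure_probability(3, 0.05))  # smaller holes: roughly 8x lower
print(failure_probability(4, 0.10))  # extra layer: roughly 10x lower
```

The key caveat is the independence assumption: strategy 3 above (keeping holes from lining up) exists precisely because real holes are often correlated by a common cause, which this toy model deliberately ignores.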
Our ‘systems’ are intended to help accomplish this, but they are created by humans, so flaws exist. Ultimately we are responsible for our own decisions and actions, even when a well-intended decision leads to a bad outcome. I refer to this as ‘consequential thinking’. If we were all well-versed in the principles of proactive RCA, human factors engineering, reliability engineering, human performance and the like, we would be much more aware of the potential consequences of our decisions. When this occurs, the entire work environment is safer, because we recognize the potential flaws in our system and don’t hang our hat on the fact that we followed an inadequate, insufficient, or (in many cases) non-existent procedure to protect ourselves legally.

Too often, procedures are written just so we can say we had them in place and could pass a regulatory audit. Often they are only enforced when something goes wrong, and then we discipline people for not following them (hypocritical). When obsolete procedures are in place, the workforce knows it, but they do not have the time or energy to cut through the red tape to get them fixed. This is when ‘workarounds’ pop up. When we see a workaround, we should ask ourselves why one was needed in the first place.
Unfortunately, compliance does not necessarily equate to patient safety.
To me, the solution is educating all our people in the human decision-making process and its contribution to failure. We should teach people to proact (identify unacceptable risk) instead of react (respond to consequences), and support that behavior instead of demonizing it with the paradigm that ‘failure is inevitable; the best we can do is respond faster’. What patient wants to hear that?
Secondly, doctors should willingly and actively participate in in-depth RCAs. True RCA is all about understanding the human decision: why did the person making the bad decision think it was the right decision at the time? When we accurately answer that question, we are doing RCA. If we indiscriminately blame people for bad decisions and levy discipline, we effectively cannot do RCA (we will never learn why they made the decision they did). Without the doctor’s input as to their reasoning, we cannot understand the resulting sequence of consequences. Oftentimes doctors see themselves as above the activity of an RCA and elect not to participate. I will add a caveat in defense of some doctors: what some organizations call RCA is a farce, merely an attempt to go through the motions to pass regulatory scrutiny. If doctors have participated on such RCA teams, I don’t blame them for not returning.
The attached image is an example of this. ‘Poor Communication’ is not an adequate root cause because it is not actionable. We need to drill further down to understand what about the communication was poor, and why people were not following the systems in place designed to aid proper communication (were poor systems followed, or good systems not followed?).
Stepping off my soapbox:-).