Human Errors May Be Unpreventable, but Preventing Harm Is an Option
“You can’t cross the sea merely by standing and staring at the water,” said Rabindranath Tagore, the Nobel laureate in literature. Watching harm happen without acting to prevent it is exactly the standing and staring that Tagore warned against.
The quotation is a reminder that people achieve nothing unless they take purposeful action with measurable results. Acting on potential human errors so that harm to system users is prevented is a highly productive goal because its impact is so significant.
This article mostly discusses safety in hospitals, because everyone is familiar with safety issues in healthcare. The principles apply equally to any process, including design, manufacturing, construction, and aerospace (the Boeing 737 MAX is an example).
Principles of Human Factors Engineering (HFE)
Some errors can be prevented with safeguards, barriers, and forcing functions; most are unpreventable without such barriers in place. At Tripler Army Medical Center, a newborn went into a coma with severe brain damage after medical personnel mistakenly gave him carbon dioxide instead of oxygen immediately after birth. Sources said the operating room may have been set up incorrectly [1]. This incident illustrates the vulnerability of humans as well as of systems. Human factors engineering can prevent such incidents, or at least minimize their impact.
Human factors engineering (HFE) focuses on how people interact with tasks, technology, and the environment, with the understanding that humans have both capabilities and limitations. It evaluates human-to-human, human-to-group, human-to-organization, and human-to-machine (computer) interactions to better understand them and to develop a framework for evaluation [2]. In practice, HFE goes beyond this definition: after the evaluation, it attempts to mitigate the mishaps it has uncovered.
Even the most knowledgeable and experienced, such as the doctors who diagnose illnesses, make their share of mistakes. Fortunately, as the quality guru Deming suggested, about 85% of the time it is the system, not the person, that is improperly designed [3]. Improving systems should therefore be the focus of harm prevention. But first, we must understand the sources of human errors.
The major sources of human errors are [4]:
- Errors of substitution, such as turning on the hot water instead of the cold during a shower.
- Errors of selection, such as selecting carbon dioxide for a patient instead of oxygen, as in the incident at the beginning of this article.
- Errors of reading, such as a nurse reading 1.0 mg as 10 mg (the sketch after this list shows how a system can be designed to trap this).
- Errors of oversight and omission, such as a nurse simply forgetting to give an antibiotic after surgery.
- Errors of irritation; a caregiver may perform a task incorrectly when irritated by too many alarms and interruptions.
- Errors of warning, which arise when warning signs are unclear or their instructions have too many steps.
- Errors of alertness; the dangers of residents working multiple shifts are obvious.
- Errors of interchangeability, such as connecting an oxygen hose to the nitrous oxide source on anesthesia equipment because the fittings on both sources are the same.
- Errors of lack of understanding; improperly trained staff are likely to make mistakes during emergencies.
- Errors of haste; a caregiver unable to perform tasks in the allocated time is likely to skip seemingly minor steps such as hand sanitization before surgery, or a rushed surgeon may leave a sponge inside a patient.
- Errors of sequencing; a medical technician may not perform the work in the sequence of a checklist and may overlook an activity.
- Errors of overconfidence; these happen in diagnosis when a physician sees a very familiar symptom. Such an incident was the motivation for the television show Miami Medical: the producer Jeffrey Lieber’s wife went to an emergency room with flu symptoms, was told she had the flu, and was sent home, where she soon went into a coma. Luckily, her mother was there, managed to get her into the car, and took her back to the emergency room. She survived, thanks to the trauma unit.
- Errors of reversal; a caregiver may increase a heart rate instead of decreasing it, not knowing whether to turn the control clockwise or counterclockwise.
- Errors of unintentional activation; a caregiver may inadvertently flip a life-support switch to OFF instead of ON.
- Errors of physical limitation; a short person unable to reach an object at an abnormal height may cause an accident after climbing on a chair.
- Errors of casual behavior; a caregiver may not take a task seriously. A surgeon may not repeat the scrub process after being touched by another person, or may not mentally prepare a patient for surgery.
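Many of these errors can be designed out of the system rather than trained out of the people. As a minimal sketch of how software could trap the “errors of reading” above (a hypothetical illustration, not a method from Ref [4]), the following Python code checks a typed dose against two widely taught safe-notation rules: no trailing zero (1.0 mg is easily misread as 10 mg) and no naked decimal point (.5 mg is easily misread as 5 mg).

```python
import re

def validate_dose_text(dose: str) -> list[str]:
    """Check a typed dose against safe-notation rules that prevent
    common misreads (e.g., "1.0 mg" read as "10 mg")."""
    problems = []
    # Rule 1: no trailing zero after a decimal point ("1.0" reads as "10").
    if re.search(r"\d\.\d*0\b", dose):
        problems.append("trailing zero: write '1 mg', not '1.0 mg'")
    # Rule 2: no naked decimal point (".5" reads as "5").
    if re.search(r"(?<!\d)\.\d", dose):
        problems.append("naked decimal: write '0.5 mg', not '.5 mg'")
    return problems

# A forcing function would refuse to accept the order until fixed.
for entry in ["1.0 mg", ".5 mg", "0.5 mg", "10 mg"]:
    issues = validate_dose_text(entry)
    print(entry, "->", issues if issues else "OK")
```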
Some Techniques for Harm Prevention
Reference [5] gives several methods, such as:
- Crew Resource Management (CRM)
- Management Oversight and Risk Tree (MORT)
- Swiss Cheese Model for error trapping (illustrated in the sketch after this list)
- Change analysis
- Mistake proofing
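The Swiss Cheese Model pictures each defense as a slice of cheese whose holes are its weaknesses: harm reaches the patient only when the holes in every slice line up. The minimal sketch below, with invented miss probabilities purely for illustration, shows why stacking imperfect but independent barriers drives residual risk down without ever reaching zero.

```python
# Swiss Cheese Model: harm reaches the patient only if EVERY layer fails.
# The miss probabilities below are invented, purely for illustration.
layers = {
    "order-entry check": 0.10,      # chance this layer misses the error
    "pharmacist review": 0.05,
    "barcode scan at bedside": 0.02,
    "nurse double-check": 0.20,
}

residual = 1.0
for p_miss in layers.values():
    residual *= p_miss

print(f"Chance an error slips through all layers: {residual:.6f}")
# 0.10 * 0.05 * 0.02 * 0.20 = 0.000020, about 1 in 50,000:
# far better than any single layer, but never zero, and valid only
# if the layers' weaknesses (the holes) are truly independent.
```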
We will cover the first method, Crew Resource Management (CRM), which has been widely used in hospitals.
This technique came from the commercial aviation industry, where it was developed to overcome poor communication, a leading cause of adverse events. It has since found a home in health care because it quickly improves infection rates and other key patient safety measures by introducing checklists and other safety tools. Hospitals have found that a simple checklist, which team members use to check one another so that no critical step is overlooked, makes CRM an easy-to-use solution. The World Health Organization uses a Surgical Safety Checklist to save millions of lives; a minimal sketch of the idea follows.
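Here is a hypothetical sketch of a checklist used as a forcing function. The item names only loosely echo the WHO Surgical Safety Checklist, but the sketch shows the key property: the team cannot proceed until every item has been confirmed and logged.

```python
from dataclasses import dataclass, field

@dataclass
class Checklist:
    """A checklist that blocks progress until every item is confirmed."""
    name: str
    items: list[str]
    confirmed: set[str] = field(default_factory=set)

    def confirm(self, item: str, by: str) -> None:
        if item not in self.items:
            raise ValueError(f"unknown checklist item: {item!r}")
        self.confirmed.add(item)
        print(f"[{self.name}] {item} - confirmed by {by}")

    def ready(self) -> bool:
        missing = [i for i in self.items if i not in self.confirmed]
        if missing:
            print(f"[{self.name}] NOT READY - missing: {missing}")
        return not missing

# Item names only loosely echo the WHO Surgical Safety Checklist.
sign_in = Checklist("Sign In", [
    "patient identity confirmed",
    "surgical site marked",
    "anesthesia safety check complete",
])
sign_in.confirm("patient identity confirmed", by="nurse")
sign_in.confirm("surgical site marked", by="surgeon")
print("Proceed to anesthesia?", sign_in.ready())  # False: one item missing
```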
David Marshall, the CEO of Safer Healthcare and author of the book Crew Resource Management, defines CRM [5] as “a flexible, systemic method for optimizing human performance in general, and increasing safety in particular, by (1) recognizing the inherent human factors that cause errors and the reluctance to report them, (2) recognizing that in complex, high-risk endeavors, teams rather than individuals are the most effective fundamental operating units, and (3) cultivating and instilling customized, sustainable and team-based tools and practices that effectively use all available resources to reduce the adverse impacts of those human factors.” The Joint Commission has continuously monitored the path of CRM and has consistently reinforced its support, according to Marshall. He adds: “No matter how educated or careful healthcare professionals are, errors will occur. So, the natural question to ask is: How do we prevent those errors from ever impacting a patient?”
The essentials of CRM, according to Marshall, are to provide teams with a concrete set of skills, with the following goals:
- Team Building: How to conduct a briefing in 30–60 seconds with an overview of what is about to happen.
- Team Debriefing: Capture what went well, what did not go well, and how the team can improve next time.
- Assertiveness: How to speak up, and when, if anyone sees a problem.
- Situational Awareness: Being aware of what is going on and what is about to happen, and identifying red flags. If a team member falters, the team picks up the slack and brings it to the attention of the stakeholders.
- Critical Language: Using a few buzzwords the entire team knows; when one is spoken, the whole team stops and pauses.
- Decision Making: How to work together and make effective decisions.
The CRM methodology is aviation’s biggest gift to health care. It has worked beyond anyone’s expectations, bringing central-line-associated bloodstream infections down to nearly zero in many hospitals and saving millions of dollars and hundreds of lives. As a result, health care is creating its own equivalent of the Commercial Aviation Safety Team (CAST), a public/private partnership intended to reduce hazards throughout aviation. At the time of this writing, Dr. Peter Pronovost and his colleagues are exploring a health care version of CAST with an ad hoc group whose stakeholders include AHRQ, the FDA, the Joint Commission, ECRI Institute, and over 15 large health systems. They call this approach the Public-Private Partnership to Promote Patient Safety (P5S) [6].
References
[1] Gosbee, J., 2004. “Human Factors Engineering and Patient Safety,” VA National Center for Patient Safety, slide presentation at the Annual Conference of the Michigan Health & Safety Coalition.
[2] “Human factors engineering,” definition from Wikipedia.
[3] Evans, J., and Lindsay, W., 2008. Managing for Quality and Performance Excellence, 7th ed., Thomson South-Western.
[4] Raheja, D., 2011. Safer Hospital Care, CRC Press, Boca Raton, FL.
[5] Marshall, D., 2009. Crew Resource Management: From Patient Safety to High Reliability, Safer Healthcare Partners, LLC, Denver, CO.
[6] Pronovost, P., et al., 2009. “Reducing Health Care Hazards: Lessons from the Commercial Aviation Safety Team,” Health Affairs, April 7.