CASE 2021-2: Systems Problem
A patient requiring an urgent caesarean section arrived in the OR at a time when one of the narcotic dispensing drawers could not be closed. This disabled the entire cart, a combined automated medication dispensing system and equipment storage device. The manufacturer had recommended attaching emergency medication boxes to the outside of these machines in case of cart failure, but the pharmacy department had decided to remove those boxes as a cost-savings measure. We activated the emergency procedure, which was to call pharmacy to bring up keys to override the system and unlock the drawer. However, the keys unlocked only part of the cart, not the medication drawers. Overriding the drawer locks required additional steps with which the pharmacy staff were unfamiliar, and it took additional time to properly unlock the medication drawers. While this was occurring, the patient's status became more unstable. The difficulty with unlocking the drawers delayed the start of the C-section, and the infant was admitted to the NICU for several days.
What can we learn from this?
It is currently in vogue in health care to say that we analyze accidents with a systems approach: we no longer blame the people involved, but instead look at the design of the system. But how exactly do we do that? It helps to develop a systematic way of understanding accidents grounded in systems theory.
Systems theory assumes that the systems we work in are composed of hierarchical layers, each of which controls and constrains the layers below it (Engineering a Safer World: Systems Thinking Applied to Safety. 2012). When these constraints are inadequate, the system is unsafe and accidents occur. This differs from models such as James Reason's Swiss cheese model, probabilistic risk analysis, fault tree analysis, and failure modes and effects analysis, which assume that accidents occur in a clear linear chain of events and that each actor in the system is fully independent. These assumptions break down in today's complex systems, where interactions are nonlinear, cause and effect are not always clear, and factors such as production pressure erode all layers of defense. Because systems theory does not assume linear causality, it can help us better understand these complex systems.

According to systems theory, each layer in the system is a controller that acts on the process beneath it by issuing control actions. A simple example is blood pressure control in the OR (Figure). The patient's blood pressure is low, as shown to the anesthesiologist by the arterial line. This feedback updates the anesthesiologist's process model, their understanding of the state of the system being controlled. Based on this information, the controller (the anesthesiologist) issues a control action (e.g., administration of I.V. phenylephrine). The phenylephrine and the I.V. line form the actuator of the control loop. Finally, the control action reaches the patient, closing the loop. To understand an accident, we need to know what feedback each controller had about the process it was controlling, what actions it took to control that process, and how those actions reached the controlled process. Let's look at some of the controllers in the system described in this case.
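For readers who think in code, the sense-decide-actuate loop described above can be sketched as a short program. This is an illustrative model only: the names, the target pressure, and the dose-response numbers are all hypothetical and simplified, not clinical guidance or anything drawn from the case report.

```python
# Illustrative sketch of the systems-theory control loop described above.
# All names and dose-response numbers are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Patient:
    """The controlled process: mean arterial pressure (MAP) in mmHg."""
    map_mmhg: float

    def apply_phenylephrine(self, dose_mcg: float) -> None:
        # Simplistic, made-up dose response: 0.2 mmHg rise per mcg given.
        self.map_mmhg += 0.2 * dose_mcg

def arterial_line(patient: Patient) -> float:
    """Sensor: feedback from the process back to the controller."""
    return patient.map_mmhg

def anesthesiologist(observed_map: float, target_map: float) -> float:
    """Controller: compares its process model (the observed MAP) to the goal
    and issues a control action (a phenylephrine dose in mcg), or none."""
    if observed_map >= target_map:
        return 0.0
    return min(100.0, (target_map - observed_map) * 5.0)

# Run the loop: sense -> decide -> actuate, repeated until the goal is met.
patient = Patient(map_mmhg=55.0)
for _ in range(10):
    observed = arterial_line(patient)          # feedback
    dose = anesthesiologist(observed, 65.0)    # control action
    if dose == 0.0:
        break
    patient.apply_phenylephrine(dose)          # actuator delivers the action

print(round(patient.map_mmhg, 1))
```

The point of the sketch is structural, not clinical: an accident investigation asks where this loop broke, in the sensor (missing feedback), the controller's process model (wrong understanding), the control algorithm (wrong decision procedure), or the actuator (the action never reached the process).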
Specifically, let's consider three key controllers in this system: the anesthesiologist, the medical device manufacturer, and the pharmacist.
The anesthesiologist's control action was to induce anesthesia so the C-section could begin. They were unable to provide anesthesia because the malfunctioning medication cart left them without access to medications. This is an actuator failure. They may also have had an incomplete mental model, believing the cart was the only place they could get medication when in reality another cart, or a neighboring OR, might have been available. If that were the case, this information needs to be shared more widely with the anesthesiologists in the practice.
It is easy to stop here and say this was an actuator failure, a problem with the cart. But that may lay the blame on the medical device company without truly understanding the system problems. The medical device company's control action is to design a safe medication cart, and to understand the accident we need to understand the context of that design decision. What was their process model, their understanding of the medication cart design? Design decisions are made in an environment of regulatory pressure, such as government regulations on the control of scheduled medications, with tradeoffs between easy access to emergency medications and control of scheduled substances. There are also financial pressures from the health systems buying the product, and we need to understand the feedback the company receives from users about its cart design. All of these factors weigh into the final design of the product.
A third controller in this case was the pharmacist. The pharmacist's control action was to access the locked medication cart in an emergency, which did not happen successfully in this case. There is a constant tension in safety between work as imagined and work as it actually exists (Int J Qual Health Care 2015;27:418-20). The policy as imagined was that the pharmacist could override the cart in an emergency; in reality, a difficult design made that challenging. We do not know from this report, but it is possible that the pharmacist was new or lacked training on this cart. Moreover, cart malfunctions are uncommon, so even an experienced pharmacist might have struggled with this task in a time-pressured setting. These are all examples of problems with a controller's control algorithm, the process the controller uses to carry out a control action.
This systems theory approach to understanding how the accident occurred yields a rich collection of targets for change to prevent it from recurring. In this case, possible changes span a range of roles, from frontline workers up to the highest levels of government regulation.

At the front line, one option is to reinstate the emergency medication box in each OR, as had been implemented previously, allowing the anesthesiologist rapid access to medications in an emergency. For the pharmacist controller, a cognitive aid could be created so that everyone can easily and quickly access the locked cart; cognitive aids have been shown to increase adherence to protocols in crisis settings compared with relying solely on memory (N Engl J Med 2013;368:246-53; Anesthesiology 2017;127:384-92). Alternatively, an emergency box containing all the medications required for a stat general anesthetic could be assembled, stored centrally in the pharmacy, and brought to the OR by the pharmacist in a similar event, avoiding the need to access the locked drawers. This is a quick, easy fix that can be implemented at the local level while broader system changes are explored.

Higher in the system, hospital management might purchase different carts or press the device manufacturer to change the cart design. Feedback about the incident should be given to the device manufacturer and to governmental regulators to strengthen those controllers' mental models as they make future design and regulatory decisions. Regulators might also consider changes that allow more emergent access to controlled medications in the perioperative setting; the risks of making controlled medications more available must be balanced against the need to access them in emergencies. Different regulations might prevent this type of accident across multiple health systems in the future.
This incident did not just occur because of the design of the cart or the role of the pharmacist or anesthesiologist, so changes should not be limited to the front line of the system.
Overall, seeing the incident as a systems problem allows us to understand the motivations of a wide variety of actors in the incident and describe how they interact through their control actions and their feedback. Understanding what drove the behaviors of each layer of the system allows us to explore multiple areas of change to prevent this accident from occurring again in the future. The summary above is not a complete analysis; for example, we do not have information on the role of hospital management or the user interface of the cart. But taking this approach can help drive accident investigations toward a more complete understanding of your systems and the levers you have available to implement change, not just at the front lines of patient care, but throughout the larger system.
Special thanks to Aubrey Samost-Williams, MD, for her contributions to this article.
Review of unusual patient care experiences is a cornerstone of medical education. Each month, the AQI-AIRS Steering Committee abstracts a patient history submitted to the Anesthesia Incident Reporting System (AIRS) and authors a discussion of the safety and human factors challenges involved. Real-life case histories often include multiple clinical decisions, only some of which can be discussed in the space available. Absence of commentary should not be construed as agreement with the clinical decisions described. Feedback regarding this article can be sent by email to firstname.lastname@example.org. Report incidents or download the AIRS mobile app at www.aqiairs.org.