“[P]atient safety focuses on understanding avoidable threats to patients due to individual and systems failures and then creating or improving systems that will respond resiliently to nonroutine operating conditions.”
HEALTH care in general, and anesthesia in particular, continues to strive to achieve higher levels of patient safety despite increasing pressures to improve throughput while decreasing costs. In this article, we review several approaches to improve safety that have proven to be helpful in other challenging fields of human endeavor.
General Principles for Advancing Healthcare Safety and Quality
Patient safety refers to “the avoidance, prevention, and amelioration of adverse outcomes or injuries stemming from the processes of health care.”* This definition of safety is analogous to operational safety in other high-risk industries.1 Patient safety is, however, distinct from healthcare worker safety (such as preventing needle sticks and back injuries). Nevertheless, there are increasing calls for a greater emphasis on occupational safety in healthcare organizations (which have some of the highest rates of worker injuries of any U.S. industry) as a critical first step in establishing a more patient-centric organizational culture, in which valued employees are more likely to treat patients with dignity and respect.
Although they are complementary, healthcare quality is not the same as patient safety. For example, quality initiatives are typically focused on improving reliability and/or efficiency, decreasing variability, and achieving consistent outcomes during routine operating conditions. In contrast, patient safety focuses on understanding avoidable threats to patients due to individual and systems failures and then creating or improving systems that will respond resiliently to nonroutine operating conditions.
There is an inexorable asymmetry in the information available about routine work and productivity versus that available about safety.2 Information about quality issues and throughput is often plentiful and easy to measure and interpret. Information about “safety” is scarce (even near-misses are uncommon), hard to measure, and often ambiguous. How does one measure an accident that did not occur? Thus, although both quality and safety improvement make use of organizational learning from the study of deviations from desired processes and outcomes, it is inherently far more difficult to do this to improve safety. Perhaps for this fundamental reason, there can be a disconnect between those working on quality and those working on safety. Although there are many overlaps, the two groups often use different conceptual models, tools, and approaches, and they can sometimes have competing goals.3
Systems View of Safety
Many adverse perioperative events are due to systemic factors over which individual clinicians have little control, such as dysfunctional organizational structure, faulty institutional communication pathways, or poorly designed technology. Some suggest that discontinuities (or “gaps”) in care processes are the major cause of most safety and quality deficiencies. This “gap theory” is a refinement of long-held concepts about the etiology of complex system failures espoused by Reason, Rasmussen, Perrow, and others.2,4,5 Clinicians on the front lines of care delivery end up being responsible for bridging these gaps to attain safe, high-quality care. Thus, in contrast to conventional views, anesthesiologists and other clinicians actually create safety while working in inherently unsafe systems. In doing so, however, the workarounds they create for individual patients or situations can mask the embedded systems problems and leave them unsolved.6 Adverse events, then, are a consequence of these embedded system failure modes, which create unsafe conditions but go undetected until an unfortunate sequence of events leads to patient harm. If this is correct, the still too common practice of blaming clinicians who make errors in the course of trying to do their best in a dysfunctional system is unlikely to reduce the occurrence of future adverse events. Moreover, such a culture is likely to be counterproductive for organizational morale, teamwork, and willingness to report future events.
Focus on Front-line Work
A key focus of virtually all safety (and quality) initiatives should be on the people who actually do the front-line work. These workers, and their immediate supervisors, must “own” meaningful interventions. However, these individuals cannot be expected to be successful without the support of higher managers and executives. The best approach to solving systems problems may be to empower and properly resource an interdisciplinary team of knowledgeable front-line clinicians and appropriate experts (e.g., human-factors engineers or psychologists, industrial engineers, informaticians, quality/safety experts, and educators). In their editorial in this series, Pronovost et al.7 described ways in which such interdisciplinary teams can make a difference in healthcare quality and safety.
A critical aspect of harnessing the knowledge of front-line workers is the ability of executive leadership to receive and make sense of information about both operational success and failure. This requires a culture that encourages the reporting of adverse events and near-misses by those on the front line as well as an environment and processes that encourage honest appraisal of these reports all the way up the chain of command. In practice, this means adverse event reporting systems that: (1) make it easy to report, (2) do not punish the reporter or those “responsible,” (3) respond rapidly to important events, and (4) provide feedback to the front line showing that leadership takes their input seriously and wants to improve. Such systems are still all too rare in most healthcare facilities, although recently created national-level reporting systems (e.g., the Anesthesia Incident Reporting System of the Anesthesia Quality Institute) and registries (e.g., WakeUpSafe—the perioperative event registry of the Society for Pediatric Anesthesia) are encouraging developments.
Culture and Leadership
The management science supporting a critical role for organizational culture in operational quality, reliability, and safety is compelling. Unfortunately, few U.S. healthcare organizations have figured out how to create a sustained culture of quality and safety. Culture change starts with organizational leadership setting priorities and demonstrating them through visible action. Safety needs to be an organizational imperative equal to delivering quality care and meeting financial targets. Leaders must articulate that safety is not just a byproduct of everyone doing their job well; it can only be attained if processes and systems are specifically designed to achieve it. Moreover, this clear vision and these values must be disseminated throughout the organization, not just through “lip service” but modeled by leadership and reinforced through its decisions. Leaders must be held accountable for creating mechanisms that achieve safety while maintaining quality, patient-centered care with high throughput.
The Role of Human Factors Engineering in Patient Safety
We expect new therapies to be documented as cost-effective in controlled studies, but they must then be disseminated so that they are reliably used by diverse users in myriad settings. Many healthcare interventions do not attain their expected benefits because of deficient “human factors”—attributes that diminish the ability of humans to perform the necessary steps to succeed consistently in the complexity of real-world settings.8 For example, when new interventions change the normal or expected clinical workflow, this can impose higher workload and competing priorities, encourage workarounds, and thus yield unintended consequences. Human Factors Engineering (HFE) is the application of knowledge about human characteristics, capabilities (physical, emotional, and intellectual), and limitations to the design and implementation of tools, devices, processes, and systems.9 Table 1 provides examples of how HFE principles and methods have been applied to anesthesia patient–safety issues.
We prefer the term HFE to simply Human Factors, even though nonengineers (e.g., psychologists, architects, cognitive scientists, informaticians, and appropriately schooled clinicians) can also skillfully engage in the discipline. This is because, when the term “human factors” is used, it is still all too common to hear clinical operational leaders and managers agree wholeheartedly about the importance of applying human factors because, “if we can only figure out how to get the humans (i.e., front-line clinicians) to stop making errors, violating policies, using the computer systems improperly, etc., then our system would be much safer and more reliable.”
But HFE embodies precisely the opposite view. A core philosophy of HFE is that processes and technology should be designed and implemented to fit the “real-world” needs of users, rather than the needs “imagined” by those far removed from them. This view is supported by the 2009 report of the National Research Council, Computational Technology for Effective Health Care. Optimal technology implementation requires (1) understanding the complex interacting factors affecting care, (2) measuring critical process and outcome variables, (3) designing robust, locally focused interventions, (4) measuring progress, and (5) repeating these steps on multiple levels.
For example, clinicians often complain that healthcare processes, policies, and devices are designed without adequately understanding the needs, demands, and realities of the real-world clinical setting. User-centered design (fig. 1) strives to thoroughly account for these contextual imperatives by incorporating structured end-user input into all stages of technology design and deployment.10,11
User-centered design allows iterative design, prototyping, and evaluation of a solution at all levels of the individual process, the device, and the work unit. If appropriate, these design–evaluation–redesign cycles can be performed in simulations (computer-based modeling and/or clinical simulations of increasing fidelity) under conditions of plausible failure as well as under routine conditions. Once a solution is refined, it can be tested in small pilot trials in a particular care unit with careful evaluation. Evaluations should consider the contextual factors related to success or failure and seek generalizability across different units, conditions of use, and user populations. Formal assessment should continue into the postdeployment phase. The user-centered design approach can be scaled according to the magnitude and type of problem as well as to the time and resources available. Experience shows that although such techniques may impose additional up-front costs, this is more than made up for by decreased: (1) difficulties and expense of initial implementation, (2) time to full adoption, (3) number of workarounds required, (4) rework or other mitigations for design failures, and (5) harm from use errors. Anyone familiar with quality-improvement methods (Six Sigma, Lean, Kaizen, etc.) will recognize the similarity of user-centered design to iterative process-improvement methods such as Plan-Do-Study-Act (“PDSA”) and Define, Measure, Analyze, Improve, and Control (“DMAIC”).
Using the panoply of HFE and systems redesign approaches, every perioperative safety infrastructure should seek to be: (1) highly responsive to solving (not just working around) local operational safety problems, (2) viewed as a leader in and facilitator of improved safety across the institution, (3) an accessible resource for all organizational stakeholders, and (4) able to bring to bear local and external resources to solve the most important perioperative safety problems.
The Role of Teamwork in Patient Safety
Every complex human endeavor requires establishing teamwork, enriching and sustaining it over time, and refining approaches to it based on organizational outcomes. Teamwork can be defined as the process by which a group of individuals work together to accomplish specified goals. Health care contains a variety of different kinds of teams, most of them substantially interdisciplinary. Thus, achieving effective perioperative teamwork requires a deliberate multidisciplinary approach. Some teams work in close physical proximity (e.g., an operating room team), whereas others work in a distributed manner, coordinating effort through various means of communication. Teamwork is equally important for nonclinical hospital teams, for administrative problem solving, and for creating effective infrastructures that support clinical work.
An HFE maxim is that “design trumps training.” Although training personnel in teamwork and having them practice its tenets under stressful conditions are clearly useful, it is better to create work processes, tools, and a culture that organically foster and reinforce appropriate teamwork behaviors, which will then be understood naturally as “the way we do things around here.”
This may explain the widespread experience of failing to achieve the desired outcome improvements from healthcare organizations’ investments in “teamwork training” and related endeavors. We believe that this is often due to the choice of superficial training interventions or weak implementations. We have also learned that successes from other hazardous industries do not translate directly to health care without adaptation. Health care is not “just like aviation” (or nuclear power, aircraft carriers, etc.); it has a fundamentally different structure and organization, a different culture, and unique attributes that mandate a customized approach to teamwork improvement.1
Fortunately, during the last 20 yr, a growing literature on effective teamwork methods adapted or created specifically for health care has emerged.12,13 One tactic to enhance teamwork is to deliberately include an appropriately designed teamwork or interdisciplinary communication component in every perioperative quality-improvement initiative so as to steadily inculcate the necessary teamwork skills, behaviors, and beliefs in all perioperative personnel over time. Fully engaging surgeons is critical to this approach.
The Role of Simulation in Patient Safety
For over 70 yr, one of the most important HFE tools for addressing quality and safety in dynamic settings of high intrinsic hazard has been simulation. The ability to recreate most aspects of clinical situations with reasonable realism is a powerful technique that can be applied in many different ways. Education and training are the most common applications,14 but simulation is also used for performance assessment, pilot testing new clinical processes and safety interventions, assessing the safety and usability of new technologies, and understanding why adverse events occur. Thus, simulation is a unique and powerful methodology to be wielded by clinicians and by experts in HFE or education. It is the hard work and wisdom of these people, not the technology or technique itself, that permits simulation to help advance patient safety.
The Role of HFE in Quality Improvement
Although the focus of our comments is on patient safety, we should also mention the important role of HFE in quality improvement. As defined by the Institute of Medicine in its 2001 report, Crossing the Quality Chasm, high-quality care is not only safe but also effective, efficient, timely, equitable, and patient-centered. When these quality attributes are systematically addressed and improved through HFE, yielding the desired changes in organizational culture, training, processes, and systematic learning, patient safety will usually improve as well.
As one example of how HFE can contribute to improving other attributes of care quality, properly applied HFE tools and methods can decrease process variation and thus improve system reliability. A characteristic of high-reliability organizations15 is to standardize wherever possible, while remaining flexible and open to change as individual circumstances require, or when new evidence becomes available.1 This balance is especially necessary in health care, where the variability of patients and their clinical situations is enormously high in comparison with that of other industries. In high-reliability organizations, decision making devolves to those with the best information, giving overt permission to front-line teams to vary from the standard when they believe that a specific situation requires such deviation to maintain safety or process quality. They are also empowered to come up with better ways of doing things. They remain accountable—they must justify their deviation from the standard. But when their deviation is shown by empirical evidence to be superior, it can become the new standard.
We have delineated some of the most important concepts and approaches for improving patient safety in health care, with an emphasis on care delivered or influenced by anesthesiologists. The new world order of health care calls for a focus on maximizing population health while limiting costs. We find repugnant the notion that a cost of improving the health of the many will be unnecessary and preventable harm, even death, for a few. The vision statement of the Anesthesia Patient Safety Foundation is “That no patient shall be harmed by anesthesia [care].” We believe that the successful application of HFE can significantly improve patient safety and care quality while also lowering healthcare costs. Finally, because anesthesiologists are already trained to be “systems thinkers” and problem solvers, with additional training in HFE they can be uniquely positioned to continue to lead in patient safety, not just in the perioperative arena but throughout our dysfunctional healthcare delivery system.
Dr. Weinger’s effort was supported by the Department of Veterans Affairs Tennessee Valley Healthcare Systems’ Geriatric Research Education and Clinical Center (Nashville, Tennessee), by the Veterans Administration Health Services Research and Development (HSRD AF-06-085, Washington, D.C.), by the Agency for Healthcare Research and Quality (R18-HS020415, Rockville, Maryland), and by institutional resources of the Vanderbilt University School of Medicine (Nashville, Tennessee). Dr. Gaba’s effort was supported by the Department of Veterans Affairs Palo Alto Medical Center (Palo Alto, California) and by the Stanford University School of Medicine (Stanford, California).
Both authors are members of the Executive Committee of the Anesthesia Patient Safety Foundation (Indianapolis, Indiana) and have previously received or are currently receiving grant funding from the foundation for patient safety–related research.
Cooper JB, Gaba DM, Liang B, Woods D, Blum LN: National Patient Safety Foundation agenda for research and development in patient safety. Available at: http://www.npsf.org/wp-content/uploads/2011/10/Agenda_for_RD_in_Patient_Safety.pdf. Accessed November 16, 2013.