Review of unusual patient care experiences is a cornerstone of medical education. Each month, the AQI-AIRS Steering Committee abstracts a patient history submitted to the Anesthesia Incident Reporting System (AIRS) and authors a discussion of the safety and human factors challenges involved. Real-life case histories often include multiple clinical decisions, only some of which can be discussed in the space available. Absence of commentary should not be construed as agreement with the clinical decisions described. Feedback regarding this article can be sent by email to Heather Sherman: firstname.lastname@example.org. Report incidents or download the AIRS mobile app at www.aqiairs.org.
Case 2015-1: ‘My Aircraft’ – ‘Your Aircraft’ – A Look at a Standardized Communication Case
A patient undergoing a neurosurgical procedure experienced mild hypotension. The attending anesthesiologist administered 10 mg ephedrine I.V. The resident, unaware of the administration of ephedrine, administered phenylephrine 100 mcg I.V. simultaneously. This resulted in mild hypertension with a blood pressure of 150/90 (baseline 90s/50s), which resolved spontaneously with no further treatment.
On the AHRQ common formats harm scale, this would be designated a “no harm event.” This team and this patient were lucky. The case highlights ongoing concerns with inadequate communication during perioperative care. Lingard et al.1 examined patterns of “communication failure” in the O.R. and defined four categories. The occasion category involves problems related to time and space: the right thing said to the right person but at the wrong time. Content failures describe communicative exchanges that include incomplete or inaccurate information: the wrong thing is said, perhaps to the right people at the right time. Purpose failures define situations in which questions were asked by one team member but not answered by other members of the team, prompting repeated and increasingly urgent requests. Finally, audience failures are situations in which the right content is communicated, but a key person (often someone who had essential information) is excluded. The case above exemplifies either a content failure (inadequate information transfer) or an occasion failure (the information had not yet been communicated). It also highlights that communication is always two-way: a message can fail in the sending or fail upon receipt (not heard or not understood). Of note, the authors found that 30 percent of intraoperative communication events failed to achieve their intended aim!
The Communication-Human Information Processing (C-HIP) model is a framework for structuring the stages involved as information flows from a source to a receiver, who then processes the information to produce an intended behavior (Figure 1).2 Once a message is received, the cognitive activities required are categorized into a sequence of information processing stages. The message must trigger the receiver’s attention, that attention must be maintained long enough for the information to be received, and the information must then be understood. To be accepted, the message must align with the receiver’s existing attitudes and beliefs (or, if it conflicts with them, must be persuasive enough to evoke the desired response). Finally, the recipient must be able to process the message and carry out the appropriate behavior. As one of our colleagues observed on a related issue, “not everything intended is said, not everything said is heard, not everything heard is understood, and not everything that is understood is acted upon” (Cynthia Shields, M.D., Chair, Department of Anesthesiology, Uniformed Services University. Personal Communication, 2015).
In a study by Oken et al.,3 communication was judged by experts to be a contributory factor in almost a quarter of all anesthesia non-routine events. A non-routine event is defined as any aspect of clinical care perceived by clinicians or observers as a deviation from optimal care for that specific patient in that clinical situation.
Situation awareness (SA) is another contributing factor. It refers to an individual’s or team’s ability to ascertain and then maintain the “big picture” of a clinical situation. According to the seminal work of Mica Endsley,4 situation awareness can be categorized into three levels: detection of a change in the situation (Level 1), diagnosis or understanding of the situation (Level 2), and prediction of how the situation will progress (Level 3). A full discussion of the importance of and impediments to situation awareness is beyond the scope of this article, but clearly the two clinicians in the case did not know what the other was doing (i.e., did not have SA), at least partly because of a communication failure.
Finding ways to improve communication (and thus safety) has been the focus of significant research and changes in clinical practice. Many of these efforts have focused on team behaviors implemented at specific, high-risk moments. Leonard and colleagues delineated some of these team-based tools and behaviors that improve communication.5 Structured briefings allow the team to update and maintain situation awareness. While standard practice in aviation, nuclear power and other high-risk industries, briefings are still uncommon in health care. Structured communication tools can help to facilitate briefings and handovers.6
Debriefings are another tool used by effective teams to reflect on and improve performance. In a debriefing, the entire team reviews what it did well, what the challenges were and what could be done differently next time. In a study of team learning during the adoption of minimally invasive cardiac surgery, debriefings were seen as a key success factor in the surgical team with the quickest learning curve and best clinical outcomes.7
The case above shows that when information is omitted or misunderstood during two-person communication, no matter how brief the exchange, serious clinical consequences can result.8,9 The Joint Commission (TJC) estimates that 80 percent of medical errors involve miscommunication during care transitions. TJC currently requires hospitals to implement a standardized, interactive process for handover communications. Again, a full discussion of the importance of and barriers to effective care transitions is beyond the scope of this article, but the concept that every action or request for action between two individuals should be considered a “micro-handover” is an important one.
The pre-procedure “Time Out,” as mandated by TJC, is likely the best-known formal team communication event designed for safety. Unfortunately, such formal, team-based communication methods will not prevent the type of patient safety event identified in the case above. We cannot have a Time Out or “briefing” before treating hypotension (although some clinicians have created structured communication processes around predictable high-risk moments such as coming off cardiopulmonary bypass).
As is frequently the case, we can learn much about real-time communication from the aviation industry. Increasingly, safety scientists are focusing on learning from success as much as from failure.10 A successful aviation case of dyadic team communication occurred during the emergency landing of US Airways Flight 1549 in the Hudson River in January 2009. Captain Chesley B. “Sully” Sullenberger and First Officer Jeffrey Skiles performed an emergency ditching after striking a flock of Canada geese three minutes after takeoff, causing both engines to fail. For those who have not heard this highly educational audio, see m.youtube.com/watch?v=jZPvVwvX_Nc. Listen for the remarkable calm of everyone’s voices.
The crew of Flight 1549 employed at least three effective communication techniques during this heroic event. First and foremost, they maintained a sterile cockpit. An FAA regulation prohibits “crew member performance of non-essential duties or activities while the aircraft is involved in taxi, takeoff, landing, and all other flight operations conducted below 10,000 feet MSL” (asrs.arc.nasa.gov/publications/directline/dl4_sterile.htm#anchor524636). Crowd control during codes or turning off the radio during the Time Out are examples of how we create a “sterile O.R.,” but perhaps we need to explicitly introduce this concept during induction and emergence as well.
Second, the crew used Call Outs. During a call out, the speaker clearly and loudly states his or her intended actions. This allows other team members to anticipate next steps, predict their roles in the evolving scenario, object if they think the plan is unsafe, and even consider potential hazards. At 15:27:23, Capt. Sullenberger took the controls, stating “My aircraft.” Later in the event, Sullenberger told controllers, “We can’t do it” and “We’re going to be in the Hudson,” making clear his intention to bring the plane down on the Hudson River, which occurred about three minutes later.
Had the attending in our case called out the administration of the ephedrine, the resident would probably not have given the phenylephrine. During anesthesia simulation courses, we routinely ask participants to “think out loud” because it allows the rest of the team to anticipate and respond appropriately. This is especially important during a crisis, when it is crucial for the entire team to know what each member is doing and thinking.
Finally, the crew repeatedly used what is perhaps the most important communication tool: Read Back (or check-back). Read back is a standard communication practice in other high-risk domains such as aviation and nuclear power. In a read back, the receiver of information immediately and explicitly restates the information he or she heard, confirming that what was just said was correctly understood. This gives the sender an opportunity to confirm that the message was correctly received and, if necessary, to correct any errors. Read backs occurred frequently during Flight 1549’s brief flight. One second after Sullenberger called out, “My aircraft,” Skiles responded, “Your aircraft,” indicating that he had heard, understood and was complying with the command. Nearly every communication between tower and aircraft was repeated throughout the ordeal. It is clear that the read back is as ingrained in the speech patterns of the crew as are “please” and “thank you.” Such ingrained behavior can only come from sustained practice and repetition.
Unfortunately, read back is not yet standard practice in medicine. When the surgeon asks for the O.R. table to be raised, how often do we respond with “O.K.,” or worse, give no verbal response and simply raise the table? The correct response is, “Raising the table.” By routinely practicing read back in all non-critical verbal exchanges, airline crews ingrain the practice so that it becomes second nature. We could learn from these behaviors.
Communication failures are a significant cause of preventable medical errors.11 The case presented here demonstrates the potential patient safety consequences of communication failures and the discussion outlines several opportunities for improvement in perioperative safety.
Our communication patterns are deeply embedded in the culture of medicine. They are strongly influenced by forces outside individual control (hierarchy, fatigue and production pressure, to name a few). The case presented here highlights the ongoing need for systems-based approaches to improve communication and patient safety. Simply telling clinicians to “communicate better” will be no more effective than telling them to “be more careful.” The aviation industry undertook a deep and broad approach to improve communication in the cockpit, including the development of Crew Resource Management (CRM), simulation-based training and assessment, and fundamental cultural changes. These interventions contributed to a remarkable safety record over the past four decades. Does medicine have the strength to follow this lead?