“Wrong surgery” is defined as wrong-site, wrong-procedure, or wrong-patient surgery, with an estimated incidence of up to 1 per 5,000 cases. In response to national attention on wrong surgery, our objective was to create a care redesign intervention to minimize the rate of wrong surgery.
The authors created an electronic system using existing intraoperative electronic documentation to present a time-out checklist on large in-room displays. The time-out was dynamically interposed as a forced-function documentation step between “patient-in-operating-room” and “incision.” Time to complete documentation was obtained from audit logs. The authors measured the postimplementation wrong surgery rate and used Bayesian methods to compare the pre- and postimplementation rates at our institution. Prior probabilities were selected using wrong surgery rate estimates from observed performance reported in the literature (4.24 wrong surgeries per 100,000 cases).
No documentation times exceeded 5 min; 97% of documentation tasks were completed within 2 min. The authors performed 243,939 operations over 5 yr using the system, with zero wrong surgeries, compared with 253,838 operations over 6 yr with two wrong surgeries before implementation. Bayesian analysis suggests an 84% probability that the postimplementation wrong surgery rate is lower than baseline. However, given the rarity of wrong surgery in our sample, there is substantial uncertainty. The total system-development cost was $34,000, roughly half the published cost of one weighted median settlement for wrong surgery.
Implementation of a forced-completion electronically mediated time-out process before incision is feasible, but it is unclear whether true performance improvements occur.
Surgical never events, including wrong procedure, wrong site, or surgery on the wrong person, remain a persistent problem in operating rooms throughout the world
Use of checklists has dramatically reduced the frequency of injury or death from complications that had previously been frequent
Checklists have minimal impact on operating room efficiency and can reduce costs but often suffer from variable and unreliable use
Implementation of a mandatory, electronic time-out process before incision is feasible, inexpensive, and has minimal impact on operating room efficiency
Given the rarity of wrong surgery, there is substantial uncertainty about whether true performance improvements occur with this approach
SURGICAL never events remain a persistent problem in operating room (OR) suites throughout the world. Roughly half of surgical never events resulting in indemnity payments in the United States can be grouped conceptually as “wrong surgery”1: wrong procedure, wrong site, or surgery on the wrong person. Wrong surgery never events may be amenable to a checklist intervention to reduce their occurrence.
Two notable applications of checklists dramatically reduced the frequency of injury or death from complications that had previously been frequent.2,3 More recently, studies have challenged the magnitude of checklist-associated complication reductions: some found no decrease in the overall adverse event rate, while others still found significant reductions in overall and infectious adverse events.4,5 In the perioperative environment, the universal protocol, including preprocedure verification, site marking, and a hard-stop time-out (i.e., a checklist to review the surgical plan), was created and first widely promulgated by the Joint Commission (JC) after its 2003 wrong-site surgery summit to intercept wrong surgeries before incision.6,7 However, checklist interventions must be performed reliably to be successful, which depends on excellent reliability of the team.8–10 Examples of poor reliability include the following: teams do not initiate the checklist, teams perform the checklist from memory, team members respond without looking at the checklist, items are skipped or performed incorrectly or incompletely, and the checklist is initiated at the wrong time.8,11,12 “Fully implemented” checklists have minimal OR efficiency impact and can reduce the cost per surgical procedure13 but often suffer from variable and unreliable use.14,15
Estimates of wrong surgery incidence, and the methods used to derive them, vary; the incidence itself may be increasing.16 For example, indemnity payments from a large self-insurance trust indicated a rate of 1:112,994 cases (95% CI, 1:76,336 to 1:174,825).17 This estimate was critiqued as being too heavily influenced by hospital efforts to resolve these events informally to avoid the stigma of reporting. The accompanying editorial asserted a rate of 1 wrong surgery per 5,000 cases (i.e., 20 wrong surgeries per 100,000 cases) that had been reported confidentially by multiple community hospitals.18 Another approach is to extrapolate from surgeon self-reporting; for hand surgeons, this approach yields a wrong surgery rate estimate of 1:27,686 cases (i.e., 3.6 wrong surgeries per 100,000 cases).19 A review of regional blockade complications revealed 7 wrong-site blocks in 23,271 cases, for a wrong procedure rate of 1 per 3,324 cases (or 30.1 per 100,000 cases).20 A recent systematic review of previously published wrong surgery rates yields a median rate of 1 wrong surgery per 100,000 cases, with a range of 0 to 4 wrong surgeries per 100,000 cases.21
One recent wrong surgery rate estimate of 244 per year comes from a comprehensive search of the U.S. National Practitioner Data Bank from 1990 to 2010 for malpractice payments pertaining to surgical never events.1 Separately, a multicenter retrospective analysis demonstrated that about 12% (15 payments in 130 events) of patients receive indemnity payments when exposed to adverse surgical events.22 Combining the indemnity payment frequency (244 per year)1 with the payment rate (12%)22 yields an overall estimate of 2,117 wrong surgeries per year in the United States. Applying this rate to the estimated 50 million operations performed in the United States per year yields an incidence of 1 wrong surgery per 23,600 operations, for a rate of 4.24 wrong surgeries per 100,000 cases. This is well in line with figures reported from other methodologies. This estimated rate was used to inform the prior probability estimate of wrong surgery rates in a Bayesian analysis of the before and after rates for a rare “never” event targeted by a care redesign quality improvement project.
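The arithmetic behind this combined estimate can be reproduced directly. The sketch below uses the figures quoted above; small differences from the published 2,117 and 4.24 reflect rounding of the published inputs.

```python
# National wrong surgery rate estimate, combining published figures.
payments_per_year = 244        # wrong-surgery malpractice payments per year (ref 1)
payment_rate = 15 / 130        # ~12%: fraction of adverse events with an indemnity payment (ref 22)

events_per_year = payments_per_year / payment_rate        # ~2,100 wrong surgeries/yr
operations_per_year = 50_000_000                          # estimated annual U.S. operations

rate_per_100k = events_per_year / operations_per_year * 100_000   # ~4.2 per 100,000
cases_per_event = operations_per_year / events_per_year           # ~1 per 23,600 operations
print(round(events_per_year), round(cases_per_event), round(rate_per_100k, 2))
```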
In early 2010, a wrong-sided surgery occurred at our hospital. This event, coupled with substantial national attention focused on wrong surgery and the value of checklists in 2010, galvanized the perioperative leadership to redouble its efforts around compliance with the JC Universal Protocol. We set out to develop a system that would force compliance with initiating and completing the hard stop time-out, yet fit into the OR workflow. We applied electronically automated process monitoring and process control approaches23 to a time-out checklist combined with a forced-function documentation completion process.
Our development and implementation of the electronic time-out was an attempt to reduce the risk of a rare but important complication: wrong surgeries. The purpose of our report is to demonstrate, in the population of all OR patients, the feasibility of an electronic hard-stop time-out, to assess its impact on OR workflow, to assess the rate of compliance with the new system, and to provide an approximate cost of implementation. We also tested whether the electronic time-out reduced the rate of wrong surgeries in all OR patients relative to a retrospective cohort of all OR patients treated before implementation.
Materials and Methods
Local Setting and Problem
We undertook a project to reduce the risk of wrong surgery in the perioperative system of a large academic medical center. Vanderbilt University Medical Center (VUMC) currently operates a total of 75 main campus ORs: 53 adult ORs in one contiguous space, 19 pediatric ORs in a dedicated children’s building, and 3 surgery center ORs in a third location. The Vanderbilt Perioperative Enterprise is governed by a multidisciplinary committee with nursing, anesthesiology, and surgery leadership representation. The perioperative committee functions collaboratively and regards representation of Vanderbilt patients’ and perioperative personnel’s safety interests as a core mission.
The Perioperative Enterprise embraced the JC Universal Protocol,7 including the hard-stop time-out immediately before incision/procedure start. Vanderbilt safety surveys conducted in 2009 were reassuring about staff attitudes and perceptions about safety and the safety culture, indicating their motivation to support and advance new safety projects. Before the electronic time-out was implemented, large (60 cm × 90 cm) laminated placard checklists of the time-out procedure were prominently posted in every OR to facilitate process compliance. However, we commonly noted that not all teams remembered to initiate or complete the hard stop time-out before surgical incision.
Each OR at VUMC uses a computerized documentation system to create the patient’s perioperative electronic medical record and facilitate documentation of both patient care and administrative elements. All perioperative electronic documentation is entered into the same application suite, known as the Vanderbilt Perioperative Information Management System (VPIMS). VPIMS was developed at Vanderbilt, and the Perioperative Enterprise houses an ongoing, active development effort to extend and improve the VPIMS suite’s performance. (A version of VPIMS is available as a commercial product.24 None of the authors, nor Vanderbilt, has any relationship with the vendor.)
Our institution has had large-format, 40″ diagonal liquid-crystal displays (LCDs) installed in every OR since before 2010. These are colloquially called “electronic whiteboards” in homage to the white dry-erase boards they replaced. Their content is served automatically from VPIMS as a module within the suite. The electronic whiteboards deliver multiple levels of patient- and case-related information deemed by perioperative care team members to be relevant and meriting widespread display.
These dynamic (i.e., automatically updating throughout the case) electronic whiteboards are mounted in every OR. The information on the electronic whiteboard is either displayed from the nursing documentation computer onto the electronic whiteboard as a second monitor or run independently in a computer embedded in the LCD electronic whiteboard display. These have traditionally shown team member names, patient name, primary diagnosis, the operation to be performed, and a medical history abstract. The display has evolved over time and now includes a notification that critical lab values exist, blood product warnings, patient allergies, medications, a high-level patient problem list, OR personnel identities, and between-case throughput metrics. Data that are deemed critical elements, such as patient allergies, are always displayed. The display of other data may change based on the phase of the case, and it may also change automatically as routine clinical documentation tasks that serve as display-change triggers are completed contemporaneously.25 Similar displays have been developed and demonstrated at other institutions, supporting the generalizability of this approach.25
Previously, we have described a VPIMS-based OR communication system26 to push information to providers. We have also developed a hierarchy of information provision to improve quality in the OR.23 Historically, most healthcare information is concealed, accessible to only those who seek out the data; however, creating transparency makes the information more readily available to the team. Integration brings information from disparate systems together automatically and displays it for ready consumption with contextual cues from the totality of the information provided. Augmented vigilance actively pushes key information to the correct provider automatically and in proper temporal context to be useful. In our conception, decision support enhances the value of information by delivering recommendations along with the integrated information. Automated process monitoring and process control adds the most value of all, by combining augmented vigilance with decision support and requiring a response from the receiving clinician.23
In related work, we have shown that automated process monitoring and process control produces lasting improvements in process performance27 and reduces the rate of infrequent but undesirable events.28 We have also demonstrated that automated process monitoring and process control can have a direct positive return on investment, improving the financial case for implementation.29 However, we had never built a forced completion step into the workflow, so our previous interventions could reduce undesired clinical events but not eliminate them.
Intervention Development and Initial Deployment
The initial development of the electronic time-out application was undertaken in collaboration with OR personnel. To encourage compliance with documentation tasks, the electronic clinical documentation workflow is organized chronologically with tabs running across the top of the documentation screen. Significant case time milestones are persistently shown in a right-sided panel on the display (fig. 1). Because a successful time-out is time sensitive (completed before procedure start), we inserted the time-out documentation step between the “anesthesia ready” (i.e., time after induction of anesthesia has concluded and the patient has been released to the surgical team) and “procedure start time” (i.e., incision) milestones. Collaborating with OR staff, we included workflow restrictions to encourage documentation completion. Specifically, we designed the system so that the documenting nurse could not record that incision had occurred or the procedure started until the time-out documentation was completed, and without incision time documented, “cut-to-close” nursing documentation could not be completed.
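The forced-function constraint described above can be sketched as a simple state check. The class and method names below are ours, for illustration only; this is a minimal sketch of the ordering rule, not the VPIMS implementation.

```python
class CaseDocumentation:
    """Minimal sketch of the forced-function milestone ordering:
    incision cannot be documented until the time-out is complete."""

    def __init__(self):
        self.anesthesia_ready = False
        self.timeout_complete = False
        self.incision_documented = False

    def mark_anesthesia_ready(self):
        self.anesthesia_ready = True

    def complete_timeout(self):
        # The time-out step is interposed after "anesthesia ready."
        if not self.anesthesia_ready:
            raise RuntimeError("Time-out is interposed after 'anesthesia ready'")
        self.timeout_complete = True

    def document_incision(self):
        # The hard stop: no incision documentation without a completed time-out.
        if not self.timeout_complete:
            raise RuntimeError("Cannot document incision before time-out is complete")
        self.incision_documented = True
```

Because downstream “cut-to-close” nursing documentation requires the incision time, blocking the incision milestone transitively blocks the rest of the record, which is what makes the time-out a true forced function.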
Figure 2 shows a sample of the steps to complete the time-out process, as seen on the documenting nurse’s computer. In the initial iteration of the project, the entire checklist also appeared on the room-view large-format display, with all items colored red. As the documentation steps were marked “completed,” the corresponding item’s checkbox and text displayed in the OR view turned from red to black, and an “x” in the checkbox preceding each item changed to a checkmark. Figure 3 illustrates the preintervention state and the first-generation electronic time-out in use.
We deployed the electronic time-out checklist in Vanderbilt’s 63 main ORs beginning July 30, 2010, and presently use the checklist in our 68 main ORs (as of April 5, 2015, the date of initial manuscript writing). These OR locations include two adult OR suites and one Children’s Hospital OR suite, described previously. The patient selection criterion was any patient having a surgical or endoscopic procedure performed in these OR locations. VUMC does not utilize regional anesthesia block rooms but, rather, performs regional anesthesia at the holding room bedside; consequently, electronic time-outs were not implemented for regional anesthesia procedures. We attempted to address bias by including every patient operated on in the ORs where the system was deployed, and the system was deployed in every main-campus OR. A negative outcome was defined as one where a wrong surgery was performed. A wrong surgery was defined as a wrong operation, a wrong site, or a wrong patient, and was identified through Vanderbilt’s normal quality assurance reporting processes, as well as informal sources. Given the attention focused on OR “never” events and the number of people involved (at least two nurses, two anesthesia providers, one surgeon, and one surgical assistant), it is difficult to envision a wrong surgery going undetected.
We conducted a secret shopper study of this electronic time-out system.30 The results indicated improved checklist performance, but these secret shoppers noted that it was possible to click through to a completed time-out without fully paying attention to each step. Thus, we undertook an iterative improvement of the application.
We developed a second-generation electronic time-out application, released October 19, 2011, with a visual presentation designed to focus the in-room team on the individual steps of the time-out, one at a time. Specifically, the application presents each step individually as the only element visible in the center-screen workspace of the electronic whiteboard (fig. 4). The documenter’s input interface is unchanged in the new version of the application.
The time-out currently comprises 13 questions. Each question must be marked “confirmed,” indicating that the item was brought up for discussion and considered by the team. Questions marked “not confirmed” can be revisited through the interface, redisplayed on the whiteboard, and addressed. If the time-out is concluded with any question marked “not confirmed” or unanswered upon submission, the time-out is cancelled, requiring the team to correct the issue and perform the time-out again from the beginning.
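The all-or-nothing submission rule can be sketched as a single validation over the question responses (the function name and response encoding are ours, for illustration):

```python
def submit_timeout(responses):
    """Sketch of the all-or-nothing submission rule: every question must be
    'confirmed', or the entire time-out is cancelled and must be repeated
    from the beginning.

    responses: dict mapping question text -> 'confirmed', 'not confirmed', or None
    """
    if any(status != "confirmed" for status in responses.values()):
        return "cancelled"   # team corrects the issue and restarts the time-out
    return "complete"
```

Note that an unanswered question (encoded here as None) cancels the time-out exactly as a “not confirmed” answer does, matching the rule stated above.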
The time-out is initiated by the circulating nurse (or sometimes the surgeon) as an announcement to the entire room to secure attention. The team then attends to and checks off each item, one-by-one, with call and response input from team members as their shared items are addressed.
Vanderbilt is a level-1 trauma center, so we also developed a three-question abbreviated time-out, as shown in figure 5. Some patients present to the OR so emergently that missing information must not delay intervention, so marking questions as “not confirmed” in this alternate pathway does not prevent time-out completion. This alternate time-out is only presented to the OR team when a patient is scheduled in the OR scheduling system as an emergency case. Assignment of emergency status is the prerogative of the surgeon and reflects the surgeon’s assessment that operative intervention is required so urgently that routine steps such as preanesthetic optimization, patient identification, or site marking must not be allowed to delay the operation if irreparable harm is to be avoided. These emergency surgery patients are also brought directly to the OR by the surgeon, never breaking the chain of patient custody.
Evaluation and Analysis
Retrospective review of the system’s performance was approved by the Vanderbilt University School of Medicine’s Institutional Review Board (Nashville, Tennessee). We included all 243,939 cases performed between July 30, 2010, and April 5, 2015, in the postimplementation group. We measured the frequency of final time-out entries that occurred before the documented procedure start time (incision), which represent successfully completed time-outs. Our analysis revealed that some cases were cancelled before incision (which would not require a time-out) but had nursing documentation performed. Further, we identified cases where a time-out appeared to have been performed after the procedure start. A review of audit trails showed that all of these were due either to a second time-out with a second surgical team or to aberrant (accidental or erroneous) selection of the time-out function after incision by the documenting staff, which records a second time stamp that cannot be reverted to the original. A final possibility is rolling the stated procedure start time back to before the time-out; this behavior would have been recorded in our audit logs and was not observed. These scenarios do not represent deliberate noncompliance but are instead either acceptable, expected alternative patient care and documentation pathways or accidents. We computed the duration of time-outs using our audit logs and categorized them as less than 2 min or greater than or equal to 2 min.
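The audit-log duration categorization described above amounts to a simple binning of time stamps. A minimal sketch (the function name is ours):

```python
from datetime import datetime

def timeout_duration_category(start: datetime, end: datetime) -> str:
    """Categorize a time-out's duration from audit-log time stamps,
    using the report's two bins: less than 2 min vs. 2 min or more."""
    minutes = (end - start).total_seconds() / 60
    return "<2 min" if minutes < 2 else ">=2 min"
```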
The number of wrong surgeries was calculated by counting cases where, as part of routine end-of-case documentation, the nurse recorded a difference between the site, patient identity, or procedure performed as documented in the medical record and the scheduled case, consent, and preoperative documentation. Further, we queried our quality assurance, institutional event reporting, and risk management systems for wrong surgery reports. Finally, we queried the risk management office for reports of wrong surgeries from July 1, 2004, to June 30, 2010 (6 yr, 253,838 cases), before the deployment of the electronic time-out system. During the study period, there were no changes in the organizational structure, technology, or key personnel of the institutional reporting systems.
Finally, we calculated the cost to develop the application from work process documentation maintained by the VPIMS developer group.
We report descriptive statistics for the rate of wrong surgeries. National wrong surgery rates, estimated from published sources as described in the Introduction, informed the prior incidence of wrong surgery. The prior distribution of wrong surgery incidence was selected from the class of β distributions such that there was 5% prior probability that the incidence was less than 1 per 250,000 procedures. Wrong surgery counts were modeled as binomial random variables. Thus, the posterior distributions for the pre- and postimplementation wrong surgery rates were also β-distributed.31 Posterior pre- and postimplementation rates and their ratio were summarized using the posterior mean, equal-tailed 95% credible intervals, and other posterior quantiles, where summaries of the rate ratio were computed using a Monte-Carlo method. Statistical analyses were implemented using the R functions qbeta (β quantiles) and rbeta (β random sampling).32 For Monte-Carlo summaries, 1 × 10⁶ samples were drawn from both the pre- and postimplementation posterior (β) distributions. The mean and quantiles of the ratios (post/pre) of sampled values were then computed.
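A minimal sketch of this conjugate β-binomial analysis, using only the Python standard library in place of R, is shown below. The prior parameters (a ≈ 1.3, b ≈ 30,660) are our own numerical calibration to the constraints stated in the text (mean ≈ 4.24 per 100,000; roughly 5% prior probability below 1 per 250,000), not values taken from the original analysis, so the summaries agree only approximately with the published ones.

```python
import random

random.seed(0)

# Illustrative prior: Beta(a, b) calibrated (by us) to have mean ~4.24e-5
# and ~5% prior probability that the rate is below 1/250,000.
a, b = 1.3, 30_660

# Observed data: wrong surgeries and case counts before/after implementation.
pre_events, pre_n = 2, 253_838
post_events, post_n = 0, 243_939

# Beta prior + binomial likelihood -> Beta posterior (conjugacy).
pre_post = (a + pre_events, b + pre_n - pre_events)
post_post = (a + post_events, b + post_n - post_events)

# Monte Carlo summary of the posterior rates and their comparison.
N = 100_000
pre_draws = [random.betavariate(*pre_post) for _ in range(N)]
post_draws = [random.betavariate(*post_post) for _ in range(N)]

pre_mean = sum(pre_draws) / N * 100_000    # per 100,000 cases, ~1.2
post_mean = sum(post_draws) / N * 100_000  # per 100,000 cases, ~0.5
p_improved = sum(p2 < p1 for p1, p2 in zip(pre_draws, post_draws)) / N

print(f"pre  ~ {pre_mean:.2f} per 100k")
print(f"post ~ {post_mean:.2f} per 100k")
print(f"P(post < pre) ~ {p_improved:.2f}")  # ~0.85 here; the paper reports 84% with its exact prior
```

The same draws can also be used for the post/pre rate-ratio summaries described above, by taking the mean and quantiles of the elementwise ratios.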
Results
We implemented electronic time-outs for cases in our 69 main campus ORs on July 30, 2010. The second-generation time-out procedure was released to all ORs on October 19, 2011. The time-out procedure and application continued to run beyond the end of the study period, April 5, 2015, by which time there were 75 main campus ORs. All 243,939 main campus OR cases between July 30, 2010, and April 5, 2015, were subject to the electronic time-out procedure, without exception. Of those time-outs, 97% were completed within 2 min, and all time-outs were completed within 5 min.
A time-out failure can be defined as a time-out that did not occur for a case actually started by the surgeon/proceduralist (as opposed to a case cancelled before procedure start) or a time-out that occurred after the start of the procedure. Time-out failure rates can be estimated by review of records for performance-tracking purposes or by “secret shopper” observation. In our system, failures occurred at a rate between 1 per 1,250 cases (VUMC, Center for Clinical Improvement, performance tracking data) and 1 per 140 cases observed in detail (Unpublished results: direct observations of surgical time-outs performed on 140 nonemergent surgeries between September 1, 2012, and April 30, 2013; Catherine Bulka, MPH, Nashville, Tennessee). We reviewed the audit logs of time-out failures documented by the Vanderbilt Center for Clinical Improvement. Because of the system’s design features, apparent time-out failures could only have arisen from time-outs documented after procedure start: (1) the system’s forced functions require an “In Room” time entry; (2) the time-out time stamp can only be the current time, since time-outs are performed and documented in real time; and (3) the time-out process must be documented before a “procedure start” time value can be entered.
Upon audit log review, every time-out time stamp that followed the procedure start time was accompanied by a time-out completion time preceding the procedure start time. In other words, these cases had two documented time-outs: one preceding procedure start and a second documented after procedure start. In some instances, it was apparent that a second time-out was intentionally performed. In the others, it was apparent that the time-out button had been accidentally clicked after procedure start. Because of the necessary forced functions, this post-procedure-start time-out entry could not be reversed; as a result, the system forced the completion of a second (superfluous) time-out documentation sequence. The review revealed that, regardless of the reason for the second time-out, preprocedure time-outs occurred in all cases.
Between July 30, 2010, and April 5, 2015, there were no wrong surgeries (0 in 243,939 cases) in the ORs where the electronic time-out was deployed. In the 6 full years before deployment, there were two wrong surgeries, as defined, known to the risk management team (2 in 253,838 cases). We conducted a Bayesian analysis of the probability that the postimplementation rate is lower than the preimplementation rate, using estimates reported in the literature (as reviewed in the Introduction) to select a reasonable prior. We set the prior mean at 4.24 wrong surgeries per 100,000 cases and then allowed a 5% chance that the never-event rate is smaller than 1 per 250,000 (roughly 2 among all the cases observed in our sample), consistent with the low-end estimates from the literature. This prior also allows about a 0.5% chance that the never-event rate is greater than 20 in 100,000 (consistent with the 1/5,000 rate asserted as a high-end estimate). Using these observed and prior incidences, the posterior estimate for the July 1, 2004, to June 30, 2010, period is 1.17 wrong surgeries per 100,000 cases (95% CI, 0.27 to 2.71), whereas the estimate for July 30, 2010, to April 5, 2015, is 0.48 wrong surgeries per 100,000 cases (95% CI, 0.03 to 1.58). Thus, the estimated wrong surgery rate is about 41% lower during the postimplementation period (post/pre ratio, 0.59; 95% CI, 0.02 to 2.62). However, given the rarity of these events and the total numbers of cases, there is substantial statistical uncertainty. As a consequence, Bayesian analysis suggests an 84% probability that the postimplementation wrong surgery rate is lower than baseline, but a 16% posterior probability that it is actually greater than baseline.
The total cost of the whiteboard and first-generation time-out application development was $23,800, representing 35 person-days. The second-generation time-out application cost an additional $10,200, representing 15 person-days. The time-out project was deployed on hardware that had previously been purchased for electronic whiteboards. To implement this system without preexisting computer hardware, adding a 40″ monitor with an embedded operating system (Samsung, South Korea), monitor mount, power and Ethernet cable installation, and labor would cost approximately $2,500 per room.
Discussion
The implementation and reliable use of an electronic time-out checklist is an example of technology developed to support and enforce a thoughtfully developed perioperative systems design element. Perioperative systems design describes a rational approach to managing the convergent flow of patients from disparate physical and temporal starting points (frequently home), through the OR, and then to such a place and time (home or hospital bed) where future events pertaining to the patient have no further impact on OR operations.33 Perioperative systems design naturally attends to quality and safety, as these are foundational to smooth workflow. In this project, our goal was to create a system wherein a hard-stop time-out reliably occurred before incision and procedure start, seeking to reduce or eliminate wrong surgery.
After implementation, our observed wrong surgery rate is 0 in 243,939 cases, but the previous 6 yr also had excellent performance, with only two documented wrong surgeries in 253,838 cases. Thus, we cannot conclude (with 95% posterior probability) that the electronic time-out process actually improved performance with respect to the targeted patient outcome. However, we would have expected between 10 and 73 wrong surgeries among the 243,939 operations reported if national estimates of wrong surgery incidence are correct. The postimplementation incidence of wrong surgery at our institution is smaller than these estimates with greater than 99.9% posterior probability. In addition, we now have documentation and an audit trail of the time-out process for every case performed under the new system.
Our project and the empirical observations arising from it have several important limitations. First, it was conceived as an observation/intervention study, with no control arm; purely secular changes in performance elsewhere in the system could explain the entire result. We also created a system that still allows less-than-perfect execution. Initially, time-outs could be documented after the actual beginning of surgery, with the stated times subsequently rolled back to before the documented start of surgery. Because post hoc documentation and editing of an intended “live documentation” time stamp is outside the norm for our documentation practices, the second iteration of the application (implemented in October 2011) removed the ability to edit the time-out time stamp. Nonetheless, we achieved the observed process and outcome performance (no wrong surgeries) despite these initial design defects.
Our main result arises from routine documentation by nursing staff in the OR, and wrong surgeries could be missed or even misrepresented. Our experience at Vanderbilt with previous wrong surgery events is that they inevitably come to our attention through the OR quality improvement system.
We initiated this electronic time-out project in response to a single, notable event at our institution. When the project began in 2010, we were not able, for legal reasons, to widely report local wrong surgery rates from the then-recent years. Hence, we turned to national estimates, which can be imprecise and might not apply to the local setting. However, for a center considering adopting our approach, available nationwide performance estimates are likely more useful comparators than our single-institution rate.
Electronically Framed Time-out in Context
While wrong surgeries are rare, they degrade the effectiveness of the healthcare system. The authors of the works previously cited point out that wrong surgery often disfigures or kills the patient1 and is psychologically devastating to the OR team. Thus, wrong surgery is still a significant health outcomes problem despite years of effort to reduce its incidence.
The electronic time-out component of the system has been well accepted, fits seamlessly within the clinical workflow, and is scalable (questions can be added as needed). We can also construct a plausible case that the application has been cost saving, since even one typical indemnity payment for a wrong surgery (of which, per national estimates, we might have expected at least 10) exceeds the cost of the system. Because the system has been part of routine workflow and documentation for the past 5 yr, we can be confident that its implementation has been sustained and is sustainable.
Our project implementation leveraged existing infrastructure, including computers and large LCDs already installed in the OR, so the overall cost was low. To estimate the cost of installation in a more typical OR suite, we assumed that all modern ORs have at least one computer capable of running the electronic time-out application. Thus, the incremental cost of installation, presuming a commercial product could be identified, would be the cost of the large-format LCDs plus the cost of the software license. To equip our ORs fully, assuming no large LCDs before the project, the cost would be under $250,000 for 69 ORs. The weighted median cost of a wrong surgery (from Mehtsun1) is $75,552 per instance, which is more than our actual implementation cost ($34,000 for software development). We considered the additional expense of training to improve compliance; however, because the time-out was a “hard stop” required to document the remainder of the case, and notification of the system change was completed in Vanderbilt’s customary fashion (dissemination via e-mail and regular meetings), there was no real additional cost. In other words, Vanderbilt could recover the marginal cost of electronic time-out implementation by preventing a single complication. Even for a hospital of Vanderbilt’s size, performing roughly 50,000 cases per year, starting without large-format displays but experiencing four wrong surgeries per 100,000 cases, the time to return on investment would be approximately 2 yr. On the other hand, if the rate were 1 per 100,000 cases, it might be impossible to demonstrate any improvement.
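The return-on-investment arithmetic in this paragraph can be checked as follows (a sketch using the figures quoted above; the "approximately 2 yr" in the text corresponds to using the conservative $250,000 cost bound):

```python
# Back-of-envelope return-on-investment check for the figures quoted above.
cost_per_or = 2_500            # display hardware + installation per room
n_ors = 69
software_dev = 34_000
total_cost = cost_per_or * n_ors + software_dev    # $206,500, under the $250,000 bound

cases_per_year = 50_000
baseline_rate = 4 / 100_000                        # assumed wrong surgeries per case
settlement = 75_552                                # weighted median payment (ref 1)

yearly_savings = cases_per_year * baseline_rate * settlement   # ~$151,000/yr
years_to_roi = 250_000 / yearly_savings            # ~1.7 yr, i.e., approximately 2 yr
print(total_cost, round(yearly_savings), round(years_to_roi, 1))
```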
Perioperative quality improvement at Vanderbilt frequently focuses on improving communication by creating information transparency and improving reliability through automated process monitoring and process control.23,26–28,34 The electronic whiteboard and the input screen of the documenting nurse’s computer provide the transparency. Process monitoring and control use software written by the VPIMS developer group to compare actual clinical progress against a model of expected events. The process model itself can often be very simple to state—e.g., “performance of a preoperative time-out should be documented,” “the patient should go only to their scheduled OR,”34 or “the blood pressure should be documented before induction of anesthesia.”35 Process exceptions should be promptly flagged to the right clinician—namely, the team in the OR. Our time-out approach harkens back to Deming: provide the typical clinician with the means (better tools, materials, methods, and systems) to reliably achieve the desired level of performance.
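The model-based process monitoring described above can be sketched in a few lines: compare the events documented so far against a list of expected events and flag any exceptions to the in-room team. This is a minimal illustration under assumed event names; it does not represent the VPIMS software itself.

```python
# Hedged sketch of model-based process monitoring: expected events are
# stated as a simple model, and any step missing from the documentation
# before incision is flagged as a process exception.
# Event names are illustrative, not drawn from VPIMS.

EXPECTED_BEFORE_INCISION = ["patient_in_room", "time_out_documented"]

def process_exceptions(documented_events):
    """Return the expected steps not yet documented before incision."""
    return [step for step in EXPECTED_BEFORE_INCISION
            if step not in documented_events]

# A case missing its time-out is flagged before incision can be documented.
print(process_exceptions({"patient_in_room"}))  # ['time_out_documented']
```

Treating the expected-event list as data, rather than hard-coding each rule, is what makes such a system scalable: new checklist questions or process rules can be added without changing the monitoring logic.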
In conclusion, implementation of an electronically mediated time-out process before incision, with workflow constraints to force documentation, was associated with zero wrong surgeries—fewer than would be expected from national rate estimates—but there was insufficient evidence of a difference from our own previous performance. Our cost of system development and implementation was less than one weighted median payment for wrong surgery1 and, hence, the system is likely to be at least cost-neutral.
The authors thank all of the application developers who worked on the electronic time-out application and all of the perioperative care team members who have diligently used it during development, implementation, and introduction to routine use.
Supported by operating funds of the Vanderbilt University School of Medicine, Nashville, Tennessee. Manuscript preparation was supported by funds from the Department of Anesthesiology, Vanderbilt University School of Medicine, Nashville, Tennessee. The work was also supported by the Foundation for Anesthesia Education and Research (Schaumburg, Illinois; to Dr. Wanderer) and the Anesthesia Quality Institute Mentored Research Training Grant–Health Services Research (Schaumburg, Illinois; to Dr. Wanderer). These funding sources did not direct the specific design of the computer application, the implementation of the application, or the study design. The study design and the analysis and interpretation of the data are solely the work of the authors, as was the decision to submit the work for publication.
The authors declare no competing interests.