“…every step of digital quality improvement [the application of information technology to quality improvement], from design to implementation to assessment of outcomes, must involve an understanding of the tight interplay between technology and the care redesign process.”
“I do not fear computers. I fear the lack of them.”
In this issue of Anesthesiology, Ehrenfeld et al.1 present a technology-based care redesign project to reduce process variation in intraoperative glucose to improve health outcomes in diabetic patients. Electronic health record (EHR) and perioperative information management systems have proliferated over the past 20 yr. Clinical decision support tools have been implemented within these systems to provide anesthesia providers with near–real-time alerts and post hoc reports to enhance patient care processes, documentation compliance, and resource utilization.2 The critical features for successful clinical decision support systems are well known and typically summarized as the five rights: delivering the right information, to the right person, in the right intervention format, through the right channel, and at the right time in workflow.3,4 However, there exists a paucity of anesthesia decision support research, perhaps because of the inherent challenges with these studies and inadequate resources, expertise, or experience among anesthesiology practices to develop decision support research platforms.2,5
Low compliance and high variability in initiating and following institutional intraoperative blood glucose management protocols have been reported at several medical centers,5,6 while studies have shown that perioperative glycemic control decreases surgical site infection rates.7,8 Thus, Ehrenfeld et al.1 implemented an automatic clinical decision support system to identify diabetic patients, detect perioperative insulin administration, check for a recent glucose measurement, and prompt clinicians to obtain an intraoperative glucose measurement. Analysis of 15,895 cases revealed significant improvements in clinicians’ rates of glucose monitoring and insulin administration, as well as in patients’ rates of hyperglycemia in the recovery area. The authors reported no change in the rate of hypoglycemia after the clinical decision support implementation, and surgical infection rates decreased from 1.6% to 1.0% in a propensity-matched cohort analysis (P = 0.02). While Ehrenfeld et al.1 admit that their analysis does not conclusively show their system-level changes caused the drop in surgical site infections, their findings contrast with those from the study by Nair et al.,9 whose clinical decision support reminders to anesthesia providers improved compliance with institutional protocols, yet a significant improvement in glycemic parameters was not observed. Sathishkumar et al.10 reported improved intraoperative glycemic management behavior with a clinical decision support tool, yet an impact on patient outcomes was not reported.
Ehrenfeld et al.1 describe in detail how they carried out their project, and at first glance, the notion of implementing a decision support system of this nature seems straightforward: search for relevant patient criteria, create an alert, and then observe as providers’ behavior changes and patients’ outcomes improve. However, the authors used a proprietary, highly customizable perioperative information management system that is not used at the vast majority of other medical centers. This raises two important concepts to evaluate in any clinical decision support study: dissemination and generalizability.11 Some major vendors’ EHR systems may contain functionality constraints that will make recreating the authors’ success an extremely challenging or even impossible task. Yet, while the authors’ decision support tool cannot be disseminated easily, the study’s methods are valid, and its retrospective design does not diminish the study’s generalizability. The authors addressed the inherent limitations of their study design by acknowledging possible residual confounding and noting that no educational efforts were initiated or ongoing during the study period.2 The study’s results should encourage clinicians to envision how to leverage technology in a similar fashion within quality improvement initiatives at their own institutions.12
Indeed, the application of information technology to quality improvement—also termed digital quality improvement—has the potential to both augment and accelerate traditional quality improvement processes in anesthesiology.12,13 Clinicians should realize that every step of digital quality improvement, from design to implementation to assessment of outcomes, must involve an understanding of the tight interplay between technology and the care redesign process. As the authors mentioned in the Discussion,1 the design and implementation of technology solutions without ongoing evaluation and assessment can lead to inefficient, extraneous systems that might actually worsen care and outcomes.11 At the moment, digital quality improvement is not yet in widespread use and is limited to a few centers with the information technology expertise to support it.13 Widespread adoption of digital quality improvement will require the same three steps as classical quality improvement processes: implementation, sustainability, and dissemination/generalizability.11 So far, most studies implementing anesthesia-related clinical decision support as part of an EHR have used local, custom-made solutions that cannot be easily disseminated. The next frontier for this approach should focus on the same concepts but develop decision support tools using commercially available EHRs from major vendors. This may require daunting amounts of time, money, and technical expertise; yet, clinicians who are pursuing all opportunities to improve patient care should fear a lack of computer involvement rather than computers themselves. While the study presented by Ehrenfeld et al.1 is not a step-by-step blueprint for successful decision support in all hospitals and electronic systems, the authors have shared an excellent example to follow of technological solutions playing an integral role in care redesign and quality improvement initiatives.
Dr. Cannesson is a consultant for Edwards Lifesciences Corp. (Irvine, California), Covidien (Boulder, Colorado), Masimo Corp. (Irvine, California), ConMed (Irvine, California), Philips Medical System (Suresnes, France), and Fresenius Kabi (Sevres, France). Dr. Cannesson is a co-founder of Sironis (Newport Beach, California) and owns patents on closed-loop fluid management and hemodynamic optimization. The other authors declare no competing interests.