“To improve patient safety and productivity, patients and clinicians need a healthcare information ecosystem with integrated technologies that support the clinician’s work, provide safety nets, and improve productivity.”
PATIENTS continue to die needlessly from preventable harm. Although the exact number is unknown, preventable death is likely the third leading cause of death.1 In addition, long-term healthcare spending is consuming the U.S. budget, crowding out investments in kindergarten through 12th grade education and threatening the country’s future economic prosperity. In principle, reducing healthcare costs seems deceptively simple—improve productivity (lower the cost per unit of service) or decrease the volume of services. In reality, it is devilishly difficult. Efforts to restrict health service offerings have stirred strong resistance against “rationing,” driving policy makers to exclude cost analyses from comparative effectiveness research and to cut payments instead, believing that squeezing the balloon will allow some air to escape.
Such an approach may prove a quixotic quest. Cutting payments may deflate the cost per unit of service slightly; improving productivity would deflate it significantly. Yet healthcare productivity has not improved.2 Consequently, cutting payments may worsen quality of care and inflate the cost of care. As payments decrease, hospitals will cut costs: nurses will care for more patients, hospitals will not hire intensivists (or will increase their workload), and quality improvement programs and quality-related staff will be cut. These cuts will harm patients and increase complications, driving up medical care costs.3
Other industries have significantly improved productivity, largely through investments in technology. At Baltimore, Maryland steel companies, for example, 1,500 workers now produce as much steel as 60,000 once did. Health care, like education, will never achieve the productivity gains of industries that require less human service. Still, healthcare productivity can improve significantly through better use of technology.
Health care has invested heavily in technology, yet investments have not improved patient safety or productivity. This is largely because health care leans on the heroism of individual clinicians rather than on safely designed healthcare systems.4 In this essay, we use examples of heroism in health care to outline a path to design safe and productive healthcare systems.5
In the lay media, clinicians are often depicted as heroic figures, working alone to comfort and cure patients. Such a view corresponds well with practicing clinicians’ reality. They use technologies that do not meet their needs, and follow an ever-growing set of well-intentioned regulations and performance measures that consume precious time and frequently add little value.
One major cause of low safety and productivity is the failure to integrate different medical technologies, especially electronic health records and alarms. Critical care clinicians answer a false alarm every 92 s, pulling them away from other patients.6 Often, the least important alarm garners the most attention from clinicians, squandering time and compromising safety and productivity.7
Tragically, there are countless devastating examples of the failure to integrate technology. A 12-yr-old girl died from respiratory arrest as narcotics that were slowing her breathing continued to infuse into her blue, oxygen-deprived veins. She died in large part because the pump infusing the narcotics could not talk to the monitor counting her ever-slowing breaths. If these devices had communicated, the infusion pump would have shut off when her breathing slowed below a critical threshold.
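The safety interlock the girl lacked is conceptually simple. A minimal sketch, assuming hypothetical device interfaces and an illustrative (not clinically validated) respiratory rate threshold:

```python
# Illustrative sketch of a pump-monitor safety interlock.
# The device interfaces and the threshold value are hypothetical,
# not a clinical specification.

RESP_RATE_CUTOFF = 8  # breaths/min; example threshold only


class PcaInterlock:
    """Pauses an opioid infusion when respiration slows below a cutoff."""

    def __init__(self, pump, alarm):
        self.pump = pump      # object exposing pause()
        self.alarm = alarm    # object exposing notify(message)
        self.paused = False

    def on_resp_rate(self, breaths_per_min):
        # Called each time the monitor reports a new respiratory rate.
        if breaths_per_min < RESP_RATE_CUTOFF and not self.paused:
            self.pump.pause()
            self.alarm.notify(
                f"PCA paused: respiratory rate {breaths_per_min}/min "
                f"below cutoff {RESP_RATE_CUTOFF}/min"
            )
            self.paused = True
```

The point is not these few lines of logic but that today's pump and monitor expose no channel over which such logic could run.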
Another patient died from unrecognized sepsis. The proposed diagnostic solution was a paper-based checklist containing nearly 30 data elements. Even though these data elements are routinely in the electronic medical record (EMR), patients depend on busy clinicians to manage and integrate them on paper, then heroically diagnose sepsis. Once diagnosed, we expect clinicians to keep up with the exploding literature, remember the most effective therapies, and monitor whether patients actually receive those therapies quickly. The faster a patient receives antibiotics, the lower the mortality risk. All of this care is practiced with precious little support from the EMR. Our information systems need software that links the EMR with devices to predict patients at risk for harm, recommend effective therapies, ensure patients receive the recommended therapies, and monitor and learn whether patients get well.
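To illustrate how little software such a screen requires once the EMR exposes the data, here is a sketch that counts the standard systemic inflammatory response syndrome (SIRS) criteria. The record field names are hypothetical, and a real screen would need clinical validation:

```python
# Sketch of an automated sepsis screen over EMR data.
# Record field names are hypothetical; the thresholds are the
# standard SIRS criteria, used here only to illustrate automation.

def sirs_criteria_met(record):
    """Count SIRS criteria met; two or more should trigger review."""
    met = 0
    if record["temp_c"] > 38.0 or record["temp_c"] < 36.0:
        met += 1
    if record["heart_rate"] > 90:
        met += 1
    if record["resp_rate"] > 20:
        met += 1
    if record["wbc_per_ul"] > 12_000 or record["wbc_per_ul"] < 4_000:
        met += 1
    return met


def flag_for_sepsis_review(record):
    """Alert clinicians to a patient who merits a sepsis workup."""
    return sirs_criteria_met(record) >= 2
```

Running continuously against live EMR data, such a screen could flag the at-risk patient instead of waiting for a busy clinician to assemble 30 data elements on paper.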
Patients in an intensive care unit (ICU) are at risk for over a dozen types of harm, requiring scores of evidence-based therapies to prevent them.8 Each harm type has multiple therapies and we rely on clinician memory to keep track of every therapy and use them wisely. For example, elevating the patient’s head is one therapy to prevent ventilator-associated pneumonia, but clinicians must guess whether the bed angle is appropriate. Although measuring the elevation with a sensor seems trivial, most ICU beds cannot automatically measure the elevation.
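If beds reported their angle, the check would be trivial. A sketch, assuming a hypothetical sensor reading and the commonly recommended 30-degree minimum head-of-bed elevation:

```python
# Sketch of a head-of-bed elevation check for ventilator-associated
# pneumonia prevention. The sensor and alert interfaces are hypothetical;
# the 30-degree minimum reflects the commonly recommended 30-45 degree range.

HEAD_OF_BED_MIN_DEGREES = 30


def head_of_bed_ok(angle_degrees):
    """True if the measured bed angle meets the minimum elevation."""
    return angle_degrees >= HEAD_OF_BED_MIN_DEGREES


def check_bed(angle_degrees, alert):
    # alert: any callable taking a message, e.g. a nurse-station notifier
    if not head_of_bed_ok(angle_degrees):
        alert(f"Head of bed at {angle_degrees} degrees; "
              f"raise to at least {HEAD_OF_BED_MIN_DEGREES} degrees")
```

The barrier is not the logic but the missing sensor and the missing data channel.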
Another complication of ICU care is lung injury from mechanical ventilation. After 2 decades and 2 billion dollars spent, researchers found that lung-protective ventilation can reduce mortality by 10% among patients with acute lung injury. Recent evidence suggests that lung-protective ventilation reduces mortality risk for all patients on mechanical ventilation.9,10 Yet, clinicians use lung-protective ventilation between 20 and 40% of the time in patients with acute lung injury, likely even less frequently in all ventilated patients.11 To ensure compliance, clinicians must adjust ventilator settings based on the patient’s height, largely relying on memory. Unfortunately, the ventilator does not routinely communicate with the EMR, which contains the patient’s height. If the ventilator and EMR communicated, clinicians and engineers could design a system that notified clinicians when the ventilator failed to use lung-protective ventilation or, better still, automatically adjusted the patient’s ventilator to this setting.
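Such a check is computationally trivial once the ventilator can read the patient's height from the EMR. A sketch using the published ARDSNet predicted body weight formulas and the 6 ml/kg lung-protective target; the 10% alert tolerance is an illustrative assumption:

```python
# Sketch of the check an EMR-connected ventilator could run.
# Predicted body weight uses the published ARDSNet formulas;
# the 6 ml/kg target follows the lung-protective protocol.
# The 10% alert tolerance is an illustrative assumption.

def predicted_body_weight_kg(height_cm, male):
    """ARDSNet predicted body weight from height and sex."""
    base = 50.0 if male else 45.5
    return base + 0.91 * (height_cm - 152.4)


def lung_protective_tidal_volume_ml(height_cm, male, ml_per_kg=6.0):
    """Target tidal volume at 6 ml/kg of predicted body weight."""
    return ml_per_kg * predicted_body_weight_kg(height_cm, male)


def ventilator_alert(set_tidal_volume_ml, height_cm, male, tolerance=0.1):
    """True if the set tidal volume strays more than 10% from target."""
    target = lung_protective_tidal_volume_ml(height_cm, male)
    return abs(set_tidal_volume_ml - target) > tolerance * target
```

For a 170-cm man, predicted body weight is about 66 kg, so a set tidal volume of 600 ml would trigger the alert while 400 ml would not. Today, a clinician must carry this arithmetic in memory because the ventilator never sees the height.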
Many patients suffer an overdose of narcotics through patient-controlled analgesia (PCA) pumps, as in the 12-yr-old girl’s case. Clinicians either fail to identify a patient receiving too much narcotic or mistakenly program the PCA to deliver a higher dose than the prescribed dose coded in the EMR. To defend against the latter error, most hospital policies require that two nurses manually check every PCA order change against the EMR order. In the ICU, we observed PCA orders changed, on average, four times per patient, and it takes 8 to 10 min for one nurse to find another nurse to confirm the orders match. With 20 patients in this ICU, confirming orders relies on heroism and wastes 8 to 10 nursing hours a day, one full-time equivalent of nursing time per unit. Ironically, the PCA pump and the EMR each hold an electronic record of the narcotic dose. If integrated, these devices could automatically, continuously, and reliably confirm matching orders, saving lives and improving productivity.
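A sketch of the automated check, assuming hypothetical field names for the EMR order and the pump's programmed settings:

```python
# Sketch of automatic PCA order verification. The record fields are
# hypothetical; the point is that a linked pump and EMR could run this
# comparison continuously instead of two nurses running it manually.

def pca_orders_match(emr_order, pump_program):
    """Compare the active EMR order against the pump's programmed settings."""
    return (
        emr_order["drug"] == pump_program["drug"]
        and emr_order["dose_mg"] == pump_program["dose_mg"]
        and emr_order["lockout_min"] == pump_program["lockout_min"]
    )
```

A mismatch detected at programming time, rather than at the next nurse double-check, is the difference between a corrected keystroke and an overdose.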
Deep venous thrombosis and pulmonary embolus are largely preventable complications, which kill 100,000 people annually in the United States.12 A team at the Johns Hopkins Hospital increased appropriate prophylaxis for high-risk patients from 30 to 60% with a paper checklist, and then to greater than 90% when the checklist was automated in the EMR.13 Hopkins is installing a new EMR system without this automated deep venous thrombosis and pulmonary embolus decision support tool. When asked, the vendor reported that it was not a priority. Although Hopkins will work with the vendor to rebuild the deep venous thrombosis and pulmonary embolus prevention tool in the new EMR, most vendors do not include robust decision support tools in their EMR systems. Despite substantial investments in EMRs, such systems fail to support patient safety or clinicians, leaving both to rely on memory- and paper-based tools.
Many patients suffer harm from teamwork- and communication-related problems or errors. During rounds at most hospitals with an EMR, clinicians stand behind computers on wheels, each accessing a different information system (laboratory, radiograph, order entry, medical records), limiting their ability to see each other and view a comprehensive picture of patients. Mobile computers are routinely littered with clinician’s paper notes, listing patients on their service and tasks to do, because the technology does not meet their needs.
Ambulatory care is not exempt from the technology pitfalls in hospitals. Patients with multiple chronic diseases, or even one disease, often require scores of therapies. Add to this an explosion of home devices whose data are not integrated into any platform to make self-care easier for patients and, ultimately, physicians. For example, high blood pressure kills 200,000 people each year in the United States.* The EMR does not accept data from home blood pressure devices, forcing patients to bring in stacks of paper with blood pressure readings. Moreover, home monitoring devices are generally not linked to decision support tools to help guide therapy.
In other industries, technology evolved to serve the frontline operator’s needs. Engineers worked with operators to clarify goals, prioritize tasks, and ensure the technology supported their work. For pilots, a cockpit today is much simpler than it was 30 yr ago. Pilots are supported by automation and built-in defenses that make them less prone to error.
Not so in health care. Clinicians are generally given technologies, designed by manufacturers with limited usability testing by clinicians, purchased by managers with little clinician input. These technologies often do not support the goals clinicians are trying to achieve, often hurt rather than help productivity, and have a neutral or negative impact on patient safety. Moreover, the market has not integrated technologies to reduce harm. For example, there are many emerging technologies that limit bacteria growth, a potentially powerful tool to prevent harm from health care–acquired infections. Yet, the market has not helped integrate systems, or designed an ICU that prevents all patient harm, optimizes patient outcomes and experience, and reduces wastefulness.
Hopkins recently built a new hospital, seeking to incorporate the best technology. Although the building facade is elegant, the ICU technology functions nearly the same as 30 yr ago. The ICUs are packed with more, potentially helpful, disconnected devices that do not communicate, that increase rather than decrease the risk for errors, and that depend heavily on heroic efforts by staff, usually reducing rather than improving productivity.
Health care has seen what is possible when it designs safe systems. After an 18-month-old girl died from a catheter-related bloodstream infection, clinicians at Hopkins eliminated these infections by using a checklist and other technologies, by improving teamwork, and by measuring results.14 The Hopkins teams, working with the Michigan Hospital Association, replicated the results in over 100 Michigan ICUs, saving lives, saving money, and restoring joy among clinicians.15–17 Vendors played a large role in this success, making kits containing all the checklist items (in particular, chlorhexidine, a highly effective antibacterial soap) for central line insertion. Within several months, the share of hospitals using central line kits containing all the checklist items grew from 20% to 100%. Checklist compliance increased, reducing infections and demonstrating, in a simple way, the power of designing systems for safety.
Although this success is laudable, it targeted one type of harm, while patients are at risk for over a dozen different harms. Yet, current efforts that work on one harm at a time, using a paper-based checklist rather than an automated checklist, are burdensome to clinicians. As a result, hospitals work on only a few harms, leaving patients susceptible to the remaining harms.
To improve patient safety and productivity, patients and clinicians need a healthcare information ecosystem with integrated technologies that support the clinician’s work, provide safety nets, and improve productivity. Research is underway to produce a prototype of a safer ICU. Although we are starting in the ICU, this approach applies to all care areas such as the operating room, ambulatory clinics, and the rapidly growing personal health monitoring industry.
Although this work builds upon previous improvement efforts,15,18 it differs in several important ways. First, it includes patient harm from lack of dignity and respect, a harm that is every bit as real and important as an infection. Second, it works to reduce all harms, bringing together patients, clinicians, engineers, administrators, and researchers. Third, it seeks to design healthcare systems that clinicians can rely on to help improve safety and productivity.
To design such healthcare systems will require a transdisciplinary approach, in which different disciplines collaborate to solve a common problem using an integrated model.19 It draws upon engineering, clinical epidemiology, ethics, psychology, sociology, anthropology, economics, human factors engineering, and informatics. It integrates patients and patient-centered care, the care team, work processes, tools and technologies, the environment, and learning and accountability.
Such a system is possible today, and technology is not a barrier. Engineers could accommodate our needs if healthcare leadership and private sector companies agreed to share data. The Food and Drug Administration will be a vital partner in making this type of system a reality, balancing the need to ensure patient safety against the burden of obtaining regulatory approval. Policymakers can also accelerate this migration from heroism to safe design by recognizing that the data belong to patients, not vendors, and by requiring that health information technology companies provide Application Program Interfaces. These interfaces make it easier to write applications that predict patient risks, recommend effective therapies, monitor whether patients received those therapies, and learn what worked and what did not. Provider organizations also have a role. These organizations buy technologies that do not communicate, as if Boeing were to keep partnering with a landing gear manufacturer that refused to incorporate a signal to the cockpit informing the pilot whether the landing gear was up or down. Healthcare provider organizations need to exert their collective market pressure and require that the technologies they purchase share data.
We now have hope. In January 2013, nine health technology company chief executive officers pledged to share patient data with clinicians and their patients. The pledge was presented to former President Clinton at the inaugural Patient Safety, Science and Technology Summit.† Over the last decade, health care has struggled to improve safety and productivity. Through collaborations among healthcare providers and the private sector, health care can move from a system based on heroism to one based on safe design.
The authors thank the Gordon and Betty Moore Foundation, Palo Alto, California, which is supporting research to design an ideal intensive care unit, and informed the current essay; and Christine G. Holzmueller, B.L.A., Armstrong Institute for Patient Safety and Quality, Johns Hopkins Medicine, and Department of Anesthesiology and Critical Care Medicine, The Johns Hopkins University, Baltimore, Maryland, for editing the article.
Dr. Pronovost reports receiving grant or contract support from the Agency for Healthcare Research and Quality, Rockville, Maryland, and the Gordon and Betty Moore Foundation, Palo Alto, California (research related to patient safety and quality of care), and the National Institutes of Health, Bethesda, Maryland (acute lung injury research); consulting fees from the Association of Professionals in Infection Control and Epidemiology, Inc., Washington, D.C.; honoraria from various hospitals, health systems, and The Leigh Bureau, Somerville, New Jersey, to speak on quality and patient safety; book royalties from the Penguin Group, New York, New York; and board membership to the Cantel Medical Group, Little Falls, New Jersey. Dr. Bo-Linn is employed by the Gordon and Betty Moore Foundation. Dr. Sapirstein reports receiving grant support from the Gordon and Betty Moore Foundation.
Vital signs: Preventable deaths from heart disease & stroke. September 3, 2013, Centers for Disease Control and Prevention Web page. Available at: http://www.cdc.gov/dhdsp/vital_signs.htm. Accessed October 3, 2013.
The Patient Safety Science & Technology Movement, 2013 Summit Overview. Available at: http://www.patientsafetysummit.org/2013/. Accessed October 2, 2013.