Automated medical technology is becoming an integral part of routine anesthetic practice. Automated technologies can improve patient safety, but may create new workflows with potentially surprising adverse consequences and cognitive errors that must be addressed before these technologies are adopted into clinical practice. Industries such as aviation and nuclear power have developed techniques to mitigate the unintended consequences of automation, including automation bias, skill loss, and system failures. In order to maximize the benefits of automated technology, clinicians should receive training in human–system interaction including topics such as vigilance, management of system failures, and maintaining manual skills. Medical device manufacturers now evaluate usability of equipment using the principles of human performance and should be encouraged to develop comprehensive training materials that describe possible system failures. Additional research in human–system interaction can improve the ways in which automated medical devices communicate with clinicians. These steps will help to ensure that medical practitioners can use these new devices effectively, remain ready to assume manual control when necessary, and are prepared for a future that includes automated health care.

Advanced medical technology is routinely used in the practice of medicine and automated medical devices are beginning to appear in the clinical environment. At first glance, automated systems are a powerful tool that physicians can use to prevent human error and improve patient care. Computers do not become fatigued or distracted and they recall information almost instantaneously. For example, closed-loop drug delivery systems may be safer and more accurate than physicians who administer drugs by hand.1,2  Electronic health record systems provide automated sepsis alerts and clinical recommendations based upon laboratory values and vital sign measurements. Recent studies describe robot-assisted intubation,3  robotic transesophageal echocardiography,4  and experimental oral surgery that may be performed without human intervention in the future.5  These advances portend a future in which patient care will be largely automated, with humans supervising autonomous medical devices.

The adoption of automated systems in the clinical environment poses challenges that must be addressed in order to maintain and improve patient safety. Medical devices have become increasingly complex and can fail in unexpected ways. Systems such as physiologic monitors, anesthesia machines, and intensive care unit (ICU) ventilators are controlled by software that is designed by biomedical engineers but managed by the clinician. The introduction of automated medical devices may cause previously unanticipated changes in workflow as the clinician is now expected to monitor for failures instead of performing the task him- or herself. Skill at performing a task manually will atrophy as practitioners become reliant upon an automated system; as a result, clinicians who rely heavily on automation may be less prepared to manage system failures.6  In this article, we will discuss how automated medical technology affects the practice of anesthesia. We will also describe how other industries have dealt with the unintended consequences of imperfect automation, including under- or overreliance, degradation of manual skills, loss of trust in the automation, and management of system failures.7  We will then offer practical advice on how to use this understanding to improve patient safety.

The Boeing 737 Max 8 (The Boeing Company, USA) mishaps are perhaps the most striking recent example of how the malfunction of a new automated system combined with inadequate training can lead to a tragedy. The Max 8 had a tendency to pitch up when power was applied because its larger engines were mounted farther forward than in previous models. A software workaround (Maneuvering Characteristics Augmentation System) was created that would automatically lower the nose if the airplane appeared to be getting dangerously slow. This critical information was provided to the software by a single sensor that is prone to failure. Pilots who transitioned to this airplane were not informed about the aerodynamic problem with the airplane or the software that was designed to address it. Still, the airplane was certified and entered commercial service. During two flights (Lion Air Flight 610 and Ethiopian Airlines Flight 302), failure of the sensor caused the system to abruptly pitch the airplane down shortly after takeoff. The pilots recognized the problem with the flight path, but were not able to trace the failure to the Maneuvering Characteristics Augmentation System automation in time. Both airplanes crashed, causing the deaths of 346 people.

The high level of safety in commercial aviation depends partly on pilots being trained to recognize and respond to problems with automation. In these instances, however, the unique circumstances of the sensor failure, combined with pilots’ unfamiliarity with the system, led to disaster. During its investigation, the U.S. National Transportation Safety Board discovered significant flaws in the design process, the evaluation of the technology, and pilot training. At present, the entire fleet remains grounded.8  This is just one example of how errors in design, implementation, and training can result in the catastrophic failure of automated systems.

The Anesthesia Patient Safety Foundation has defined advanced medical technology as “medical devices and software systems that are complex, provide critical patient data, or that directly implement pharmacologic or life-support processes whereby inadvertent misuse or use error could present a known probability of patient harm.”9  Advanced medical technology that includes automation can make clinical care more efficient and improve patient safety because machines can accomplish many tasks more efficiently than humans. Machines never become bored or tired, nor are they biased or delayed by emotional responses to a critical event. Machines may be more specific and sensitive than humans in detecting subtle changes in a patient’s status.10 

Perhaps the most obvious example of how medical technology has evolved in the practice of anesthesia is the anesthesia machine itself. Anesthesia machines began as a simple means of delivering a mixture of compressed gases and volatile anesthetics to the patient. Typically, a simple bellows ventilator might have been driven by compressed oxygen, regulated by valves that were electrically or pneumatically triggered. The correct direction of the flow of gases in the circle breathing system was ensured by mechanical one-way valves, visible under clear plastic caps. The simplicity of the machine and the fact that all its working parts could be seen were considered to be important components of its safety.11 

As the anesthesia machine has become more complex, however, many of its formerly visible components have been hidden or replicated with software. The Draeger Perseus A500 (Draeger, Inc., USA) is an example of the latest generation of anesthesia workstations. It uses an electronically controlled turbine to ventilate the patient. The system is controlled by the clinician with a touch screen interface that has several layers of menus. Because the turbine is nearly silent, artificial breath sounds are generated and played through a loudspeaker to provide auditory feedback to the clinician. Most of the components, including the one-way valves, are hidden from the user. Although this design confers many advantages, including advanced modes of ventilation, some aspects of the system’s operation may be difficult for the user to understand. This may make it more difficult for the clinician to recognize and troubleshoot malfunctions in the machine.

Automation is defined as a machine that either carries out or augments a function that was previously performed by a human.12  Automated systems can be significantly faster and more efficient than humans at many tasks. Automation has already become part of daily medical practice and will only become more prevalent in the future. Ventilators, medication administration systems (i.e., infusion pumps), and diagnostic equipment employ varying levels of automation.13,14  One recent report describes a deep learning algorithm that is better able to predict the response to propofol and remifentanil infusions than established pharmacokinetic models.15  Closed-loop control of anesthetic agents, fluids, and ventilation has been found to produce better neurocognitive outcomes than manual control of anesthesia.16  Another recently published study demonstrated that computers can triage chest radiographs in real time.17  These developments are only the first examples of how advances in technology will expand the potential for intelligent medical devices. As sensors and algorithms become more sophisticated, it is possible that machines may one day be able to evaluate and treat patients and perform procedures autonomously while under human supervision.

Introduction of automated medical devices will not eliminate the need for human operators, but the nature of their work may be changed in ways that are difficult to anticipate.12  Automation does not necessarily make a process more reliable; it may replace operator error with design error.18  Clinicians must therefore trust automated systems to perform correctly while maintaining vigilance for rare but potentially catastrophic failures. Anesthesia professionals who use these systems can improve patient safety by understanding how automated systems work and the unique challenges imposed by working with machines that use sophisticated algorithms to perform patient care. For example, operators in industries such as transportation and nuclear power routinely train for system failures (when a device stops working or hands operation back to the user) or automation surprises (in which a device takes an action that is unexpected by the user). Few devices that employ automated medical technology currently approach the high levels of automation seen in other industries, but any medical device can fail in unexpected ways, requiring the clinician to quickly intervene to prevent a poor outcome. Training for clinicians should include techniques similar to those used in the aviation and nuclear power industries and can be guided by the already established science of human systems integration.

The introduction of automated medical technology into clinical practice introduces the potential for errors that can be caused by the device, the clinician, or the human–machine interface. In addition to system malfunctions, failures of medical devices can be caused by several factors: the user may have programed the system incorrectly; the process may have been designed incorrectly; or a component within the device may fail. Whenever a person interacts with a highly automated system, the original workflow is replaced by new tasks that include supervising the device and troubleshooting associated problems. These new skillsets must be acquired through training.19  For example, nearly all passenger airplanes can be flown by the autopilot from shortly after takeoff through approach and landing, causing the role of airline pilots to evolve into that of a system supervisor.20–24  In addition to learning hand-flying skills, pilots must now be trained to monitor the automated systems and to take over control if necessary.25  A similar evolution in the skills of healthcare providers may be necessary with the widespread adoption of automated medical technology.

One example of how human error, malfunction of an automated system, and skill atrophy can interact to cause a disaster is the Air France Flight 447 crash. While flying through an area of thunderstorms, one of the airplane’s sensors became covered with ice, causing a malfunction of the airplane’s airspeed indicator. As a result, the autopilot unexpectedly handed control back to the flight crew. The flight management system reverted to a mode called “alternate law” in which many of the protections built into the flight management software are disabled. The flight crew was unprepared to hand-fly the airplane and no longer understood which functions were automated and which required manual control. The crew had not been trained to manage a partial automation failure combined with an airspeed well below the normal bounds of operation. Their hand-flying skills were rusty from disuse. Confusing messages on the electronic centralized aircraft monitor further impaired the flight crew’s ability to regain control of the aircraft, as did confusion as to who was actually flying the airplane.26  As a result, a flyable airplane crashed into the Atlantic Ocean, killing everyone aboard.

Well-designed systems can improve patient care when used by clinicians who have been properly trained. For example, computerized provider order entry and clinical decision support systems commonly offer automated medication recommendations. These systems can help to decrease medication prescribing errors and reduce mortality in the ICU.27  Commercially available closed-loop ventilators automatically adjust parameters in order to ventilate patients with lower pressure, volume, and Fio2 than conventional ventilators.13  Unintended consequences and new errors can, however, arise from the implementation of automated systems. These may be related to design flaws, new workflows, inadequate training, or problems with the human–system interface. In one example, a recent study on computerized provider order entry prescribing errors in elderly patients found that the majority (96%) were due to human–machine interactions.28 

Interaction with an automated system is affected by a clinician’s experience and confidence in his or her skills. Clinicians who have less experience and confidence in their task may overly rely on automation, whereas those with more experience may choose to bypass the automated technology altogether. Less experienced physicians are more likely to change their decisions based on automated prompts when using computerized decision support systems.29 

Varying levels of automation complicate the human–machine interface. Few automated systems require an all-or-nothing choice in which either the human or the machine exclusively performs a given task; most operate at an intermediate level of automation. For example, a self-driving car could recommend a change in the planned route, but the human driver would be required to acknowledge that road conditions were appropriate before the car would change direction. In medicine, an electronic health record can offer a drug recommendation, but it then requires that the clinician choose whether to accept it. Removing an operator completely from an automated task decreases performance recovery when an automated system fails.30  An intermediate level of automation maintains operator involvement, may improve situation awareness, and can reduce the risk of performance impairment. Incorporating this intermediate level of automation in medical equipment may therefore improve safety. The user interface should ideally be designed to help the operator to maintain situation awareness in order to detect failures early and facilitate troubleshooting.
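As a rough illustration of this confirm-before-act pattern, the sketch below keeps the clinician in the loop before any automated suggestion takes effect. All names, fields, and values are hypothetical and are not drawn from any specific electronic health record or decision support product.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Recommendation:
    drug: str
    dose_mg: float
    rationale: str  # the "why" shown to the clinician supports situation awareness

def apply_recommendation(rec: Recommendation,
                         clinician_confirms: Callable[[Recommendation], bool]) -> bool:
    """Intermediate automation: the system proposes, the clinician disposes.

    clinician_confirms stands in for whatever interface presents the
    recommendation and its rationale; the order proceeds only on explicit
    acceptance, keeping the human in the loop.
    """
    return clinician_confirms(rec)

# Hypothetical use: present the suggestion and require an explicit acceptance.
suggestion = Recommendation("cefazolin", 2000, "routine surgical antibiotic prophylaxis")
ordered = apply_recommendation(suggestion, lambda r: True)  # True only after clinician review
```

The design choice here is simply that the machine supplies the suggestion and its rationale while the human retains the final decision, mirroring the intermediate level of automation described above.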

The electronic health record is one example of how automated medical technology has been introduced into clinical practice. It has improved many aspects of patient care, increasing the legibility of records, providing clinical decision support, and allowing patient data to be aggregated for research. Poorly designed or implemented electronic health records can, however, be a source of distraction, increased workload, and error.31,32  Some studies suggest that the alerts provided by electronic health records may not improve care. For example, at least one systematic review failed to find evidence that sepsis alerts improved measures of treatment.33  Large numbers of alerts with low positive predictive value, especially if they are for low-stakes problems, cause alert fatigue and may be ignored by practitioners.34  Alerts from electronic health records related to laboratory results, medication refills, and other reminders cause an increase in workload and may be irrelevant in the operating room. Repeated alerts, especially for the same patient, cause a decrease in physician response and increase the number of overrides.34  In contrast, overreliance upon a clinical decision support system increases the risks of both failing to detect prescribing errors and accepting false-positive alerts (thus prescribing the wrong medication).35  Strategies to mitigate these problems are currently under investigation. In order to relieve cognitive stress and burnout in physicians using electronic health records, for example, Gregory et al. recommend protected time for alert management, fewer alerts, and other electronic health record improvements.36  The performance of clinical alerts can be improved by disabling low-stakes alerts and by changing the criteria for an alert to increase its relevancy.37  This suggests that the ability to temporarily disable most electronic health record alerts during critical periods (e.g., while the patient is in the operating room) may improve safety.
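As a hedged sketch of the alert-management strategy described above, a simple filter might defer low-stakes alerts while the patient is in the operating room. The policy, field names, and categories are invented for illustration and are not taken from any electronic health record.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    message: str
    priority: str   # "high", "medium", or "low"
    critical: bool  # True for alerts that may indicate immediate patient harm

def should_display(alert: Alert, patient_location: str) -> bool:
    """Hypothetical intraoperative alert policy.

    High-priority or safety-critical alerts are always shown; routine
    reminders (medication refills, non-urgent laboratory flags) are deferred
    while the patient is in the operating room to reduce alert fatigue.
    """
    if alert.critical or alert.priority == "high":
        return True
    return patient_location != "operating_room"

# Example: a refill reminder is deferred intraoperatively; a critical alert is not.
print(should_display(Alert("Medication refill due", "low", False), "operating_room"))  # False
print(should_display(Alert("Severe hyperkalemia", "high", True), "operating_room"))    # True
```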

Automated systems can lead to skill atrophy, trust failure, system failures, automation surprise, mode confusion, automation bias, and boredom, all of which can impair a clinician’s ability to safely use this technology. The challenges posed by these problems can, however, be mitigated by understanding why they occur and how clinicians can be prepared to manage them.

Skill Atrophy

Overuse of automation may result in a loss of manual skills.38–41  Overuse may be caused by the operator placing high levels of trust in the automated system,42  a tendency to become overreliant on the system,12  or complacency.43  Although the effects of automation on the loss of clinical skills have yet to be investigated, some authors have expressed concern that surgeons whose practice consists primarily of minimally invasive procedures may lose their ability to convert to an open procedure if necessary.44  Neurology residents may be losing their ability to conduct detailed neurologic examinations because they now rely upon advanced diagnostic imaging to make a diagnosis. As a result, residency training programs in neurology are beginning to discuss inclusion of the physical examination in their curricula so that clinicians can learn and maintain this important skill.45  General surgical residents are also facing this problem: as they perform an increasing number of laparoscopic and robotic procedures, they may ultimately lose the technical skills to safely perform an open surgical procedure.46 

Trust Failure

An operator who does not trust an automated system will be reluctant to use it.12,47  The mid-air collision of Bashkirian Airlines Flight 2937, a Tupolev Tu-154 (Tupolev, Russia), and DHL Flight 611, a Boeing 757, over Überlingen, a small town near the Swiss border, in 2002 was the result of multiple factors, including lack of vigilance by air traffic control, poor staffing, and conflicting regulations. The two aircraft were on a collision course and were about to lose separation, but this was not immediately noticed by the controller responsible for both aircraft. The controller mistakenly commanded the Tupolev to descend while the Traffic Collision Avoidance System in the Tupolev issued a “climb” instruction. The captain, who did not trust the automated system and was possibly unaware that the Boeing crew was receiving a complementary instruction that would have avoided the collision, elected to descend. The two airplanes collided in mid-air, killing 71 people.48 

Successful human–machine interaction relies heavily on the automation performing as expected and the operator’s understanding of what the system is doing. A clinician may lose trust in a system that is unreliable and will therefore be less likely to use it. For example, clinicians may silence or disable alarms that have high false positive rates, preventing them from successfully averting unwanted outcomes signified by the few true positives for that alarm.12,49  Similarly, if a clinician does not trust a clinical decision support tool, he or she may ignore or override suggestions, decreasing the benefits that automation can provide in patient care.

In both emergency medical transportation50  and aviation,51–53  negative perceptions of automation can also degrade the user’s trust in the system. Human operators become reluctant to use a system that is plagued by automation failures and false alarms.54–56  Moreover, a failure of one automated system can lead to decreased trust in a similar system that is working properly.57,58 System-wide trust failure describes the resultant loss of trust that impairs the human–automation interaction.59–62  As automation is increasingly incorporated into clinical practice, our specialty must take the lead in learning how to effectively use automated clinical systems while encouraging the public to trust new medical technology. This will also include addressing patients’ apprehension when a new medical technology is introduced.

System Failures

Failure of an automated system may be caused either by malfunction of the device or by operator error and may suddenly increase a clinician’s workload.63  For example, a ventilator used in many ICUs was recently recalled when a life-threatening software malfunction was discovered. Ventilators with the defective software would suddenly stop ventilating the patient and discontinue supplemental oxygen. The error message indicating that this failure had occurred was “panel connection lost.” In order to prevent patient harm, ICU staff were required to recognize this problem immediately and convert to an alternative mode of ventilation.64  Even if the clinician has maintained his or her skills, he or she may become overreliant on an automated device and fail to notice a problem.65  (Fortunately, the aforementioned ventilator sounded a high-priority alarm when it failed.)

The operator’s trust in an automated system may cause him or her to allow an increase in the number of distractions66  or cause undesirable behavioral changes,67–70  such as a willingness to accept greater risk.71,72  For example, a clinician may become distracted while entering data into an electronic health record system and fail to notice an unrelated event such as surgical bleeding. A clinician may also become distracted while programming an infusion pump and inadvertently select an incorrect drug or infusion rate. This may become apparent only when there is a significant change in the patient’s vital signs. Managing a patient who has become unstable while troubleshooting the infusion pump may then lead to task saturation, further impairing the clinician’s ability to solve the underlying problem.

Discrete tasks, such as ventilating a patient at a specific rate and tidal volume, can be performed more efficiently by automated systems than by humans. When a machine performs a task, however, the feedback received by the user is different than when the user performs the task him- or herself.41,73  For example, if the clinician formerly stood next to the patient while providing care but now operates a system from a console that is remote from the patient (as happens during robotic surgery), the fundamental nature of the task has changed.74  The physical location is different and the signs and signals that the clinician receives also change. This can also occur when an anesthesia professional is caring for a patient in a remote location such as a magnetic resonance imaging or radiation therapy suite: The clinician has only the physiologic monitors to rely upon and has no direct access to the patient.

A clinician who is unaware of how a medical device can fail might place too much trust in the system and have difficulty assuming control if necessary. Providing information about what the device is doing and why will allow the clinician to understand how an automated system functions and to detect a degradation in the system’s performance more rapidly. Human–machine interactions can be improved by increasing the transparency of the system, in which the medical device provides information to the user about why it took a specific course of action.75  For example, an ICU ventilator that employs closed loop control might highlight the parameters that it is using to wean a patient. This would improve the clinician’s ability to resume manual control when a system failure occurs.76  This information should be solicited from the device manufacturer or may be gleaned through a careful review of the product manual. All clinicians should receive generalized training in how to manage automation, and before using medical devices in patient care, trainees and experienced physicians alike should receive instruction on how each device can fail and how to manage these malfunctions.

Automation Surprise and Mode Confusion

When an automated system malfunctions, the human operator, who may not have been monitoring the automated system closely, must quickly identify the failure, assess the situation, and resume manual control.38,77 Automation surprise occurs when a machine performs an action that the operator did not expect.78 Mode confusion occurs when the operator does not understand the automated system’s current state, either because of a lapse in supervision of the system or because of a poor human–system interface.79  For example, a ventilator may be set to pressure support mode, but the operator believes that it is delivering synchronized intermittent mandatory ventilation. Mode confusion may prevent the clinician from fully understanding what the device is doing if a critical event occurs. For example, on some anesthesia machines, changing the respiratory rate without adjusting the inspiratory time will change the inspiratory:expiratory ratio. This may in turn cause an abnormal pressure or capnography waveform that the clinician may not expect if he or she is not familiar with this piece of equipment. The clinician’s workload has suddenly increased, possibly causing a startle response, which further impairs his or her performance while searching for the underlying cause.25  The resulting cacophony of alarms and alerts from the ventilator (alarm flood)49  may also increase the clinician’s confusion without helping him or her to better understand the etiology of the problem. Subtle differences in the user interface may also increase the risk of mode confusion. (fig. 1)
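The inspiratory:expiratory ratio change mentioned above follows from simple arithmetic. A minimal sketch with hypothetical ventilator settings, assuming a mode in which the inspiratory time is set directly, shows how raising the rate alone shortens expiration:

```python
def ie_ratio(respiratory_rate: float, inspiratory_time_s: float) -> float:
    """Return the E component of a 1:E inspiratory:expiratory ratio."""
    cycle_time_s = 60.0 / respiratory_rate        # seconds available per breath
    expiratory_time_s = cycle_time_s - inspiratory_time_s
    return expiratory_time_s / inspiratory_time_s

# With inspiratory time held fixed at 1.7 s (hypothetical settings):
print(round(ie_ratio(10, 1.7), 2))  # 2.53 -> roughly 1:2.5 at 10 breaths/min
print(round(ie_ratio(15, 1.7), 2))  # 1.35 -> roughly 1:1.4 at 15 breaths/min
```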

Fig. 1.

An example of how design can increase the risk for mode confusion. A clinician who is accustomed to using anesthesia machine A may inadvertently initiate the power off sequence while attempting to silence the alarms when using anesthesia machine B.

Automation surprises and mode confusion may initially be difficult to detect and manage. Depending upon the nature of the malfunction, a human operator may have little time to intervene. The clinician may initially react to the failure with a startle response that may impair his or her ability to manage the problem.25  If the clinician is unable to determine the reason for the malfunction, the best solution may be to revert to the lowest level of automation possible (e.g., manual ventilation), disconnecting the device in question from the patient if possible. Training the clinician in the use of the equipment, its failure modes, and monitoring of the automation may decrease the risk of automation surprise.41,73,78,80–82  Training curricula should be revised to prepare clinicians for system failures by including unpredictability and device malfunctions in simulation training, and by teaching clinicians metacognitive skills.25,83  Such training would help clinicians to engage more effectively with device interfaces, better maintain situation and mode awareness, and restore the automated device to its intended clinical function. Human-systems integration research will facilitate the development of new displays that will help clinicians to understand what an automated medical device is doing and diagnose system failures more quickly.

Automation Bias

Physicians who rely too much on an automated system (e.g., a clinical decision support system) may develop automation bias, in which the user prioritizes suggestions from the automated system while disregarding contradictory information from other sources.6  This effect typically occurs when a clinician must accomplish multiple tasks and when manual tasks compete with the automated task for attention, as might happen in a busy clinic or while caring for a sick patient in the ICU. Automation bias may cause errors of commission, in which users implement incorrect recommendations, and omission, in which users fail to recognize a problem because they were not notified by the automated system.84  Automation bias may lead a clinician to rely excessively on alerts, prescribing medications only when they are suggested by the clinical decision support or computerized provider order entry. Clinicians may also accept automated recommendations for treatment even when they are incorrect. One study concluded that providers commit 58.8% fewer errors when provided with correct computer decision support and 86.6% more errors when this support is incorrect.85  Automation bias seems to be exacerbated by multitasking and by increasing cognitive load. Although no easy solutions currently exist, the best recommendations include reducing workload and distractions (perhaps by asking other personnel to perform noncritical clinical tasks). Device manufacturers and personnel responsible for designing electronic health record user interfaces can decrease the risk by presenting verification information with the clinical recommendation.86 

Boredom and Vigilance

The seal of the American Society of Anesthesiologists features the word “vigilance.” Much of anesthesia practice is a vigilance task during which the provider monitors vital signs and the surgical procedure in anticipation of a change in patient status.87  Clinicians must maintain vigilance for extended periods of time in order to detect relatively rare, but critical, events. Sustained attention for long periods of time causes cognitive fatigue; focusing attention on a monotonous task ultimately causes degradation of performance over time.88  In boring environments with a low task load, operators may find other tasks to help maintain some level of attention, and possibly to stay awake.89  Rest breaks and secondary tasks decrease monotony and improve vigilance when used correctly.90  Although vigilance tasks have been historically considered to be unstimulating and not mentally demanding, it is now understood that vigilance tasks produce a high level of subjective workload and cognitive stress.91  Warm et al. have suggested that these factors should be considered when designing environments and tasks that require high levels of vigilance.92 

As a greater number of tasks become automated, clinicians may become bored and tempted to engage in ancillary activities while caring for a patient. Boredom in the workplace has been reported in occupations including unmanned aerial vehicle operation, train driving, and commercial flight operations, and has been directly responsible for mishaps and near misses in aviation.93  Operators who function in an environment without manual tasks may experience mind wandering and complacency after a period as short as 20 min.94  Unfortunately, there is no simple solution to alleviating vigilance decrement and boredom.94  To combat boredom in the operating room, physicians may engage in activities unrelated to patient management such as viewing Web sites or engaging in activities on a smart phone.95  These tasks may have the beneficial effect of helping a clinician to maintain some level of attention or (especially late at night) simply to stay awake,89  but may be considered unprofessional. One potential solution may be to engage in an active scan (see Recommendations, below) and to remain engaged with the surgical team.

Research on automation spans multiple fields, including aviation, autonomous ground and sea vehicles, and medical robotics. The aviation industry has spent decades focusing on best practices for safety amid an influx of new automation in the cockpit. Given its broad experience with automation, the commercial aviation industry may offer some of the best examples of how to balance technology with safety. These experiences may also be applicable to the adoption of automation in healthcare.

One important lesson that can be drawn from aviation is that clinicians should receive training that will guide their interactions with automated medical devices. Airlines and corporate flight departments incorporate management of automation into their initial and recurrent training. Line-oriented flight training uses realistic, scenario-based exercises to address real-world problems that are likely to occur during flight.96  Each line-oriented flight training scenario forces the pilot to work through a specific problem with human–computer interaction, automation surprise, complacency, or situational awareness.97  Anesthesia professionals may likewise benefit from similar training that incorporates a challenge related to automation in the context of system management, teamwork, and decision making. Additional research can help to develop educational programs, possibly by creating scenarios in which clinicians must manage an automation failure or automation surprise while simultaneously treating an unrelated problem.

The aviation industry in the United States is controlled and monitored by the U.S. Federal Aviation Administration and U.S. National Transportation Safety Board, both of which can rapidly address new threats to safety. The U.S. Federal Aviation Administration has the authority to require compliance with regulations that affect automation in the cockpit. One example is the Traffic Collision Avoidance System. This device is located in each commercial airplane and is a last resort for avoiding a collision.47  If the Traffic Collision Avoidance System detects that two airplanes are on a collision course, it immediately issues a resolution advisory (e.g., instructing one pilot to descend while simultaneously instructing the other pilot to climb). When this system was first introduced, the large number of false alarms caused pilots to ignore the alerts.98  The U.S. Federal Aviation Administration quickly mandated Traffic Collision Avoidance System use, however, requiring that pilots follow the resolution advisory. Although some false alarms still occur, the Traffic Collision Avoidance System is believed to be responsible for significant improvements in airspace safety.99  In health care, it would be reasonable to expect compliance with the recommendations of a similar decision support system if it had a high predictive value for a critical event. In order to achieve this, however, the system must be highly reliable and broadly implemented with adequate training. The challenge of developing a series of uniform standards in an industry with a patchwork of regulatory agencies is best illustrated by the complicated history of driverless ground vehicles.

Automatic anticollision systems are highly effective in driverless ground vehicles.100  While much of the public is not yet ready or willing to ride in completely autonomous vehicles,51,101  the technology will eventually change from automation that assists the driver to automation that replaces the driver,102  particularly after the public has become more aware of the potential safety improvements.103  Human drivers are currently required to monitor the automation and intervene when something goes wrong, with warnings and alerts that allow the driver to override the automation. The automobile industry has not been able to develop a uniform standard, however, and in the absence of national regulations similar to those that govern aviation, many competing approaches to this problem confuse the driving public. This highlights the need to develop standards that apply across platforms and medical specialties, and to avoid the patchwork quilt of automation that typifies electronic health record implementation.

In health care, the International Electrotechnical Commission technical standard 62366 defines a process by which medical device manufacturers can evaluate the usability of a piece of equipment. Usability engineering, combined with information gleaned from adverse event databases, can help to improve patient safety by identifying user errors.104  This human performance evaluation allows the manufacturer to identify and mitigate risks associated with both correct and incorrect use of the device. Development of international standards, especially in the application of human factors to medical equipment design, can potentially improve the safety of automated medical devices.105  These standards will help facilitate uniform adoption, as well as monitoring of automated systems, so that lessons learned can be disseminated throughout the industry.

Education is the first step toward the safe use of automated medical technology. The World Federation of Societies of Anaesthesiologists has recently published a position statement that highly recommends training in the use and safety of equipment and suggests formal certification and documentation of this training.106  The Anesthesia Patient Safety Foundation also recommends that clinicians should be formally trained to use new equipment and should be required to demonstrate that they can consistently use medical devices safely and effectively.9  Anesthesia professionals should also receive ongoing training as software is upgraded or new features are added. To mitigate the risks of automation failure, training should include both routine operation and management of system failures. Including unpredictable scenarios or introducing variability into a scenario may improve the ability of a trainee to manage unexpected problems.107  For example, one way to accomplish this might be to include a failure of a monitor or the anesthesia machine during a simulation of malignant hyperthermia. Simulation instructors can facilitate preparation for mode confusion, automation surprises, and malfunctions by adding unexpected equipment failures to their scenarios. Formal training on the use of advanced medical technology and automated devices will become increasingly important as these technologies are added to the environment in which we work. This training should be provided, documented, and possibly required by healthcare institutions.

Clinicians can employ an active scan in order to maintain vigilance and better monitor medical devices. When performing an active scan, the clinician observes the indications on each medical device and activity in the operating room (e.g., the surgical field), moving in an orderly pattern from one to the next. (fig. 2) The information that each device displays is then crosschecked with information from other sources, which will also help the clinician to detect an artifact. For example, the heart rate derived from the electrocardiogram can be compared to the heart rate from the pulse oximeter. This is then compared to an arterial blood pressure tracing. A significant disparity might indicate that one monitor is malfunctioning, or that a physiologic change requires investigation (e.g., electrical activity on the electrocardiogram but no tracing on an arterial blood pressure waveform). Although the benefits of an active scan have not been studied in health care, variations in gaze patterns have been shown to affect the ability of airline pilots to control an airplane during approach and landing.108 
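As one hedged illustration of the cross-check step in the active scan, a disparity check across independent heart rate sources might look like the sketch below. The tolerance and the example values are hypothetical and for illustration only.

```python
def heart_rates_agree(ecg_hr: float, pleth_hr: float, arterial_hr: float,
                      tolerance_bpm: float = 10.0) -> bool:
    """Cross-check the heart rate reported by three independent monitors.

    A large disparity suggests either an artifact on one monitor or a
    physiologic change (e.g., pulseless electrical activity) that needs
    immediate investigation. The tolerance is illustrative only.
    """
    rates = (ecg_hr, pleth_hr, arterial_hr)
    return max(rates) - min(rates) <= tolerance_bpm

# Example: ECG 72, pulse oximeter 71, arterial line 35 -> flag the disparity.
print(heart_rates_agree(72, 71, 35))  # False
```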

Fig. 2.

An active scan. The clinician starts by observing the surgical field (not shown) and then observes each device in the order suggested by the arrows. Information on each device is cross-checked with the others. (e.g., Do the vital signs on the anesthetic record correlate with those shown on the monitor?) The order in which the equipment is scanned can be varied depending upon the configuration of the anesthetizing location.

Clinicians should be ready to take over control of any medical device if it fails or malfunctions, and should understand where the device is getting information from, how it is being used, and what will happen if that information is flawed. The clinician should carefully review the device’s settings as well as patient information (including physiologic parameters) to understand what the machine has been programed to do and how well it is performing its functions. Automation surprises, mode confusion, and automation failure can often best be managed by reverting to the lowest level of automation possible.109  In the case of a ventilator or anesthesia machine that performs an unexpected action, for example, one should revert to manual ventilation. If necessary, the patient should be disconnected from the machine and ventilated with a self-inflating bag. If an infusion pump begins to deliver an incorrect dose of a medication, stop the pump. If necessary, disconnect the tubing from the patient’s infusion line. As medical devices become increasingly automated, manufacturers should include greater transparency in the design as well as effective methods of monitoring automated processes; this will allow practitioners to more seamlessly take over manual control. This can be done by explaining to the clinician where the device is getting its information, how trustworthy that information is, and how it is being used to make decisions.

Inattentional blindness may prevent even a trained observer from seeing something that is unexpected,110  such as an incorrectly programed pump or ventilator. Clinicians should be properly trained to actively search for sources of error in automation. One example of risk mitigation is to examine multiple distinct data points that are associated with a given process to ensure that the programming is correct. In the case of a drug infusion, for example, the clinician can check the weight-based, programed infusion rate and compare that to the rate in milliliters per minute. An infusion that will take significantly less or more time than expected to complete may be a warning that the pump has been incorrectly programed. Although these steps may seem obvious, automation failures and surprises can be confusing and can rapidly progress to become a critical event, so the best time to think through potential problems is before they occur.
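For illustration, the weight-based cross-check described above amounts to a short calculation. The drug, concentration, and dose in the sketch below are hypothetical examples rather than recommendations.

```python
def expected_infusion_minutes(dose_mcg_kg_min: float, weight_kg: float,
                              concentration_mcg_ml: float,
                              syringe_volume_ml: float) -> float:
    """Estimate how long a syringe should last at the programed dose.

    Comparing this estimate with the pump's displayed mL/h rate (or with how
    quickly the syringe is actually emptying) is one way to catch a
    mis-programed weight, concentration, or dose. All values are illustrative.
    """
    ml_per_min = dose_mcg_kg_min * weight_kg / concentration_mcg_ml
    return syringe_volume_ml / ml_per_min

# Hypothetical example: 0.1 mcg/kg/min for an 80-kg patient, 64 mcg/mL,
# 50-mL syringe -> 8 mcg/min -> 0.125 mL/min -> about 400 min (~6.7 h).
print(round(expected_infusion_minutes(0.1, 80, 64, 50)))  # 400
```

The same check can of course be done mentally or on paper; the point is simply to compare two independently derived numbers before trusting the pump.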

It may seem impossible for an individual clinician to change the way that medical technology is designed and marketed. Medical device manufacturers typically respond to the needs of the global market when designing new equipment, not to the requests of individual clinicians. Large group practices and health systems may, however, be able to push the market to include training or new safety features. The U.S. Veterans Health Administration, for example, has analyzed incident reports and device use histories and communicated this information to personnel responsible for purchasing new equipment with the explicit goal of “pushing” the market toward safer solutions.104 

Conclusions

It is likely that an increasing number of automated processes will be introduced into medical practice as technology continues to improve. Adopting these new systems safely requires that physicians and other healthcare leaders ensure that the unintended consequences of automation can be mitigated. Additional research into how automated medical devices can fail will facilitate improvements in design, use, and training. Much like airline pilots, clinicians should receive training in human–system interactions. This training curriculum should be incorporated into all aspects of medical education, including undergraduate medical education, residency, and continuing medical education. Topics should include vigilance, management of system failures, and maintaining manual skills. Leaders in simulation-based education should develop scenarios that integrate equipment malfunction. Clinicians should receive training in alarm management to minimize the number of false and misleading alarms to which they are exposed in order to prevent alarm fatigue.49  Research is urgently needed on how to keep clinicians engaged in patient care as an increasing number of tasks become automated. Finally, we recommend that professional societies develop guidelines to address these new requirements for training and implementation. These recommendations will help to ensure the safe, effective adoption of automated medical technology in the operating room and throughout the practice of medicine.

Acknowledgments

The authors wish to thank Anna Clebone Ruskin, M.D. (Assistant Professor of Anesthesia and Critical Care) and Michael F. O’Connor, M.D. (Professor of Anesthesia and Critical Care) at the University of Chicago (Chicago, Illinois) for their thoughtful review of the manuscript and insightful comments.

Research Support

Support was provided solely from institutional and/or departmental sources.

Competing Interests

Drs. Ruskin, Corvin, and Rice are partially supported by Federal Aviation Administration Cooperative Research Agreement 692M151940006: Air Traffic Organization Alarm Management. This funding did not support any of the work involved in the preparation of this manuscript. Dr. Winter declares no competing interests.

1.
Brogi
E
,
Cyr
S
,
Kazan
R
,
Giunta
F
,
Hemmerling
TM
.
Clinical performance and safety of closed-loop systems: A systematic teview and meta-analysis of randomized controlled trials.
Anesth Analg
.
2017
;
124
:
446
55
2.
Pasin
L
,
Nardelli
P
,
Pintaudi
M
,
Greco
M
,
Zambon
M
,
Cabrini
L
,
Zangrillo
A
.
Closed-loop delivery systems versus manually controlled administration of total IV anesthesia: A meta-analysis of randomized clinical trials.
Anesth Analg
.
2017
;
124
:
456
64
3.
Wang
X
,
Tao
Y
,
Tao
X
,
Chen
J
,
Jin
Y
,
Shan
Z
,
Tan
J
,
Cao
Q
,
Pan
T
.
An original design of remote robot-assisted intubation system.
Sci Rep
.
2018
;
8
:
13403
4.
Wang
S
,
Housden
J
,
Singh
D
,
Althoefer
K
,
Rhode
K
.
Design, testing and modelling of a novel robotic system for trans-oesophageal ultrasound.
Int J Med Robot
.
2016
;
12
:
342
54
5.
Ma
Q
,
Kobayashi
E
,
Wang
J
,
Hara
K
,
Suenaga
H
,
Sakuma
I
,
Masamune
K
.
Development and preliminary evaluation of an autonomous surgical system for oral and maxillofacial surgery.
Int J Med Robot
.
2019
;
15
:
e1997
6.
Parasuraman
R
,
Manzey
DH
.
Complacency and bias in human use of automation: An attentional integration.
Hum Factors
.
2010
;
52
:
381
410
7.
Hancock
PA
.
Some pitfalls in the promises of automated and autonomous vehicles.
Ergonomics
.
2019
;
62
:
479
95
8.
Assumptions Used in the Safety Assessment Process and the Effects of Multiple Alerts and Indications on Pilot Performance. National Transportation Safety Board
.
9.
Training Anesthesia Professionals to Use Advanced Medical Technology. Anesthesia Patient Safety Foundation
2013
.
10.
Chen
L
,
Ogundele
O
,
Clermont
G
,
Hravnak
M
,
Pinsky
MR
,
Dubrawski
AW
.
Dynamic and personalized risk forecast in step-down units. Implications for monitoring paradigms.
Ann Am Thorac Soc
.
2017
;
14
:
384
91
11.
Hendrickx
JFA
,
De Wolf
AM
.
The anesthesia workstation: Quo vadis?
Anesth Analg
.
2018
;
127
:
671
5
12.
Parasuraman
R
,
Riley
V
.
Humans and automation: Use, misuse, disuse, abuse.
Human Factors: The Journal of the Human Factors and Ergonomics Society
.
2016
;
39
:
230
53
13.
Arnal
JM
,
Wysocki
M
,
Novotni
D
,
Demory
D
,
Lopez
R
,
Donati
S
,
Granier
I
,
Corno
G
,
Durand-Gasselin
J
.
Safety and efficacy of a fully closed-loop control ventilation (IntelliVent-ASV®) in sedated ICU patients with acute respiratory failure: A prospective randomized crossover study.
Intensive Care Med
.
2012
;
38
:
781
7
14.
Bally
L
,
Thabit
H
,
Hartnell
S
,
Andereggen
E
,
Ruan
Y
,
Wilinska
ME
,
Evans
ML
,
Wertli
MM
,
Coll
AP
,
Stettler
C
,
Hovorka
R
.
Closed-loop insulin delivery for glycemic control in noncritical care.
N Engl J Med
.
2018
;
379
:
547
56
15.
Lee
HC
,
Ryu
HG
,
Chung
EJ
,
Jung
CW
.
Prediction of bispectral index during target-controlled infusion of propofol and remifentanil: A deep learning approach.
Anesthesiology
.
2018
;
128
:
492
501
16.
Joosten
A
,
Rinehart
J
,
Bardaji
A
,
Van der Linden
P
,
Jame
V
,
Van Obbergh
L
,
Alexander
B
,
Cannesson
M
,
Vacas
S
,
Liu
N
,
Slama
H
,
Barvais
L
.
Anesthetic management using multiple closed-loop systems and delayed neurocognitive recovery: A randomized controlled trial.
Anesthesiology
.
2020
;
132
:
253
66
17.
Annarumma
M
,
Withey
SJ
,
Bakewell
RJ
,
Pesce
E
,
Goh
V
,
Montana
G
.
Automated triaging of adult chest radiographs with deep artificial neural networks.
Radiology
.
2019
;
291
:
196
202
18.
McBride
SE
,
Rogers
WA
,
Fisk
AD
.
Understanding human management of automation errors.
Theor Issues Ergon Sci
.
2014
;
15
:
545
77
19.
Endsley
MR
.
From here to autonomy.
Hum Factors
.
2017
;
59
:
5
27
20.
Scerbo
MW
,
Mouloua
M
.
Automation technology and human performance: current research and trends
.
1999
,
Mahwah, N.J.
:
Lawrence Erlbaum
21.
Vagia
M
,
Transeth
AA
,
Fjerdingen
SA
.
A literature review on the levels of automation during the years. What are the different taxonomies that have been proposed?
Appl Ergon
.
2016
;
53 Pt A
:
190
202
22.
Wiener
EL
.
13 - Cockpit Automation. In: Wiener EL, Nagel DC, eds. Human Factors in Aviation
.
1988
,
San Diego
:
Academic Press
,
433
61
23.
Mouloua
M
,
Koonce
JM
.
Human-automation interaction: research and practice
.
1997
,
Mahwah, N.J.
:
Lawrence Erlbaum Associates
24.
Sarter
NB
,
Mumaw
RJ
,
Wickens
CD
.
Pilots’ monitoring strategies and performance on automated flight decks: An empirical study combining behavioral and eye-tracking data.
Human Factors: The Journal of the Human Factors and Ergonomics Society
.
2016
;
49
:
347
57
25.
Landman
A
,
Groen
EL
,
van Paassen
MMR
,
Bronkhorst
AW
,
Mulder
M
.
Dealing with unexpected events on the flight deck: A conceptual model of startle and surprise.
Hum Factors
.
2017
;
59
:
1161
72
26.
Oliver
N
,
Calvard
T
,
Potočnik
K
.
Cognition, technology, and organizational limits: Lessons from the Air France 447 disaster.
Organization Science
.
2017
;
28
:
729
43
27.
Prgomet
M
,
Li
L
,
Niazkhani
Z
,
Georgiou
A
,
Westbrook
JI
.
Impact of commercial computerized provider order entry (CPOE) and clinical decision support systems (CDSSs) on medication errors, length of stay, and mortality in intensive care units: A systematic review and meta-analysis.
J Am Med Inform Assoc
.
2017
;
24
:
413
22
28.
Vélez-Díaz-Pallarés
M
,
Álvarez Díaz
AM
,
Gramage Caro
T
,
Vicente Oliveros
N
,
Delgado-Silveira
E
,
Muñoz García
M
,
Cruz-Jentoft
AJ
,
Bermejo-Vicedo
T
.
Technology-induced errors associated with computerized provider order entry software for older patients.
Int J Clin Pharm
.
2017
;
39
:
729
42
29.
Goddard
K
,
Roudsari
A
,
Wyatt
JC
.
Automation bias: Empirical results assessing influencing factors.
Int J Med Inform
.
2014
;
83
:
368
75
30.
Endsley
MR
.
Level of automation effects on performance, situation awareness and workload in a dynamic control task.
Ergonomics
.
2010
;
42
:
462
92
31.
Carayon
P
,
Wetterneck
TB
,
Alyousef
B
,
Brown
RL
,
Cartmill
RS
,
McGuire
K
,
Hoonakker
PL
,
Slagle
J
,
Van Roy
KS
,
Walker
JM
,
Weinger
MB
,
Xie
A
,
Wood
KE
.
Impact of electronic health record technology on the work and workflow of physicians in the intensive care unit.
Int J Med Inform
.
2015
;
84
:
578
94
32.
Carayon
P
,
Wetterneck
TB
,
Cartmill
R
,
Blosky
MA
,
Brown
R
,
Hoonakker
P
,
Kim
R
,
Kukreja
S
,
Johnson
M
,
Paris
BL
,
Wood
KE
,
Walker
JM
.
Medication safety in two intensive care units of a community teaching hospital after electronic health record implementation: Sociotechnical and human factors engineering considerations.
J Patient Saf
.
2017
.
doi: 10.1097/PTS.0000000000000358
33.
Downing
NL
,
Rolnick
J
,
Poole
SF
,
Hall
E
,
Wessels
AJ
,
Heidenreich
P
,
Shieh
L
.
Electronic health record-based clinical decision support alert for severe sepsis: A randomised evaluation.
BMJ Qual Saf
.
2019
;
28
:
762
8
34.
Ancker
JS
,
Edwards
A
,
Nosal
S
,
Hauser
D
,
Mauer
E
,
Kaushal
R
;
with the HITEC Investigators
.
Effects of workload, work complexity, and repeated alerts on alert fatigue in a clinical decision support system.
BMC Med Inform Decis Mak
.
2017
;
17
:
36
35.
Lyell
D
,
Magrabi
F
,
Raban
MZ
,
Pont
LG
,
Baysari
MT
,
Day
RO
,
Coiera
E
.
Automation bias in electronic prescribing.
BMC Med Inform Decis Mak
.
2017
;
17
:
28
36.
Gregory
ME
,
Russo
E
,
Singh
H
.
Electronic health record alert-related workload as a predictor of burnout in primary care providers.
Appl Clin Inform
.
2017
;
8
:
686
97
37.
Kane-Gill
SL
,
O’Connor
MF
,
Rothschild
JM
,
Selby
NM
,
McLean
B
,
Bonafide
CP
,
Cvach
MM
,
Hu
X
,
Konkani
A
,
Pelter
MM
,
Winters
BD
.
Technologic distractions (part 1): Summary of approaches to manage alert quantity with intent to reduce alert fatigue and suggestions for alert fatigue metrics.
Crit Care Med
.
2017
;
45
:
1481
8
38.
Casner
SM
,
Geven
RW
,
Recker
MP
,
Schooler
JW
.
The retention of manual flying skills in the automated cockpit.
Hum Factors
.
2014
;
56
:
1506
16
39.
Endsley
MR
,
Kiris
EO
.
The out-of-the-loop performance problem and level of control in automation.
Human Factors: The Journal of the Human Factors and Ergonomics Society
.
2016
;
37
:
381
94
40.
Haslbeck
A
,
Hoermann
HJ
.
Flying the needles: Flight deck automation erodes fine-motor flying skills among airline pilots.
Hum Factors
.
2016
;
58
:
533
45
41.
Wickens
CD
,
Kessel
C
.
Failure detection in dynamic systems.
Human Detection and Diagnosis of System Failures
.
1981
155
69
42.
Dixon
SR
,
Wickens
CD
.
Automation reliability in unmanned aerial vehicle control: A reliance-compliance model of automation dependence in high workload.
Human Factors: The Journal of the Human Factors and Ergonomics Society
.
2016
;
48
:
474
86
43.
Singh
AL TT
,
Singh
IL
.
Effects of automation reliability and training on automation-induced complacency and perceived mental workload.
J Ind Acad Appl Psychol
.
2009
;
35
:
9
22
44. Patel M. Surgical dilemma: Loss of open surgical skills to minimally invasive surgery. ANZ J Surg. 2016;86:7-8.
45. Hillis JM, Milligan TA. Teaching the neurological examination in a rapidly evolving clinical climate. Semin Neurol. 2018;38:428-40.
46. Bingmer K, Ofshteyn A, Stein SL, Marks JM, Steinhagen E. Decline of open surgical experience for general surgery residents. Surg Endosc. 2020;34:967-72.
47. Bliss JP. Investigation of alarm-related accidents and incidents in aviation. The International Journal of Aviation Psychology. 2003;13:249-68.
48. AX001-1-2 Ueberlingen Report. Bundesstelle für Flugunfalluntersuchung (German Federal Bureau of Aircraft Accidents Investigation).
49. Ruskin KJ, Hueske-Kraus D. Alarm fatigue: Impacts on patient safety. Curr Opin Anaesthesiol. 2015;28:685-90.
50. Winter SR, Keebler JR, Rice S, Mehta R, Baugh BS. Patient perceptions on the use of driverless ambulances: An affective perspective. Transportation Research Part F: Traffic Psychology and Behaviour. 2018;58:431-41.
51. Rice S, Winter SR. Do gender and age affect willingness to ride in driverless vehicles: If so, then why? Technology in Society. 2019;58.
52. Rice S, Winter SR. Which passenger emotions mediate the relationship between type of pilot configuration and willingness to fly in commercial aviation? Aviation Psychology and Applied Human Factors. 2015;5:83-92.
53. Winter SR, Rice S, Mehta R, Cremer I, Reid KM, Rosser TG, Moore JC. Indian and American consumer perceptions of cockpit configuration policy. Journal of Air Transport Management. 2015;42:226-31.
54. Wickens CD, Dixon SR. The benefits of imperfect diagnostic automation: A synthesis of the literature. Theoretical Issues in Ergonomics Science. 2007;8:201-12.
55. Rice S. Examining single- and multiple-process theories of trust in automation. J Gen Psychol. 2009;136:303-19.
56. Wickens CD, Rice S, Keller D, Hutchins S, Hughes J, Clayton K. False alerts in air traffic control conflict alerting system: Is there a “cry wolf” effect? Hum Factors. 2009;51:446-62.
57. Rice S, Geels K. Using system-wide trust theory to make predictions about dependence on four diagnostic aids. J Gen Psychol. 2010;137:362-75.
58. Keller D, Rice S. System-wide versus component-specific trust using multiple aids. J Gen Psychol. 2010;137:114-28.
59. Parasuraman R, Molloy R, Singh IL. Performance consequences of automation-induced ‘complacency’. The International Journal of Aviation Psychology. 1993;3:1-23.
60. Maltz M, Shinar D. New alternative methods of analyzing human behavior in cued target acquisition. Hum Factors. 2003;45:281-95.
61. Meyer J. Effects of warning validity and proximity on responses to warnings. Hum Factors. 2001;43:563-72.
62. Meyer J. Conceptual issues in the study of dynamic hazard warnings. Hum Factors. 2004;46:196-204.
63. Wickens CD, Clegg BA, Vieane AZ, Sebok AL. Complacency and automation bias in the use of imperfect automation. Hum Factors. 2015;57:728-39.
64. Hamilton Medical AG Recalls Hamilton-G5 Ventilators Due to Potential for Sporadic Error Message Resulting in the Ventilator to Cease Ventilation and Enter Ambient State. 2019.
65. Bailey NR, Scerbo MW. Automation-induced complacency for monitoring highly reliable systems: The role of task complexity, system experience, and operator trust. Theoretical Issues in Ergonomics Science. 2007;8:321-48.
66. Casner SM, Schooler JW. Thoughts in flight: Automation use and pilots’ task-related and task-unrelated thought. Hum Factors. 2014;56:433-42.
67. Wickens CD, Hollands JG, Banbury S, Parasuraman R. Engineering Psychology and Human Performance. 4th ed. Boston: Pearson; 2013.
68. Evans L. Traffic Safety and the Driver. New York, NY: Van Nostrand Reinhold Co; 1991.
69. Wilde GJS. Risk homeostasis theory and traffic accidents: Propositions, deductions and discussion of dissension in recent reactions. Ergonomics. 1988;31:441-68.
70. Wilde GJS. Accident countermeasures and behavioural compensation: The position of risk homeostasis theory. Journal of Occupational Accidents. 1989;10:267-92.
71. Sagberg F, Fosser S, Saetermo IA. An investigation of behavioural adaptation to airbags and antilock brakes among taxi drivers. Accid Anal Prev. 1997;29:293-302.
72. Stanton NA, Pinto M. Behavioural compensation by drivers of a simulator when using a vision enhancement system. Ergonomics. 2000;43:1359-70.
73. McFadden SM, Vimalachandran A, Blackmore E. Factors affecting performance on a target monitoring task employing an automatic tracker. Ergonomics. 2004;47:257-80.
74. Almeras C, Almeras C. Operating room communication in robotic surgery: Place, modalities and evolution of a safe system of interaction. J Visc Surg. 2019;156:397-403.
75. Cysneiros LM, Raffi M, Sampaio do Prado Leite JC. Software transparency as a key requirement for self-driving cars. In: 2018 IEEE 26th International Requirements Engineering Conference (RE). 2018:382-7.
76. Kunze A, Summerskill SJ, Marshall R, Filtness AJ. Automation transparency: Implications of uncertainty communication for human-automation interaction and interfaces. Ergonomics. 2019;62:345-60.
77. Mumaw RJ. Analysis of alerting system failures in commercial aviation accidents. Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 2017;61:110-4.
78. Sarter NB, Woods DD, Billings CE. Automation surprises. In: Salvendy G, ed. Handbook of Human Factors and Ergonomics. 2nd ed. New York, NY: Wiley; 1997:1926-43.
79. Maeda Y, Ushio T. Detection of mode confusion in human-machine system model with temporal information on operations. IFAC-PapersOnLine. 2017;50:9374-9.
80. De Boer R, Dekker S. Models of automation surprise: Results of a field survey in aviation. Safety. 2017;3.
81. Parasuraman R, Mouloua M. Automation and Human Performance: Theory and Applications. Mahwah, NJ: Lawrence Erlbaum Associates; 1996.
82. Endsley MR. Level of automation forms a key aspect of autonomy design. Journal of Cognitive Engineering and Decision Making. 2017;12:29-34.
83. Stiegler MP, Tung A. Cognitive processes in anesthesiology decision making. Anesthesiology. 2014;120:204-17.
84. Sauer J, Chavaillaz A, Wastell D. Experience of automation failures in training: Effects on trust, automation bias, complacency and performance. Ergonomics. 2016;59:767-80.
85. Lyell D, Magrabi F, Raban MZ, Pont LG, Baysari MT, Day RO, Coiera E. Automation bias in electronic prescribing. BMC Med Inform Decis Mak. 2017;17:28.
86. Lyell D, Coiera E. Automation bias and verification complexity: A systematic review. J Am Med Inform Assoc. 2016.
87. Weinger MB. Vigilance, boredom, and sleepiness. J Clin Monit Comput. 1999;15:549-52.
88. Gartenberg D, Gunzelmann G, Hassanzadeh-Behbaha S, Trafton JG. Examining the role of task requirements in the magnitude of the vigilance decrement. Front Psychol. 2018;9:1504.
89. Cummings ML, Gao F, Thornburg KM. Boredom in the workplace: A new look at an old problem. Hum Factors. 2016;58:279-300.
90. Ralph BC, Onderwater K, Thomson DR, Smilek D. Disrupting monotony while increasing demand: Benefits of rest and intervening tasks on vigilance. Psychol Res. 2017;81:432-44.
91. Dillard MB, Warm JS, Funke GJ, Nelson WT, Finomore VS, McClernon CK, Eggemeier FT, Tripp LD, Funke ME. Vigilance tasks: Unpleasant, mentally demanding, and stressful even when time flies. Hum Factors. 2018;61:225-42.
92. Warm JS, Parasuraman R, Matthews G. Vigilance requires hard mental work and is stressful. Hum Factors. 2008;50:433-41.
93. Aviation Incident Final Report. National Transportation Safety Board.
94. Gouraud J, Delorme A, Berberian B. Influence of automation on mind wandering frequency in sustained attention. Conscious Cogn. 2018;66:54-64.
95. Wax DB, Lin HM, Reich DL. Intraoperative non-record-keeping usage of anesthesia information management system workstations and associated hemodynamic variability and aberrancies. Anesthesiology. 2012;117:1184-9.
96. Hamman WR. Line oriented flight training (LOFT). In: Crew Resource Management. 2010:233-63.
97. Wise JA, Hopkin VD, Garland DJ. Handbook of Aviation Human Factors. 2016.
98. Dixon SR, Wickens CD, McCarley JS. On the independence of compliance and reliance: Are automation false alarms worse than misses? Hum Factors. 2007;49:564-72.
99. Kuchar JK, Drumm AC. The traffic alert and collision avoidance system. Lincoln Laboratory Journal. 2007;16:277-96.
100. Parasuraman R, Hancock PA, Olofinboba O. Alarm effectiveness in driver-centred collision-warning systems. Ergonomics. 1997;40:390-9.
101. Rice S, Winter SR, Mehta R, Ragbir NK. What factors predict the type of person who is willing to fly in an autonomous commercial airplane? Journal of Air Transport Management. 2019;75:131-8.
102. Strand N, Nilsson J, Karlsson ICM, Nilsson L. Semi-automated versus highly automated driving in critical situations caused by automation failures. Transportation Research Part F: Traffic Psychology and Behaviour. 2014;27:218-28.
103. Anania EC, Rice S, Walters NW, Pierce M, Winter SR, Milner MN. The effects of positive and negative information on consumers’ willingness to ride in a driverless vehicle. Transport Policy. 2018;72:218-24.
104. Bagian TM, Jacobs K, Lightner NJ. Purchasing for safety: Beginning a conversation with the medical device industry. Procedia Manufacturing. 2015;3:264-8.
105. Privitera MB, Evans M, Southee D. Human factors in the design of medical devices - Approaches to meeting international standards in the European Union and USA. Appl Ergon. 2017;59(Pt A):251-63.
106. Gelb AW, Morriss WW, Johnson W, Merry AF; International Standards for a Safe Practice of Anesthesia Workgroup. World Health Organization-World Federation of Societies of Anaesthesiologists (WHO-WFSA) International Standards for a Safe Practice of Anesthesia. Can J Anaesth. 2018;65:698-708.
107. Landman A, van Oorschot P, van Paassen MMR, Groen EL, Bronkhorst AW, Mulder M. Training pilots for unexpected events: A simulator study on the advantage of unpredictable and variable scenarios. Hum Factors. 2018;60:793-805.
108. Haslbeck A, Zhang B. I spy with my little eye: Analysis of airline pilots’ gaze patterns in a manual instrument flight scenario. Appl Ergon. 2017;63:62-71.
109. Parasuraman R, Sheridan TB, Wickens CD. A model for types and levels of human interaction with automation. IEEE Trans Syst Man Cybern A Syst Hum. 2000;30:286-97.
110. Drew T, Võ ML, Wolfe JM. The invisible gorilla strikes again: Sustained inattentional blindness in expert observers. Psychol Sci. 2013;24:1848-53.