Review of unusual patient care experiences is a cornerstone of medical education. Each month, the AQI-AIRS Steering Committee abstracts a patient history submitted to the Anesthesia Incident Reporting System (AIRS) and authors a discussion of the safety and human factors challenges involved. Real-life case histories often include multiple clinical decisions, only some of which can be discussed in the space available. Absence of commentary should not be construed as agreement with the clinical decisions described.

It was day three of six for me and day three with very good FO (First Officer). Well rested, great rapport and above average Crew coordination. Knew we had a MAX. It was my leg, normal Ops Brief, plus I briefed our concerns with the MAX issues, bulletin, MCAS, stab trim cutout response, etc. I mentioned I would engage autopilot sooner than usual (I generally hand fly to at least above 10,000 ft.) to remove the possible MCAS threat.

Weather was about 1000 OVC drizzle, temperature dropping and an occasional snowflake. I double-checked with an additional personal walkaround just prior to push; a few drops of water on the aircraft but clean aircraft, no de-ice required. Departure was normal. Takeoff and climb in light to moderate turbulence. After flaps 1 to “up” I looked at and engaged A Autopilot. As I was returning to my PFD (Primary Flight Display) PM (Pilot Monitoring) called “DESCENDING” followed by almost an immediate: “DON’T SINK, DON’T SINK!”

I immediately disconnected AP (Autopilot) and resumed climb. Now, I would generally assume it was my automation error, i.e., aircraft was trying to acquire a mis-commanded speed/no autothrottles, crossing restriction, etc., but frankly neither of us could find an inappropriate setup error (not to say there wasn’t one).

With the concerns with the MAX 8 nose down stuff, we both thought it appropriate to bring it to your attention. We discussed the issue at length over the course of the return to ZZZ. Best guess from me is airspeed fluctuation due to mechanical shear/frontal passage that overwhelmed automation temporarily or something incorrectly set up in MCP (Mode Control Panel). PM’s callout on “descending” was particularly quick and welcome as I was just coming back to my display after looking away. System and procedures coupled with CRM (Crew Resource Management) trapped and mitigated issue.

– November 2018 Aviation Safety Reporting System

Anesthesiology leads the field of medicine in patient safety, and the appropriate application of technology has been a significant part of reducing patient harm. Anyone practicing more than 10-15 years will remember a time when there was a manually programmed ventilator, a basic patient monitor and a piece of paper with a grid for recording data. Some readers may no longer recognize the paper. In a recent complex case, I counted over 10 displays and computers running behind the drapes, including traditional EHRs, the anesthesia machine, monitors, drug dispensing systems and infusion pumps. All of these have varying degrees of alarms and monitoring software running at any given time. Some of these devices can even suggest a course of action based on data received. The EHR/automated anesthesia record may respond to specific data by advising the clinician to administer medications, manage blood pressure or take other action. Some ventilators can now call out when tidal volume is greater than expected on a cc/kg basis, and agent and fresh gas flows can be suggested to minimize agent usage and environmental impact.

Our specialty is now cautiously exploring having our machines intervene in very well-defined situations. There are ventilator modes that will drive pressures to achieve a set tidal volume and automatically cut the pressure if the system senses a compliance change. Target-controlled infusion is used in countries other than the U.S. and may be coming here soon. One EHR vendor’s software can wirelessly program a syringe pump. While all of these examples make a strong case for patient safety, there will be unintended consequences – even with comprehensive testing and FDA approval.
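The pressure-regulated ventilator mode described above is, at heart, a feedback controller with a safety cap. The toy sketch below (hypothetical names, gains and limits; not any vendor's actual control algorithm) shows the idea: nudge inspiratory pressure toward a target tidal volume, but never beyond a hard ceiling, so the automation cannot chase a compliance change into dangerous territory.

```python
# Toy sketch of a pressure-regulated volume mode.
# All thresholds and the gain are illustrative assumptions,
# not any ventilator manufacturer's real parameters.

TARGET_TIDAL_ML = 450.0    # assumed set tidal volume
PRESSURE_CAP_CMH2O = 30.0  # assumed hard safety limit
PRESSURE_FLOOR_CMH2O = 5.0

def next_pressure(current_cmh2o: float, delivered_ml: float,
                  gain: float = 0.01) -> float:
    """Adjust inspiratory pressure toward the target tidal volume,
    never exceeding the safety cap or dropping below the floor."""
    error_ml = TARGET_TIDAL_ML - delivered_ml
    proposed = current_cmh2o + gain * error_ml
    return min(max(proposed, PRESSURE_FLOOR_CMH2O), PRESSURE_CAP_CMH2O)

# Lung compliance suddenly drops: delivered volume falls, the
# controller raises pressure breath by breath, but only up to the cap.
p = 15.0
for delivered in (450.0, 300.0, 300.0):
    p = next_pressure(p, delivered)
print(round(p, 2))  # prints 18.0
```

The interesting design choice is the cap itself: the controller is allowed to fail to reach the target rather than exceed a pressure the clinician has deemed safe, which is exactly the kind of bounded intervention the paragraph above describes.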

Boeing entered this arena with the Boeing 737 MAX. When Boeing was testing this new aircraft, it noted a tendency for the nose to pitch up under certain conditions. The solution was to implement a new sensor input and software that detected these undesired excursions and then pushed the nose down to restore a safe flying attitude. The concept was logical, but the unintended consequence occurred when the computer received erroneous information from the sensor: it pushed the nose down when such an action was neither appropriate nor desirable. The latest analysis at the time of writing is that problems with this system may have contributed to two crashes with enormous loss of life.
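The failure mode just described can be reduced to a toy illustration. The function names, thresholds and logic below are hypothetical, assumed for the sketch; this is not Boeing's actual MCAS implementation. The point is structural: automation that trusts a single sensor will act on one bad reading, while even a simple cross-check against a redundant sensor can trap the fault and defer to the crew.

```python
# Toy illustration of single-sensor automation risk.
# Hypothetical thresholds and logic; NOT the real MCAS design.

PITCH_UP_LIMIT_DEG = 15.0  # assumed angle-of-attack threshold

def single_sensor_command(aoa_deg: float) -> str:
    """Trusts one angle-of-attack reading, however implausible."""
    if aoa_deg > PITCH_UP_LIMIT_DEG:
        return "NOSE_DOWN"
    return "NO_ACTION"

def cross_checked_command(aoa_left_deg: float, aoa_right_deg: float,
                          max_disagreement_deg: float = 5.0) -> str:
    """Intervenes only when redundant sensors agree; otherwise
    alerts the crew instead of acting."""
    if abs(aoa_left_deg - aoa_right_deg) > max_disagreement_deg:
        return "ALERT_CREW"  # sensors disagree; defer to the humans
    if min(aoa_left_deg, aoa_right_deg) > PITCH_UP_LIMIT_DEG:
        return "NOSE_DOWN"
    return "NO_ACTION"

# A failed vane reporting an absurd angle of attack:
print(single_sensor_command(74.0))        # NOSE_DOWN (inappropriate)
print(cross_checked_command(74.0, 4.0))   # ALERT_CREW (fault trapped)
```

The same pattern applies directly to perioperative devices: any system that closes the loop on a single transducer inherits that transducer's failure modes.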

The parallels to our specialty are striking. As we inevitably move forward with using technology to improve safety and outcomes, there will be a desire to add more automation to anesthesia care delivery. In many cases, this may be the correct option. As any director of quality will note, human error is ubiquitous in medicine. However, we also have a duty to protect our patients from the unintended consequences of this effort. It should give us pause that even in aviation, one of the most regulated industries on the planet and one with a well-earned reputation for high reliability, this problem slipped through testing. We should expect it to happen again, certainly in the field of medicine, as no system of testing will ever completely replicate real-world experience.

When AIRS was formed, the non-anesthesia reporting system we researched the most was the Aviation Safety Reporting System (ASRS). Established in 1976 by a memorandum of understanding between the FAA and NASA, this system is designed as a non-punitive way for pilots and other aviation professionals to report near-misses and aviation systems problems that may be precursor events for future cases of harm. ASRS has some key distinctions from AIRS. First, by entering a report for an incident that did not result in an accident involving harm to passengers or crew, or damage to the aircraft, the reporter of the incident is granted immunity for any rule violations that may have occurred during the event. This is a powerful incentive to report. Second, the system is administered by a neutral third party – in this case, NASA, which takes responsibility for de-identifying the information and analyzing it for patterns and safety concerns. A third difference is that NASA posts the actual reports; by redacting key information, it is able to preserve confidentiality. Would it surprise you to know that a search of the ASRS database revealed six incidents involving the 737 MAX aircraft?2 Two of these reports were related to actual circumstances where the plane’s software intervened, and the others were related to the lack of training on the new “feature.”

Frequent readers of this column are aware that the AIRS database has many reports of equipment issues, and the committee looks for trends to identify where a device or procedure may have contributed to harm. One of the first articles from AIRS data highlighted the risks of air embolism with ERCP, as there were three cases noted in the database. As technology proliferates in the perioperative realm, we need a way to detect when testing failed to uncover all the ways a new feature can have unintended consequences that put our patients at risk. Our reporting system is capable of detecting these circumstances and aggregating the data across the U.S. That said, our ability to do this is only as strong as the information we receive.

One of the committee’s goals for the coming years is to increase reporting of adverse events in anesthesiology. Voluntary event reporting systems typically capture less than 1 percent of actual events. We are working to better integrate reporting into the workflows our anesthesiologists use every day and partnering with large anesthesia practices to pull their adverse events into the overall database. The goal is to raise that percentage high enough to quickly and reliably identify opportunities for improvement in our specialty and to detect when a device or procedure is contributing to patient harm.

Our recommendation is simple: Please use your group’s or hospital’s reporting systems any time you have an adverse event or near-miss, and consider reporting to AIRS any issues you feel could impact our anesthesia community. Maybe we can avert our own 737 MAX before it happens.

Special thanks to Randall M. Clark, M.D., ASA Director for Colorado and a licensed private pilot, for his expertise and contributions to this article.

1. Anesthesia Safety Reporting System website. Last accessed April 15, 2019.
2. Here’s what was on the record about problems with the 737 MAX. The Atlantic. March 13, 2019.