Pulse oximeters have been reported to fail to record data in 1.12-2.50% of cases in which anesthesia records were handwritten. There is reason to believe that these may be underestimates. Computerized anesthesia records may provide insight into the true incidence of pulse oximetry data failures and factors that are associated with such failures.
The current study reviewed case files of 9,203 computerized anesthesia records. Pulse oximetry data failure was defined as the presence of at least one continuous gap in data ≥ 10 min in duration in a case. A multivariate logistic regression model was used to identify predictors of pulse oximetry data failure, and a modified case-control method was used to determine whether extremes of blood pressure and hypothermia during the procedure were associated with pulse oximetry data failure.
The overall incidence of cases that had at least one continuous gap of ≥ 10 min in pulse oximetry data was 9.18%. The independent preoperative predictors of pulse oximetry data failure were ASA physical status 3, 4, or 5 and orthopedic, vascular, and cardiac surgery. Intraoperative hypothermia, hypotension, hypertension, and duration of procedure were also independent risk factors for pulse oximetry data failure.
Pulse oximetry data failure rates based on review of computerized records were markedly greater than those previously reported. Physical status, type of surgery, and intraoperative variables were risk factors for pulse oximetry data failure. Regulations and expectations regarding pulse oximetry monitoring should reflect the limitations of the technology.
Key words: Complications, intraoperative: hypertension; hypotension; hypothermia; hypoxemia. Computers: medical records. Monitoring, intraoperative: oxygenation; pulse oximetry; retrospective studies.
PULSE oximetry has been adopted readily as an intraoperative monitoring standard. Any anesthesia record from which pulse oximetry data are missing is therefore problematic. The published incidence of pulse oximetry data failure in two large studies was 1.12% and 2.5%, [1,2] but these studies were based on review of handwritten anesthesia records and on human observers, respectively. The subjective nature of these forms of data collection may produce inaccurate estimates of the true incidence of pulse oximetry data failures.
At the Mount Sinai Medical Center, a state inspection in March 1994 revealed that pulse oximetry data were missing from clinically significant portions of approximately 20% of computerized anesthesia records. Approximately 60% of the missing pulse oximetry data was attributable to SpO2 values that were derived from low-amplitude plethysmography signals; these values were being automatically rejected by the computer record-keeping software as invalid. The remainder of the gaps in pulse oximetry data were apparently related to plethysmography signals that were absent or too weak to be interpreted.
Pulse oximetry data failures place additional burdens on the anesthesiologist from both regulatory agencies and from legal liability in the event of a poor outcome. On the basis of the objective and comprehensive nature of computerized anesthesia records, the authors sought, in a retrospective study, to reexamine the incidence of and factors predisposing to pulse oximetry data failure.
The project was approved by the Institutional Review Board. To ensure confidentiality, none of the data obtained from queries of the medical records was identifiable to patient source. The data for this study came from computerized anesthesia record-keeping systems (CompuRecord, Anesthesia Recording, Pittsburgh, PA) that had been installed in 22 operating rooms over a 2-yr period (1991-93). The computerized anesthesia records were stored permanently on optical media. All computerized records collected between April 1994 and April 1995 were subjected to analysis, with the following exceptions: corrupted computer files (0.09%), training sessions on the computer systems, and the records of all patients who were younger than 13 yr. All of the anesthesia records analyzed in this study included pulse oximetry data derived from normal and low-amplitude plethysmography signals.
Hemodynamic, respiratory, and other patient monitoring data were recorded every 15 s for each case. Pulse oximetry data were derived from the Hewlett-Packard Component Monitoring System (Waltham, MA). A computer program in the C language was written to extract information from the CompuRecord files and place it in relational database tables. A pulse oximetry data failure was defined as a continuous gap ≥ 10 min in pulse oximetry data (≥ 40 missing data points at 15-s intervals) that began at least 15 min after the start of the case (to account for the period of insertion of intravenous cannulae and initiation of monitoring) and that did not occur during cardiopulmonary bypass. According to this definition, a case either had or did not have at least one pulse oximetry data failure (binary variable). A 10-min minimum gap was chosen as the criterion for a clinically significant pulse oximetry data failure on the grounds that 10 min of continuous actual hypoxemia would be likely to result in an adverse outcome. For the purpose of comparing the results from computerized record data with those of previous studies, [1,2] the incidences of pulse oximetry data failures ≥ 15 min and ≥ 30 min in duration were determined. The analysis is based on 9,203 computerized anesthesia records generated between April 1994 and April 1995, when SpO2 data derived from low-amplitude signals were considered valid.
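The gap-detection rule described above (≥ 40 consecutive missing 15-s samples, beginning at least 15 min into the case) can be sketched as follows. This is a hypothetical Python re-implementation for illustration only; the study's actual program was written in C, and the exclusion of gaps occurring during cardiopulmonary bypass is omitted here for brevity.

```python
# Hypothetical sketch of the study's gap-detection rule (the original
# program was written in C and read CompuRecord files directly).
SAMPLE_INTERVAL_S = 15          # one reading every 15 s
MIN_GAP_SAMPLES = 40            # >= 40 missing samples = >= 10 min
START_EXCLUSION_SAMPLES = 60    # gap must begin >= 15 min into the case

def has_data_failure(spo2_samples):
    """Return True if the case has at least one continuous gap
    >= 10 min in SpO2 data beginning >= 15 min after the start.
    `spo2_samples` is a list of readings at 15-s intervals;
    None marks a missing data point."""
    run_start, run_length = None, 0
    for i, value in enumerate(spo2_samples):
        if value is None:
            if run_length == 0:
                run_start = i
            run_length += 1
            if (run_length >= MIN_GAP_SAMPLES
                    and run_start >= START_EXCLUSION_SAMPLES):
                return True
        else:
            run_length = 0
    return False

# Illustrative case: 16 min of data, then a 10-min gap, then more data.
case = [98] * 64 + [None] * 40 + [98] * 16
print(has_data_failure(case))  # True
```

Note that a gap beginning before the 15-min mark does not count toward failure under this definition, even if it extends past it.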
To verify the computer program that we wrote to detect pulse oximetry data failure, one of the investigators (MD) reviewed (in a blinded fashion) 20 computerized anesthesia records for gaps in pulse oximetry data. This review confirmed that the computer program had correctly identified all gaps in SpO2 data ≥ 10 min in duration that were detected by the investigator. In addition, the computer program correctly identified one pulse oximetry data failure that had been missed by the investigator.
One issue in determining whether an intraoperative variable is a cause of pulse oximetry data failure is that a problem (e.g., hypothermia) may occur after the pulse oximetry data failure and be misconstrued as its cause. In other words, if only summary parameters, such as the minimum temperature for the entire case, were compared with the temperature at the time of failure, an erroneous conclusion might be drawn if the minimum temperature for the case occurred after the failure. To establish whether a causal relationship existed between temperature or blood pressure and pulse oximetry data failure, a modified case-control methodology was applied. The esophageal temperature or mean arterial pressure at the time of the first pulse oximetry data failure was identified in a case. A second case serving as a control (with a similar age, procedure type, and ASA physical status), which had not yet experienced a failure, was selected at random from the database of computerized records. The temperatures or mean arterial pressures from the failure cases and the controls were compared at the time when the failure occurred (cases) and at an identical time in the procedure (controls). A 2 x 2 contingency table was created by classifying each case and each control temperature according to whether it was > 34 °C or ≤ 34 °C. Statistical significance was determined by estimation of the odds ratio from matched pairs. To analyze mean arterial pressures, a 3 x 3 contingency table was created by classifying each case and each control mean arterial pressure according to whether it was < 70 mmHg, 70-110 mmHg, or > 110 mmHg. Conditional logistic regression was used to estimate the odds of pulse oximetry data failure during hypotension and hypertension and to test for their statistical significance.
A multivariate logistic regression model was used to ascertain statistically significant independent predictors of pulse oximetry data failure based on the preoperative parameters of age, ASA physical status, and surgery classification. Another logistic regression analysis was performed to determine whether duration of surgery was an independent predictor of pulse oximetry data failure. All analyses were two-tailed, and P < 0.05 was considered significant.
The proportion of cases with at least one continuous pulse oximetry data failure ≥ 10 min in duration was 9.18%. The mean duration of all failures was 17 min, and the mean duration of anesthesia was 213 min. Table 1 shows the distribution of pulse oximetry data failures by type of surgery.
The cumulative time of missing pulse oximetry data (of any duration > 15 s) as a percentage of the duration of the entire case was calculated. Seventy-four percent of cases had no such gaps in SpO2 data. In 10% of cases, missing pulse oximetry data accounted for ≥ 1.41% of the entire case; in 5% of cases, ≥ 3.45%; and in 1% of cases, ≥ 19.48%.
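The cumulative-gap statistic above reduces to a simple proportion over the 15-s samples: total missing SpO2 time divided by case duration. A minimal sketch, with illustrative values:

```python
# Sketch of the cumulative-gap statistic: total missing SpO2 time as a
# percentage of case duration. Samples are at 15-s intervals; None marks
# a missing data point. Values below are illustrative, not study data.
def missing_percentage(spo2_samples):
    """Percentage of the case during which SpO2 data were missing."""
    if not spo2_samples:
        return 0.0
    missing = sum(1 for v in spo2_samples if v is None)
    return 100.0 * missing / len(spo2_samples)

# A 100-sample case (25 min) with 5 missing samples -> 5.0%.
case = [97] * 95 + [None] * 5
print(missing_percentage(case))  # 5.0
```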
The analysis of the influence of intraoperative hypothermia (esophageal temperature < 34 °C) on pulse oximetry data failure was based on a modified (nested) case-control comparison. There were 1,142 cases with at least one pulse oximetry data failure, each paired with a control case with matching demographics in which a pulse oximetry data failure had not yet occurred. There were more instances in which hypothermia in the failure case was paired with normothermia in the control (196 instances) than the opposite scenario (131 instances). The estimated odds ratio for pulse oximetry data failure due to hypothermia was 1.5 (95% confidence interval 1.2-1.9, P < 0.001), indicating that hypothermia (esophageal temperature < 34 °C) is a significant risk factor for pulse oximetry data failure.
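The reported odds ratio and confidence interval can be reproduced from the discordant pair counts (196 and 131) using the standard matched-pair estimator OR = b/c with a Wald confidence interval on the log scale. This is an illustrative sketch of the standard formula, not the authors' actual software:

```python
import math

# Matched-pair (McNemar-type) odds ratio from discordant pairs:
# b = failure case hypothermic / control normothermic; c = the reverse.
def matched_pair_or(b, c, z=1.96):
    """Return (odds ratio, lower CI, upper CI) for matched pairs."""
    or_hat = b / c
    se = math.sqrt(1 / b + 1 / c)  # SE of the log odds ratio
    lo = math.exp(math.log(or_hat) - z * se)
    hi = math.exp(math.log(or_hat) + z * se)
    return or_hat, lo, hi

or_hat, lo, hi = matched_pair_or(196, 131)
print(round(or_hat, 1), round(lo, 1), round(hi, 1))  # 1.5 1.2 1.9
```

Running this on the discordant counts reported above reproduces the published estimate of 1.5 (95% CI 1.2-1.9).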
The extremes of intraoperative blood pressure were defined as mean arterial pressure < 70 mmHg (hypotension) and mean arterial pressure > 110 mmHg (hypertension). The analysis of the influence of these extremes on pulse oximetry data failure used a similar modified (nested) case-control comparison. There were 1,498 cases with at least one pulse oximetry data failure paired with control cases with matching demographics in which a pulse oximetry data failure had not yet occurred. Conditional logistic regression was used to estimate the odds of pulse oximetry data failure during hypotension (estimated odds ratio 2.9, 95% confidence limits 2.2-3.7, P < 0.001) and hypertension (odds ratio 1.9, 95% confidence limits 1.6-2.4, P < 0.001). This indicates that hypertension and hypotension were both risk factors for pulse oximetry data failure.
The results of the multivariate logistic regression used to analyze preoperative parameters are shown in Table 2. Pulse oximetry data failures were statistically significantly more common in patients who were ASA physical status 3, 4, or 5 and in those undergoing orthopedic, vascular, and cardiac surgery. Age 51-70 yr, age 71-99 yr, and abdominal surgery were not associated with significantly higher rates of pulse oximetry data failure. To determine whether duration of the procedure had an indirect influence on these findings, the logistic regression analysis was repeated separately for cases lasting < 4 h, 4-6 h, and 6-12 h, respectively. The results were substantially the same in each, except that abdominal surgery replaced cardiac surgery as a significant predictor for failure in the short-duration (< 4 h) cases, and age 51-70 yr was associated with a significantly lower risk in this group. The only other change was the absence of an association between failure and vascular surgery in the cases lasting 6-12 h.
The preceding analysis indicated that certain preoperative factors influenced the likelihood of at least one pulse oximetry data failure, regardless of the duration of the case. It is also of interest to determine whether the likelihood of a pulse oximetry data failure increases in cases of longer duration independently of preoperative factors. To exclude the possibility that hypothermia might be a confounding factor in cases of longer duration, the 915 cases in which hypothermia (< 34 °C) occurred were excluded from the following analysis. The original multivariate analysis was repeated on the reduced group of 8,288 cases, both without and with the addition of a variable measuring case duration beyond 2 h. The results for the preoperative factors were virtually the same as shown in Table 2, and the duration variable was an independent predictor of pulse oximetry data failure (P < 0.001); the estimated odds of a failure increased by a factor of 1.3 for each hour that the case extended beyond 2 h.
Anesthesia records could contain comments regarding the cause of a pulse oximetry data failure, and a record could have included more than one comment if there was more than one failure. The incidence of these comments for each individual occurrence of a failure was 17%. Arterial blood gas analysis results were recorded within 30 min of a pulse oximetry data failure in 21% of cases.
The incidence of pulse oximetry data failure decreases as the criterion for defining pulse oximetry data failure is increased from 10 through 30 min. The incidence of at least one continuous pulse oximetry data failure ≥ 10 min in duration was 9.2%, ≥ 15 min was 6.6%, and ≥ 30 min was 2.0%.
In this retrospective study, no instance of outright pulse oximeter instrument failure could be identified. However, thousands of computerized anesthesia records documented periods during which the pulse oximeter failed to provide data.
Over the last decade, pulse oximetry has become a standard for basic intraoperative monitoring. This trend has been criticized by Keats based on the lack of objective outcome data. Pulse oximetry has been accepted overwhelmingly on empirical grounds, however, because it affords a decrease in professional liability insurance premiums and conforms with the national trend toward the adoption of practice parameters. The Harvard Anesthesia Monitoring Standards published in 1986 included pulse oximetry as one possible monitor of circulation with the stipulation that "brief interruptions of the continuous monitoring may be unavoidable." In 1989-90, the Departments of Health of the states of New York and New Jersey adopted modifications of the Harvard Standards as part of their State Health Codes.*,** The New York regulation requires that the patient's oxygenation be monitored "continuously" with pulse oximetry or superior technology. Thus, in New York, it would be a violation of state health regulations to conduct general anesthesia without pulse oximetry or continuous intraarterial hemoglobin saturation monitoring (such as by an optode), and there is no provision written in the health code for pulse oximetry data failure.
The Harvard Anesthesia Monitoring Standards also do not address the possibility of pulse oximetry data failure. The ASA Standards for Basic Intraoperative Monitoring*** state, "Under extenuating circumstances, the responsible anesthesiologist may waive the [monitoring of oxygenation] . . . requirement" but recommend, "that when this is done, it should be so stated (including the reasons) in a note in the patient's medical record."
A study of pulse oximetry data failure at four University of Washington Hospitals reported that the incidence of failures > 30 min in duration ranged from 0.56% to 4.24% (mean 1.12%) of 11,046 procedures. The risk of failure was increased in association with ASA physical status 3, 4, or 5, advanced age (50-60 yr), and prolonged surgery. A large prospective study from Denmark reported a mean pulse oximetry data failure rate of 2.5% and as high as 7.2% in ASA physical status 4 patients. In the current study, the proportion of computerized anesthesia records with pulse oximetry data failures was considerably higher (9.18%).
Much of the difference in pulse oximetry data failure rates among the studies may be related to the definition of a pulse oximetry data failure. For example, a subset of our data (April 1994-September 1994) yielded a failure rate of 2.2% when failure was defined as an absence of pulse oximetry data for at least one continuous interval ≥ 30 min, and a rate of 2.9% when a cumulative period ≥ 30 min was used. In the University of Washington Hospitals study, failure (1.12% overall) was defined as an absence of pulse oximetry data on handwritten records for a cumulative period ≥ 30 min. It is the authors' opinion that a continuous or cumulative 30-min hiatus is an extremely long interval during a period when continuous monitoring is mandated, and that the potential for cerebral injury is present after much shorter periods of actual hypoxemia. In the study from Denmark, failure was defined as the temporary (1.6%) or complete (0.4%) abandonment of oximeter use in the operating room, and no durations were specified. The overall 2.5% failure incidence reported in that study includes failures in both the operating room and the postanesthesia care unit.
The predictors of pulse oximetry data failure identified in the current study are ASA physical status 3, 4, and 5, type of surgical procedure, hypothermia, hypotension, hypertension, and duration of procedure. The current investigation did not address the incidence of pulse oximetry data failures in patients younger than 13 yr. The incidence of failure, however, was not found to be higher in older patients (51-70 or 71-99 yr). Our findings that patients with more severe systemic disease (ASA physical status 3, 4, and 5) and those undergoing longer procedures had higher pulse oximetry data failure rates are in agreement with those of others.
Hypotension and hypothermia are both associated with poor peripheral perfusion and have been shown previously to be etiologies of pulse oximetry data failure in patients undergoing cardiopulmonary bypass. Furthermore, in a study of awake individuals whose limbs were exposed to cold, there were significant reductions in finger arterial pressure and plethysmographic pulse wave amplitude in both normal volunteers and patients with Raynaud's phenomenon. The association between intraoperative hypertension and pulse oximetry data failure is, to the authors' knowledge, a new finding. The authors can only speculate that this may represent a vasoconstrictive phenomenon; a literature search on the subject did not yield any pertinent material.
Pulse oximetry depends on the adequacy of the peripheral circulation and is constrained by the limitations of the present technology. The underlying physiologic factors that may lead to the pulse oximetry data failures detected in this study include diminished peripheral blood flow due to sympathetic nervous system-induced vasoconstriction, hypovolemia, hypothermia, and vascular reactivity. Technical factors include the algorithms used by the oximeter's software and displacement of the probe. Paradoxically, it is the subset of patients in whom pulse oximetry data failure is most likely who would probably benefit most from continuous monitoring of arterial hemoglobin saturation.
The current study has several limitations. The retrospective design did not permit the authors to identify what efforts were made by individual anesthesiologists to restore pulse oximetry recordings. There is also no way to determine whether the data collected might have been inaccurate because of low signal amplitude, administration of intravenous dyes, or dyshemoglobinemias. The type of pulse oximeter probes used (various disposable models vs. reusable ones) and the site of probe application could not be determined in this study. In addition, the results of this study are not applicable to cases of very short duration or to pediatric patients.
The New York State Health Code regulations stipulating that continuous monitoring of oxygenation is required are problematic given the high incidence of pulse oximetry data failure. The available alternatives to pulse oximetry are generally not practical and probably not cost-effective. Miniaturized Clark (oxygen) electrodes have been developed to measure oxygen tension noninvasively through the skin or the conjunctiva but are rarely used clinically. Intraarterial oxygen optical fluorescence quenching (optode) probes are commercially available but are expensive (approximately $300 per single-use probe). The cost and risk of intraarterial probes for continuous monitoring in approximately 9% of all patients undergoing anesthesia are not justified by any data.
Anesthesiologists should be aware of the association of demographic risk factors and of intraoperative hypothermia, hypotension, and hypertension with pulse oximetry data failure. When pulse oximetry data are absent, other parameters should be used to confirm that the patient is adequately oxygenated during anesthesia. In the current regulatory and medicolegal environment, the practitioner should consider documenting that more invasive methods of arterial hemoglobin saturation monitoring are not clinically indicated.
In conclusion, the proportion of anesthesia records with at least one pulse oximetry data failure ≥ 10 min was 9.18%, based on review of computerized anesthesia records that acquired all pulse oximetry data, including data derived from low-amplitude signals. The predictors of pulse oximetry data failure were ASA physical status 3, 4, or 5; cardiac, vascular, and orthopedic surgery; hypothermia; hypotension; hypertension; and duration of procedure.
*New York State Hospital Code. Section XII 405.13 B.2.c.iv., 1990.
**New Jersey Register. 21 NJR 504, February 21, 1989.
***American Society of Anesthesiologists: Standards for Basic Intraoperative Monitoring. Park Ridge, American Society of Anesthesiologists, 1990.