Performance of a Large Language Model on the Anesthesiology Continuing Education Exam
As of August 2023, more than 180 million people have used the large language model Chat Generative Pre-Trained Transformer (ChatGPT).1 There is emerging evidence that large language models can solve novel complex problems in a humanlike way.2 If proven accurate, large language models such as ChatGPT could facilitate real-time perioperative decision-making by giving anesthesiologists rapid access to evidence-based responses to clinical queries. ChatGPT performed “at or near the passing threshold” for the United States Medical Licensing Examination (USMLE) and displayed evidence of deductive reasoning.3 ChatGPT also performed well on board examination questions in various medical specialties, including neurosurgery and neonatology.4,5 While questions from the American Board of Anesthesiology (Raleigh, North Carolina) are not publicly available, previous researchers have used questions from a board examination preparation book and found that ChatGPT also performs satisfactorily on these anesthesiology questions.6 One notable concern of large...
Anesthesiology Continuing Education (ACE) is copyrighted by the American Society of Anesthesiologists (ASA; Schaumburg, Illinois) and was used with permission from ASA for this study. Guidance on how to submit a permission request to use ASA’s copyrighted material is available on ASA’s website at https://www.asahq.org/research-and-publications/publications/requests-to-reprint-asa-publications.
This research was presented at the annual meeting of the American Society of Anesthesiologists in Philadelphia, Pennsylvania, October 18 to 22, 2024.
(Accepted for publication July 10, 2024.)
Vardaan Gupta, Yang Gu, Stewart J. Lustik, Won Park, Shichen Yin, Daniel Rubinger, Francis M. Chang, Kunal Panda, Soroush Besharat, Hamza Sadhra, Laurent G. Glance; Performance of a Large Language Model on the Anesthesiology Continuing Education Exam. Anesthesiology 2024; 141:1196–1199 doi: https://doi.org/10.1097/ALN.0000000000005181