“Hand et al. demonstrated what Osler chided us to recognize long ago: physicians learn best by doing.”

Image: J. P. Rathmell.

WILLIAM Osler, M.D. (1849–1919; Physician-in-Chief, Johns Hopkins Hospital and Professor of Medicine, Johns Hopkins University School of Medicine, Baltimore, Maryland), the Canadian physician often called the “Father of Modern Medicine,” has been credited with the creation of 20th century medical education. Arguably Osler’s most radical and important contribution to medical education was his belief that lecture halls were necessary but not sufficient to form great physicians. Applying adult education theory, that is, making learning relevant, Osler created the medical clerkship and sleep-in residency systems, bringing undergraduate and graduate medical students out of the lecture hall very early in their training. He recognized that applied learning was far superior to memorizing facts heard in the lecture hall.

Curiously, medical students, residents, and fellows still spend countless hours in the lecture hall and similar passive didactic sessions. Continuing Medical Education programs, although more interactive than ever before, still have a long way to go in this regard. One need only view the American Society of Anesthesiologists’ Annual Meeting syllabus to appreciate how many passive lecture hours are included in the program. In anesthesiology residency and fellowship programs, as in other specialties, there is an expectation that educational time should be “protected” from clinical duties, allowing trainees to attend didactic sessions instead. Those of us working tirelessly to ensure the education of the next generation of physicians are left thinking, “Is there no better way to do this?”

In this vein, we were excited to read the study by Hand et al.1 in this month’s Anesthesiology: “Effect of a cognitive aid on adherence to perioperative assessment and management guidelines for the cardiac evaluation of noncardiac surgical patients.” The authors designed an electronic smartphone “app” decision support tool for anesthesiologists. They demonstrated that interacting with the decision support tool in an “open book” method of learning produced applied learning that both improved adherence to recommended patient care guidelines and decreased healthcare costs. The effect of the new tool on the management of patients in the simulated scenarios did not diminish with subsequent practice; that is, practicing caring for patients appropriately using the “open book” tool resulted in improved and long-lasting care of subsequent patients. Hand et al. demonstrated what Osler chided us to recognize long ago: physicians learn best by doing.

This effect is particularly important given that all study participants were offered a didactic lecture about the American College of Cardiology/American Heart Association guidelines 2 weeks before testing. Furthermore, additional years of training, that is, previous practice making similar clinical decisions, had no effect on how efficiently the anesthesiologists learned the clinical knowledge or how effectively they applied it in patient care.

The conclusions of the study by Hand et al. add to a growing body of evidence that our historically based medical education models need revision. Bloom’s (Benjamin Samuel Bloom, Ph.D., 1913–1999; Educational Psychologist and Professor of Education, University of Chicago, Chicago, Illinois) classic taxonomy of learning teaches us that factual knowledge is the lowest level of cognition.2  In anesthesiology perhaps more than in many other medical specialties, knowing/remembering/memorizing the right thing to do for a test does not necessarily correlate with doing/applying the right thing in real-time patient care.

Case-based written assessments and newer forms of paper- or computer-based knowledge examinations attempt to test comprehension and synthesis of facts rather than the regurgitation of content that demonstrates no ability to generate an optimal care plan. This is a step in the right direction but leaves an unacceptably large cognitive gap. This gap has been one of the primary drivers behind our specialty’s ongoing investment in expanding the role of simulated patients in both initial certification and Maintenance of Certification in Anesthesiology. Simulation allows an opportunity to observe a physician’s application of knowledge to a patient care scenario. The American Board of Anesthesiology is a champion of this change in approach: it will implement Objective Structured Clinical Examinations in its planned modification of the Board Certification process, and it already includes practice performance assessment in its Maintenance of Certification in Anesthesiology process.

The lack of effect of participants’ level of training in the study by Hand et al. implies that although knowledge presumably improves (based on standardized test results) as trainees progress through our system, application of that knowledge may not. It seems that practice does not “make perfect”; rather, only “perfect practice makes perfect.” In our complex, high-tech, over-extended healthcare system, and particularly in our teaching hospitals, there is a constant balance between student provider autonomy and close teacher supervision. There is an oft-postulated premise that patients suffer if there is insufficient supervision, yet future patients will suffer if excessive supervision reduces the opportunity for learning.

How best, then, to teach and learn safe provider autonomy? One answer: the teaching hospital of today has both the opportunity to provide “perfect practice” via simulation and decision support tools and the imperative to provide consistently safe and standardized patient care. We are intrigued by the “open book” decision support tool methodology of this study. In keeping with actual patient care, trainees in this study were not asked to memorize a series of facts, but rather to use available resources to choose the best care for the patients. Historically, there has been a belief that the best physicians knew the most facts. Clearly, there is benefit when physicians possess a strong medical knowledge foundation.

We also know that the very smart solo provider does not deliver the best care. Care is best provided by medical teams in which excellent communication, well-defined roles, and appropriate use of resources, including decision support tools, lead to improved outcomes. A group of experts does not make an expert team, and smart physicians are not necessarily effective ones. In an environment of exponentially increasing factual medical and anesthesiology knowledge, it has become increasingly important to use decision support tools if one wants to offer patients the best care. The challenge for us as medical educators will be to determine how best to teach (and indeed even what to teach) the next generation of physicians. Countless hours could be spent “talking at” trainees about essential practice guidelines such as those from the American College of Cardiology/American Heart Association, but to what end? Students will likely retain little and correctly apply less.

Fortunately, we will surely find an ever-increasing number of decision support tools available for our patient care activities, and many will be on our smartphones and electronic tablets. Decision support tools similar to that developed by Hand et al. are exemplified by the “Pediatric Critical Events Checklist,”* a compilation of 18 life-threatening scenarios and their accepted treatment protocols; it comprises a vast amount of factual information that, were it not for this readily available decision support tool, might be only a fleeting memory when needed most for a critically ill patient.

Decision support tools are all intended to enhance learning and patient care by providing educational content that can be accessed and applied in real time by anesthesiologists and other clinical team members. We applaud Hand et al. for their creation of this tool and for demonstrating a scientific methodology to validate it. For the future, we encourage the development of a central repository of free “guideline-based” support tools. Had William Osler had access to a smartphone or electronic tablet and high-tech patient simulation equipment, we expect he might have outlawed the memorization of facts spewed forth in lecture halls and instead demanded the opening of decision support tools in simulation as well as in real patient care.

The authors are not supported by, nor maintain any financial interest in, any commercial activity that may be associated with the topic of this article.

1. Hand WR, Bridges KH, Stiegler MP, Schell RM, DiLorenzo AN, Ehrenfeld JM, Nietert PJ, McEvoy MD: Effect of a cognitive aid on adherence to perioperative assessment and management guidelines for the cardiac evaluation of noncardiac surgical patients. Anesthesiology 2014; 120

2. Bloom BS: Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain. New York, David McKay Co Inc., 1956