We write in reply to Dr. Kempen’s recent comments on our Editorial View in the January 2014 issue of Anesthesiology.1  Dr. Kempen is critical of the Objective Structured Clinical Examination (OSCE) and of our lack of transparency, as authors of this opinion piece, in revealing our direct relationship with the American Board of Anesthesiology (ABA). We apologize for the incorrect disclosure statement that assured readers we had no conflicts of interest in preparing the piece, as we most assuredly do. As Dr. Kempen points out, Drs. Rathmell and Lien receive an annual stipend for their work as ABA Directors, and Dr. Harman, as the ABA’s Chief Assessment Officer, is an ABA employee. Although we were listed as being affiliated with the ABA on the first page of the editorial, we should have been far more explicit in directing readers to this conflict: we are among the leaders who oversee all activities of the ABA, including the new OSCE examinations.

With that said, let us turn to Dr. Kempen’s criticism of the OSCE itself, which he calls “costly” and “time consuming” and says “present[s] difficulty in development and evaluation … regarding quality, safety, and validity.” Our rationale for preparing the article was to describe, for diplomates, examination candidates, and the public, the careful deliberations that went into the ABA’s decision to incorporate the OSCE into primary board certification for anesthesiologists, and to detail the challenges that lie ahead in assuring that the new examination is valid and adds value for diplomates and their patients alike. We took pains to be transparent in our reasoning and to remain self-critical, and we addressed many of the very criticisms with which Dr. Kempen seems to agree.

The introduction of OSCEs can be used to drive education; in U.S. medical schools, the addition of clinical skills assessment as a requirement for licensure has positively shaped education, assuring that physicians emerge with the skills that are fundamental to patient care.2  In addition, as detailed in the editorial,1  many studies have documented the validity of the OSCE as an additional measure of professionalism, communication, and clinical practice, all of which are difficult to evaluate with a computer-based examination alone. OSCEs are already widely used for licensure, for training in medical schools, and for certification in other countries. And we are quite aware of the challenges associated with developing reliable and valid OSCEs: an advisory panel and a survey of examiners, program directors, chairs, and leaders of large private practice organizations all weighed in, describing areas in which they felt new graduates were not adequately trained, and these areas are driving the initial content of the new OSCEs.

Dr. Kempen’s primary focus seems to be the cost of the examination. He cites an opinion piece that ran in the New England Journal of Medicine in 2013 and questioned the value of the U.S. Medical Licensing Examination Clinical Skills exam on purely economic grounds.3  However, OSCEs need not be expensive. In fact, with the Accreditation Council for Graduate Medical Education requirement for many different varieties of evaluation, including those based on direct observation and case discussion, a type of informal OSCE will be a routine part of residency training. Finally, from our very first discussions about adding OSCEs to the board certification examinations, one of the guiding principles adopted by the ABA Directors has been that the cost of certification must not increase as we introduce the new OSCE examinations.


1. Rathmell JP, Lien C, Harman A: Objective structured clinical examination and board certification in anesthesiology. Anesthesiology 2014; 120:4–6
2. First LR, Chaudhry HJ, Melnick DE: Quality, cost, and value of clinical skills assessment. N Engl J Med 2013; 368:963–4
3. Lehman EP IV, Guercio JR: The Step 2 Clinical Skills exam—A poor value proposition. N Engl J Med 2013; 368:889–91