Introduction: Evaluating clinical medical students is challenging, and most clinical course directors rely on a variety of assessment tools. We developed an oral examination to add to the mix of evaluation tools used for medical students on an internal medicine clerkship.

Methods: Materials include detailed directions for developing and implementing a standardized oral examination that assesses clinical reasoning, along with example examinations that test students’ knowledge of 18 common internal medicine symptoms.

Results: Analysis suggests that we fulfilled the goals we set. Our most important goal was to direct students to study clinical reasoning. We assessed this by surveying students about their study resources the year before we instituted the oral exam and the year we included it in the clerkship. Problem-based texts, which tend to focus on clinical reasoning, became more widely used, whereas board review books were used less; use of disease-based reference texts did not change. Our second goal was to develop an exam that was standardized and objective in its evaluation of student performance. The exam proved reproducible: in a year in which two examiners independently scored the students, examiner agreement was high, with an average disagreement of only 0.55 ± 0.72 points on a 16-point scale, and there were no disagreements as to whether a student passed or failed among the 104 examinations administered. The exam appeared to differentiate students with strong clinical reasoning ability from those who needed work, and it correlated reasonably well with students’ clinical performance and their performance on the USMLE subject exam. Students viewed the exam favorably: in course evaluations, they rated it as fairly administered (4.5 ± 1.1 on a 5-point scale) and felt it covered the material taught in the clerkship better than the written shelf exam did (4.8 vs. 3.7, p = 0.0001).
A number of students wrote that they found taking the exam to be an educational experience.

Discussion: We were pleased that the oral exam gave course directors an opportunity to interact with every student in a concentrated evaluation of medical knowledge and clinical reasoning skills. This proved useful in later discussions of student performance, as the course directors were never left relying solely on other people’s assessments of students. Having developed a reproducible and standardized oral exam, our next step is to determine the exam’s reliability and its ability to evaluate clinical reasoning skills. To the extent that clinical reasoning skills are content dependent, we need to ensure that students who perform well on one case would perform equally well on other cases.
- Teach course directors how to develop standardized oral examinations that test clinical reasoning.
- Produce exams that positively influence students’ studying, focusing their efforts on mastering clinical reasoning skills.
- Produce exams that are standardized and objective in their evaluation of student performance.
- Produce exams that are acceptable to students.
- Produce exams that provide information about the strengths and weaknesses of students’ clinical reasoning skills that could be used for future curricular development.
This is an open-access article distributed under the terms of the Creative Commons Attribution license.