Introduction: Clinical reasoning is the complex cognitive process that drives the diagnosis of disease and treatment of patients. There is a national call for medical educators to develop clinical reasoning curricula in undergraduate medical education. To address this need, we developed a longitudinal clinical reasoning curriculum for internal medicine clerkship students.

Methods: We delivered six 1-hour sessions to approximately 40 students over the 15-week combined medicine-surgery clerkship at Penn State College of Medicine. We developed the content using previous work in clinical reasoning, including the American College of Physicians’ Teaching Medicine Series book Teaching Clinical Reasoning. Students applied a clinical reasoning diagnostic framework to written cases during each workshop. Each session followed a scaffolded approach and built upon previously learned clinical reasoning skills. We administered a pre- and postsurvey to assess students’ baseline knowledge of clinical reasoning concepts and perceived confidence in performing clinical reasoning skills. Students also provided open-ended responses regarding the effectiveness of the curriculum.

Results: The curriculum was well received by students and led to increased perceived knowledge of clinical reasoning concepts and increased confidence in applying clinical reasoning skills. Students commented on the usefulness of practicing clinical reasoning in a controlled environment while utilizing a framework that could be deliberately applied to patient care.

Discussion: The longitudinal clinical reasoning curriculum was effective in reinforcing key concepts of clinical reasoning and allowed for deliberate practice in a controlled environment. The curriculum is generalizable to students in both the preclinical and clinical years.
- Identify the key concepts and terminology pertaining to clinical reasoning, including metacognition, dual process theory, illness scripts, and cognitive biases.
- Apply clinical reasoning skills to patient care, including data organization, development of summary statements, problem representation, and prioritized differential diagnosis.
- Recognize common cognitive biases encountered in clinical practice through the practice of metacognition.
Clinical reasoning is the complex process that drives the diagnosis and treatment of patients.1 Errors in diagnostic reasoning are estimated to occur in up to 15% of patient encounters and may result in significant morbidity and mortality.2 Clinical reasoning is also key to the development of entrustable professional activities for entering residency.3 Furthermore, among medical students and residents who require remediation, most academic difficulties stem from deficits in clinical reasoning rather than from gaps in medical knowledge.4
Recent work showed that medicine clerkship directors believe clinical reasoning curricula should be taught in both the preclinical and clinical years and that most students have only a fair or poor understanding of clinical reasoning.5 Despite this, fewer than half of surveyed medical schools have structured clinical reasoning curricula.5 To date, many published clinical reasoning curricula consist of single sessions and have largely been developed for students in the preclinical years.6-9
A review of clinical reasoning teaching strategies suggests that knowledge-oriented, case-based teaching is effective at improving the diagnostic accuracy of learners, whereas process-oriented teaching alone may be less effective.10 There is a paucity of data regarding the effectiveness of combining these approaches, especially during the clinical years. Furthermore, few studies have evaluated the effectiveness of longitudinal clinical reasoning curricula. We aimed to incorporate both knowledge-oriented, case-based teaching and process-oriented teaching to develop a longitudinal clinical reasoning curriculum for internal medicine clerkship students.
We delivered the curriculum in six 1-hour sessions, every other week, to approximately 40 students over the 15-week combined medicine-surgery clerkship. We held sessions in an auditorium with space for 50 students, along with a separate room for breakout sessions. Required equipment included a computer and projector for PowerPoint slides. A single facilitator (Nicholas S. Duca), a general internist, junior faculty member, and associate clerkship director of the internal medicine clerkship with experience in small- and large-group facilitation, delivered each session.
We developed the content from a variety of resources, including the American College of Physicians’ Teaching Medicine Series book Teaching Clinical Reasoning1 and other published works.11,12 Prior to each session, we distributed a printed clinical reasoning worksheet (Appendix A) for students to apply to written cases. The framework was as follows:
- Step 1: identifying the presenting complaint.
- Step 2: identifying key clinical findings.
- Step 3: crafting a summary statement.
- Step 4: creating a problem list.
- Step 5: developing a differential diagnosis.
- Step 6: finalizing a plan.
Each session featured its own instructor guide (Appendices B-G) to accompany its PowerPoint slide set (Appendices H-M), as well as (in the second through sixth sessions) a clinical case provided to students (Appendices N-R). We also developed content summaries of each session for students to review in the days before the session (Appendices S-X). These summaries introduced a flipped-classroom element and served as a student guide for clinical decision-making.
The first session introduced key topics such as dual process theory, hypothetico-deductive reasoning, Bayesian reasoning, and the clinical reasoning framework. We utilized a scaffolded approach, with the remaining five sessions building upon previous content. After a large-group introduction to the topic of the day, students divided into small groups to work through written cases and complete the clinical reasoning worksheet. Students then reported their conclusions from the small-group discussions, and key concepts were debriefed as a large group. In the third session, we included an activity that prompted students to evaluate and correct previously written summary statements; a facilitator guide accompanied these student cases (Appendices Y & Z).
We developed a pre- and postsurvey (Appendix AA) to assess students’ baseline knowledge of clinical reasoning concepts (dual process theory, hypothetico-deductive reasoning, biases, metacognition, heuristics, Bayesian reasoning, and knowledge chunking) and perceived confidence in performing specific clinical reasoning skills (recognizing illness scripts, using semantic qualifiers, identifying key clinical findings, formulating problem lists and summary statements, developing a differential diagnosis, and recognizing bias). We utilized a scale on which 1 indicated poor understanding or confidence, 5 indicated full understanding or confidence, and 0 indicated that the student had never heard of the concept or skill. We analyzed pre- and postcurriculum data using Wilcoxon signed-rank tests to compare paired pre- and posttest responses. We also gathered qualitative data from an end-of-course evaluation to obtain student perceptions and critiques of the curriculum.
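The paired analysis described above can be illustrated with a short sketch. This is not the authors' analysis code; it implements a standard Wilcoxon signed-rank test (normal approximation) on hypothetical pre/post Likert responses (0-5 scale) for a single domain, using only the Python standard library.

```python
# Illustrative sketch (not the study's code): Wilcoxon signed-rank test
# on hypothetical paired pre/post survey responses for one domain.
import math

def wilcoxon_signed_rank(pre, post):
    """Return (W, two-sided p) using the normal approximation."""
    # Drop zero differences, per the standard Wilcoxon procedure
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    n = len(diffs)
    # Rank absolute differences, averaging ranks within tie groups
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average of ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    w = min(w_plus, w_minus)
    # Normal approximation for the null distribution of W
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    p = math.erfc(abs((w - mean) / sd) / math.sqrt(2))  # two-sided
    return w, p

# Hypothetical paired responses for 12 students (not the study data)
pre  = [1, 0, 2, 1, 0, 1, 2, 0, 1, 1, 0, 2]
post = [3, 2, 3, 4, 2, 2, 4, 1, 3, 3, 2, 4]
w, p = wilcoxon_signed_rank(pre, post)
print(f"W = {w}, p = {p:.4f}")
```

In practice `scipy.stats.wilcoxon` would be used instead of a hand-rolled implementation; the sketch only makes the paired, rank-based nature of the comparison explicit.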
The response rate for the pre- and posttest survey was 60% (24 of 40 students completed both pre- and posttest surveys). Cronbach’s alphas for perceived knowledge questions (numbers 1-7) and perceived confidence questions (numbers 8-14) were .66 and .84, respectively. The mean response for perceived knowledge of clinical reasoning skills significantly increased in all seven domains:
- Dual process theory: pretest M = 3.29, posttest M = 4.21, p = .001.
- Hypothetico-deductive reasoning: pretest M = 1.46, posttest M = 3.88, p < .001.
- Biases: pretest M = 3.50, posttest M = 4.42, p < .001.
- Metacognition: pretest M = 2.75, posttest M = 3.83, p < .001.
- Heuristics: pretest M = 1.54, posttest M = 2.88, p < .001.
- Bayesian reasoning: pretest M = 0.67, posttest M = 2.54, p < .001.
- Knowledge chunking: pretest M = 1.21, posttest M = 3.58, p < .001.
In addition, the mean response for perceived confidence in performing clinical reasoning skills significantly increased in all seven domains:
- Illness scripts: pretest M = 1.83, posttest M = 4.29, p < .001.
- Semantic qualifiers: pretest M = 3.67, posttest M = 4.58, p < .001.
- Key clinical findings: pretest M = 3.67, posttest M = 4.29, p = .011.
- Problem lists: pretest M = 3.67, posttest M = 4.13, p = .028.
- Summary statements: pretest M = 3.25, posttest M = 4.42, p < .001.
- Differential diagnosis: pretest M = 2.75, posttest M = 4.17, p < .001.
- Recognizing bias: pretest M = 2.79, posttest M = 4.00, p < .001.
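The Cronbach's alpha values reported above can be computed directly from an item-response matrix. The sketch below is illustrative only (the response matrix is hypothetical, not the study data) and uses only the Python standard library.

```python
# Illustrative sketch (not the study's code): Cronbach's alpha for a
# block of survey items, as reported for the knowledge (items 1-7) and
# confidence (items 8-14) question blocks.
def cronbach_alpha(items):
    """items: one list of responses per question, all the same length."""
    k = len(items)            # number of items
    n = len(items[0])         # number of respondents

    def var(xs):              # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    return k / (k - 1) * (1 - item_var / var(totals))

# Hypothetical responses: 7 items x 6 respondents on the 0-5 scale
items = [
    [3, 4, 2, 5, 3, 4],
    [3, 5, 2, 4, 3, 4],
    [2, 4, 1, 5, 3, 3],
    [3, 4, 2, 4, 2, 4],
    [4, 5, 3, 5, 3, 5],
    [2, 3, 1, 4, 2, 3],
    [3, 4, 2, 5, 3, 4],
]
print(f"alpha = {cronbach_alpha(items):.2f}")
```

Values near 1 indicate that the items in a block move together across respondents; the .66 and .84 figures above suggest acceptable internal consistency for the knowledge block and good consistency for the confidence block.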
We gathered students’ open-ended responses to the question “Please describe the aspects of the course that you found to be most helpful for the development of your clinical reasoning.” Students commented on the usefulness of having a framework to approach clinical cases, of having a structured environment to approach clinical reasoning, and of hearing other students and faculty think aloud. Representative responses included the following:
- “[A] systematic format for going through cases [was most useful]. Appropriate pace to allow time to think. [It] forced us to think about the ‘why’ not the ‘what’!”
- “I liked working through cases [using] team-based learning . . . practicing what we do on rounds but in a structured environment.”
- “I always love talking through cases. I think it’s helpful to see other students’ and faculty’s summary statements.”
The longitudinal clinical reasoning curriculum, which incorporated both knowledge-oriented and process-based instruction, was well received by clerkship students and led to increased perceived knowledge and confidence in utilizing clinical reasoning terminology and skills. Based on the qualitative data obtained through the end-of-curriculum evaluation, students found think-aloud techniques to be useful for the development of clinical reasoning and appreciated hearing the thought processes of other classmates and attending physicians. Think-aloud teaching is supported in the clinical reasoning literature as an effective modeling technique for student and resident learning.11,13
Students commented on the utility of practicing clinical reasoning in a structured environment and appreciated having a clinical reasoning framework to apply to patient care. Published curricula have demonstrated the effectiveness of clinical reasoning frameworks for preclinical and clinical students.8,14 Furthermore, students reported that they did not always have the opportunity to practice or communicate their clinical reasoning during patient care activities, making this classroom-based activity a useful venue for practice.
A limitation of this curriculum is that delivery of the content relied on faculty members with specific knowledge of clinical reasoning concepts. We hope to implement faculty development in teaching clinical reasoning to promote expertise. In addition, one faculty member delivered all of the sessions, which may not be feasible at all institutions and likely contributed to student comments that some sessions felt repetitive. In subsequent iterations of this curriculum, guest speakers, including academic hospitalists and general internists, were invited to discuss their approach to clinical reasoning; these sessions were highly rated by students. Recent work has shown the effectiveness of this format in teaching and modeling clinical reasoning, and we hope to incorporate example-based learning in future sessions.15
A second limitation is that our assessment tool measured student perceptions rather than objective clinical reasoning performance. Options for objective measurement in future iterations of the curriculum include observed structured simulated cases and tools for assessing diagnostic accuracy such as the script concordance test or key-features examination.16-19
Overall, this longitudinal clinical reasoning curriculum was an effective means of communicating key concepts of clinical reasoning to students and allowed for deliberate practice of clinical reasoning in a controlled environment. The curriculum is generalizable to students in the preclinical years as well as to students in the clinical years within any clerkship. Furthermore, the framework utilized in this curriculum can be adapted for graduate medical education, and the cases can also be adapted to reflect more complex patient presentations.
Thank you to Melissa McNeil, MD, MPH, Eliana Bonifacino, MD, Deborah DiNardo, MD, Dan Wolpaw, MD, Paul Haidet, MD, MPH, Jed Gonzalo, MD, MSc, Alan Adelman, MD, and Nancy Adams, MLIS, EdD, for their contributions to the curriculum. Thank you to Erik Lehman for statistical analysis.
None to report.
None to report.
Duca N, Glod S. Bridging the gap between the classroom and the clerkship: a clinical reasoning curriculum. Presented at: Alliance for Academic Internal Medicine: Academic Internal Medicine Week; March 18-21, 2018; San Antonio, TX.
The Human Subjects Protections Office at Penn State College of Medicine approved this study.
- Trowbridge RL, Rencic JJ, Durning SJ, eds. Teaching Clinical Reasoning. Philadelphia, PA: American College of Physicians; 2015.
- Graber ML. The incidence of diagnostic error in medicine. BMJ Qual Saf. 2013;22(suppl 2):ii22-ii27. https://doi.org/10.1136/bmjqs-2012-001615
- The Core Entrustable Professional Activities (EPAs) for entering residency. Association of American Medical Colleges website. https://www.staging.aamc.org/initiatives/coreepas. Published 2014. Accessed December 17, 2017.
- Audétat M-C, Laurin S, Dory V, Charlin B, Nendaz MR. Diagnosis and management of clinical reasoning difficulties: Part I. Clinical reasoning supervision and educational diagnosis. Med Teach. 2017;39(8):792-796. https://doi.org/10.1080/0142159X.2017.1331033
- Rencic J, Trowbridge RL Jr, Fagan M, Szauter K, Durning S. Clinical reasoning education at US medical schools: results from a national survey of internal medicine clerkship directors. J Gen Intern Med. 2017;32(11):1242-1246. https://doi.org/10.1007/s11606-017-4159-y
- van Gessel E, Nendaz MR, Vermeulen B, Junod A, Vu NV. Development of clinical reasoning from the basic sciences to the clerkships: a longitudinal assessment of medical students’ needs and self‐perception after a transitional learning unit. Med Educ. 2003;37(11):966-974. https://doi.org/10.1046/j.1365-2923.2003.01672.x
- Weinstein A, Pinto-Powell R. Introductory clinical reasoning curriculum. MedEdPORTAL. 2016;12:10370. https://doi.org/10.15766/mep_2374-8265.10370
- Levin M, Cennimo D, Chen S, Lamba S. Teaching clinical reasoning to medical students: a case-based illness script worksheet approach. MedEdPORTAL. 2016;12:10445. https://doi.org/10.15766/mep_2374-8265.10445
- Durning SJ, LaRochelle J, Pangaro L, et al. Does the authenticity of preclinical teaching format affect subsequent clinical clerkship outcomes? A prospective randomized crossover trial. Teach Learn Med. 2012;24(2):177-182. https://doi.org/10.1080/10401334.2012.664991
- Posel N, McGee JB, Fleiszer DM. Twelve tips to support the development of clinical reasoning skills using virtual patient cases. Med Teach. 2015;37(9):813-818. https://doi.org/10.3109/0142159X.2014.993951
- Houchens N, Harrod M, Fowler KE, Moody S, Saint S. How exemplary inpatient teaching physicians foster clinical reasoning. Am J Med. 2017;130(9):1113.e1-1113.e8. https://doi.org/10.1016/j.amjmed.2017.03.050
- Bordage G. Elaborated knowledge: a key to successful diagnostic thinking. Acad Med. 1994;69(11):883-885. https://doi.org/10.1097/00001888-199411000-00004
- Pottier P, Hardouin J-B, Hodges BD, et al. Exploring how students think: a new method combining think‐aloud and concept mapping protocols. Med Educ. 2010;44(9):926-935. https://doi.org/10.1111/j.1365-2923.2010.03748.x
- Strowd R, Kwan A, Cruz T, Gamaldo C, Salas R. A guide to developing clinical reasoning skills in neurology: a focus on medical students. MedEdPORTAL. 2015;11:10163. https://doi.org/10.15766/mep_2374-8265.10163
- Dinardo D, Tilstra S, Painter T. Thinking out loud: using principles from “example based learning” in a clinical reasoning case conference format. Paper presented at: Alliance for Academic Internal Medicine: Academic Internal Medicine Week; March 18-21, 2018; San Antonio, TX.
- Durning SJ, Artino A, Boulet J, et al. The feasibility, reliability, and validity of a post-encounter form for evaluating clinical reasoning. Med Teach. 2012;34(1):30-37. https://doi.org/10.3109/0142159X.2011.590557
- DiNardo D, Tilstra S, McNeil M, et al. Identification of facilitators and barriers to residents’ use of a clinical reasoning tool. Diagnosis (Berl). 2018;5(1):21-28. https://doi.org/10.1515/dx-2017-0037
- Charlin B, Roy L, Brailovsky C, Goulet F, van der Vleuten C. The Script Concordance test: a tool to assess the reflective clinician. Teach Learn Med. 2000;12(4):189-195. https://doi.org/10.1207/S15328015TLM1204_5
- Hrynchak P, Takahashi SG, Nayer M. Key-feature questions for assessment of clinical reasoning: a literature review. Med Educ. 2014;48(9):870-883. https://doi.org/10.1111/medu.12509
This is an open-access publication distributed under the terms of the Creative Commons Attribution-NonCommercial-Share Alike license.
Received: August 1, 2018
Accepted: December 23, 2018