Original Publication
Open Access

Struggling Medical Learners: A Competency-Based Approach to Improving Performance

Published: August 15, 2018 | 10.15766/mep_2374-8265.10739


  • Lesson Plan.docx (Appendix A)
  • PowerPoint Slides.pptx (Appendix B)
  • Welcome Script.docx (Appendix C)
  • Student Profile Sheet.docx (Appendix D)
  • Standardized Student Script.docx (Appendix E)
  • Struggling Medical Learner Handout.docx (Appendix F)
  • Evaluation Form.docx (Appendix G)

All appendices are peer reviewed as integral parts of the Original Publication.

To view all publication components, extract (i.e., unzip) them from the downloaded .zip file.


Introduction: Faculty must be trained to recognize, analyze, and provide feedback and resources to struggling medical learners. Training programs must be equipped to intervene when necessary with individualized remediation efforts to ensure learner success. Methods: This 90-minute interactive faculty development workshop provides a foundational competency-based framework for identifying and assisting the struggling medical learner. The workshop uses a mock academic promotions committee meeting addressing the case of a struggling undergraduate learner. The workshop was presented at two regional conferences, and participants completed an anonymous evaluation form containing 10 items on a 5-point Likert scale and two open-ended questions. Data were analyzed and a subgroup analysis performed using an independent t test and correlation. Qualitative data were read and coded for representative themes by two authors. Results: Fifty-five participants completed an evaluation form. The quality of the workshop was high (M = 4.5, SD = 0.6); participants agreed that the learning objectives were achieved and relevant to their educational needs (M = 4.4, SD = 0.7). A significant positive correlation existed between perceived quality and the interactive elements (.65, p < .05) as well as the intention to apply learning (.63, p < .05). Written comments revealed six themes: role-play, resources, interaction with colleagues, modeling, relevant content, and the process of learning. Discussion: The workshop’s quality, relevance, and applicability were rated excellent among medical educators. Participants felt the interactive nature of the workshop was its most useful aspect, and a majority intended to apply the learning to their practice.

Educational Objectives

By the end of the workshop, participants will be able to:

  1. Define struggling medical learner.
  2. Diagnose struggling medical learners’ deficits.
  3. Demonstrate communication strategies to engage struggling medical learners.
  4. Explain a process for delivering feedback to a struggling medical learner.
  5. Describe strategies for enhancing corrective change in a struggling medical learner.
  6. Create an individualized learning plan for a struggling medical learner.


Most trainees struggle at one time or another with some aspect of medical training: medical knowledge, professionalism, work-life balance, interpersonal stress, or even substance abuse or mental health crises. A struggling medical learner is one who is “significantly below performance potential because of a specific affective, cognitive, structural, or interpersonal difficulty.”1 It is imperative that all faculty are trained to recognize and provide appropriate feedback and resources to struggling medical learners throughout their developmental trajectory, especially given the increasing focus on assessing students using a competency- and entrustment-based medical education framework. Early learning struggles can often predict future educational or professionalism difficulties, a situation that underlines the importance of proper identification and timely intervention for struggling learners.2-4

The Accreditation Council for Graduate Medical Education has identified six core competency domains for graduate trainees5 and outlined specialty-specific developmental milestones for graduate-level trainees.6 While there are currently no universal undergraduate milestones, there is a growing interest in the undergraduate medical education/graduate medical education transition to ensure that students have the requisite skills to succeed in residency, as evidenced by the formation and work of the Association of American Medical Colleges Core Entrustable Professional Activities for Entering Residency Pilot Group.7-10

Remediation is correction of an underlying learning deficit; thus, it is not always linked to pass-fail or academic promotions decisions.11 In a true competency-based medical education approach, all learners progress along similar developmental trajectories albeit at differing paces.12 Learners should receive coaching and feedback in a supportive environment that allows them to identify and improve deficiencies and progress toward competence and entrustment using deliberate practice followed by reassessment.12-15 Training programs must have a method for monitoring student progress, identifying learning needs, and coaching all learners along their developmental path.11,16-18 Learners with persistent deficits after coaching may benefit from more traditional interventions (e.g., repeating courses, attending remediation sessions, mandatory clinical experiences, etc.) with targeted and individualized remediation measures to ensure learner success. Often, faculty have difficulty identifying the exact learning deficit(s) of struggling learners and determining the appropriate corrective coaching strategies.11

To ensure the success of every struggling medical learner, each institution must develop and maintain a system for identifying, coaching, and reassessing struggling students.11,16 Institutional structures and policies differ in how remediation programs interface with student affairs, academic promotions committee decisions, and the various faculty roles within each. Although remediation efforts are often successful, they are also resource intensive.19 Given the significant costs in terms of faculty time, student effort, and resources, faculty and academic promotions committees must clearly define at what point remediation efforts are no longer fruitful.11,16

Faculty experience and training in providing feedback are variable. Even faculty who are familiar with feedback models and techniques may still feel uncomfortable targeting feedback to a learner who struggles across multiple competency domains. Competency-based medical education relies on a cadre of faculty who are trained in and comfortable with providing feedback to struggling medical learners.12 Faculty should have a mental framework for struggling learners and be able to confidently discern the specific learning need in order to suggest targeted and effective interventions. To do this, faculty must be able to identify the specific learning need, effectively communicate with the learner about their observations, and use a shared decision-making approach to developing an intervention plan.11 Competency-based frameworks exist11; however, a review of the literature, including publications in MedEdPORTAL, revealed a lack of specific faculty development workshops or tools that help faculty learn and implement these frameworks.

This workshop was created and piloted with a group of master’s (Heather Ridinger, Jamie Cvengros, and Joseph Rencic) and doctoral (Pedro Tanaka and James Gunn) students in the University of Illinois Chicago’s Department of Medical Education. The pilot received positive feedback and was concurrently submitted and accepted for presentation at two regional meetings. This workshop addresses the practical aspects of training faculty to interact effectively with struggling medical learners. It was designed for all faculty who interact with students to provide competency-based assessments, particularly within undergraduate medical education. However, because faculty often have dual roles within undergraduate and graduate medical education, we designed the concepts within the workshop to be easily applied or modified to fit all levels of medical training as well as other health professions.

We used active learning activities throughout this unique workshop as a way to engage faculty and create opportunities for application of the material. Active learning is a critical part of transformative learning theory in which adults learn by applying and critically appraising new knowledge.20 For this reason, the workshop is delivered as a mock academic promotions committee meeting discussing the case of an undergraduate medical learner with borderline performance after completing one clerkship, engaging participants from the outset by placing them in an active learning scenario as members of the committee. The session is interspersed with think-pair-share activities for participants to apply their knowledge. Participants learn about effective feedback,21,22 as well as about assessing the stages of change and using motivational interviewing techniques to discuss learning difficulties.23,24 The workshop also includes an opportunity for one of the participants to interact with a standardized student to apply the participant’s skills in identifying learning needs and communicating with the student. Participants work together to collaboratively create an appropriate individualized learning plan for the student. This workshop is an educational intervention designed for faculty who interact with or provide feedback to struggling medical learners; active learning techniques are used to improve applicability and participant engagement.


The 90-minute workshop was presented by two authors (Heather Ridinger/James Gunn and Heather Ridinger/Jamie Cvengros) at two regional conferences in 2017. Conference attendees interested in the subject attended the workshop. No previous knowledge or experience was required ahead of the session. The lesson plan (Appendix A), including suggested timing and notes for each segment of the session, guided facilitators. A set of PowerPoint slides (Appendix B) accompanied the presentation. The workshop started with a mock academic promotions committee meeting. One presenter took the role of the chair of the academic promotions committee and welcomed workshop participants to the first meeting of the year. He or she outlined the agenda for the meeting (Appendix C) and announced that the committee was going to discuss a single undergraduate student struggling during the course of her clinical clerkship training.

From the outset, participants learned how to identify specific learning deficits using a competency-based framework by applying it to a specific student’s case file (Appendix D). Once participants had discussed possible learning deficits, one of the participants had an opportunity to interact with a standardized student and apply his or her skills in homing in on specific needs and communicating with the student about the participant’s observations. The standardized student came prepared, having read the case materials (Appendix E); however, no specific training was required ahead of time. Facilitators debriefed the interaction with the participants and the standardized student, specifically pointing out communication barriers and highlighting motivational interviewing techniques. Participants then worked together to collaboratively create an appropriate individualized learning plan using a handout of targeted strategies for helping students improve based on their specific learning needs (Appendix F).

For feedback purposes, workshop participants were invited to complete an anonymous paper evaluation form (Appendix G), which contained 10 items measured on a 5-point Likert scale (1 = very low, 5 = very high) and two open-ended items soliciting feedback about the most useful elements of the workshop and suggested improvements. Immediately following each presentation, evaluation forms were collected and securely stored. The evaluation forms did not contain any demographic or identifying information from workshop participants, who included faculty, staff, and students.

The quantitative evaluation data were analyzed using IBM SPSS Statistics 24; mean scores were calculated, and internal consistency was assessed with Cronbach’s alpha. A subgroup analysis was performed using an independent t test comparing participants who indicated that the workshop objectives were highly relevant to their learning needs with those who did not. Pearson correlation was used to examine associations between perceived quality and the other evaluation form items. All workshop participants were given the opportunity to provide written comments on the evaluation form by anonymously responding to the question “What specifically did you find useful in this session?” Written comments from evaluation forms were read and coded for themes by two authors with experience in qualitative analysis (Heather Ridinger and Pedro Tanaka). Both authors read the comments independently; one author (Pedro Tanaka) coded them, and the resulting themes and representative quotes were reviewed with the other author (Heather Ridinger) for completeness and representativeness.
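For readers who want to reproduce this style of analysis outside SPSS, the same three statistics (Cronbach’s alpha, an independent t test on a relevance-defined subgroup, and a Pearson correlation between two evaluation items) can be sketched in Python with NumPy and SciPy. The ratings below are randomly generated stand-ins, not the study data, and the item indices are illustrative only.

```python
import numpy as np
from scipy import stats

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
# Hypothetical 5-point Likert ratings: 55 respondents x 10 items,
# built from a shared "satisfaction" base plus item-level noise.
base = rng.integers(3, 6, size=(55, 1))
ratings = np.clip(base + rng.integers(-1, 2, size=(55, 10)), 1, 5)

alpha = cronbach_alpha(ratings)

# Subgroup t test: respondents rating item 1 (relevance) high (4-5)
# vs. lower, compared on item 7 (impact on competence).
relevant = ratings[:, 0] >= 4
t, p = stats.ttest_ind(ratings[relevant, 6], ratings[~relevant, 6])

# Pearson correlation between item 9 (quality) and item 10 (interactivity).
r, p_r = stats.pearsonr(ratings[:, 8], ratings[:, 9])
```

Because the simulated items share a common base rating, the alpha here will be high; with real evaluation data, the same three calls would reproduce the reliability, subgroup, and correlation analyses described above.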

This study was approved by the Vanderbilt University Medical Center Institutional Review Board.


Quantitative Analysis
Fifty-five participants completed a postworkshop evaluation form. Descriptive statistics for the evaluation are shown in Table 1. The 10-item evaluation form had good internal consistency (Cronbach’s α = .89).

Table 1. Descriptive Statistics for Workshop Evaluation (n = 55)
Item: M Score (SD)ᵃ
1. Objectives were relevant to my educational needs: 4.4 (0.7)
2. Define “struggling medical learner”: 4.5 (0.6)
3. Diagnose struggling medical learners’ deficits: 4.2 (0.9)
4. Explain communication strategies to engage struggling medical learners and deliver feedback to a struggling medical learner: 4.4 (0.7)
5. Describe strategies for enhancing corrective change in the struggling medical learner: 4.2 (0.8)
6. Create an individualized learning plan for a struggling learner: 4.0 (0.9)
7. Session impacts my competence in this area: 4.0 (0.8)
8. I intend to apply concepts learned today into my practice: 4.2 (0.8)
9. The quality of the overall session: 4.5 (0.6)
10. The interactive format of the session: 4.6 (0.5)
Overall: 4.3 (0.7)
ᵃOn a 5-point Likert scale (1 = very low, 3 = moderate, 5 = very high).

The quality of the overall workshop was high (4.5 out of 5, SD = 0.6), and on average, participants agreed that the session objectives were relevant to their educational needs (4.4 out of 5, SD = 0.7) and that the individual session objectives (items 2-6) were achieved (4.0-4.5 out of 5, SD = 0.6-0.9).

To better understand the impact of the workshop on learners who found the topic highly relevant, we identified a subgroup of participants who marked high (4) or very high (5) for the workshop objectives being relevant to their educational needs (item 1). Using data from this subgroup, we performed an independent t test to evaluate whether these participants had a significant difference in their mean responses for the following three items:

  • Item 7: Session impacts my competence in this area.
  • Item 8: I intend to apply concepts learned today into my practice.
  • Item 9: The quality of the overall session.

The results of this analysis revealed that the subgroup of participants who rated the workshop objectives as relevant or very relevant also rated the session’s impact on their competence (item 7) 0.7 points higher on average (t51 = 2.38, p < .05) than those who found the workshop objectives less relevant. Item 8 did not show a significant difference between the subgroups; however, overall session quality was rated 0.5 points higher by those who found the objectives highly relevant (t53 = 2.44, p < .05).

Pearson correlations were calculated to determine if there was a significant correlation between individual evaluation items. Specifically, we were interested in seeing what aspects of the workshop would correlate with overall perceived quality. These results are presented in Table 2. There was a high correlation between most items, nearly all of which were significant. Notably, a significant positive correlation existed between overall perceived quality of the workshop and the interactive elements (.65, p < .05) and between overall perceived quality of the workshop and the intention to apply the learning (.63, p < .05).

Table 2. Interitem Correlation
1. Relevance
2. Definition
3. Diagnosis
4. Demonstration
5. Strategies
6. Learning
7. Competence
8. Application
9. Quality
10. Interactive
ᵃp < .05.

Qualitative Analysis
Of the 55 participants, 47 (85%) responded with written comments. The frequency of each theme and representative quotes taken from comments on evaluation forms are presented below.

Theme 1. Role-Play (15):

  • “The role play was absolutely necessary and really made strategies tangible.”
  • “The format and role playing were excellent.”
  • “Helped see how it would play out and the words that someone else would use.”
  • “The role-play and modeling were extremely helpful.”
  • “Role play exercise helpful solidifying concepts.”
  • “I really liked the role play and seeing the defensiveness of the student.”
  • “Role play gave session attendees a common reference.”

Theme 2. Resources (12):

  • “The resources provided in the hand-out, particularly on developing a remediation plan.”
  • “Also having the hand-outs to take home and reflect on additional content.”
  • “Well cited presentation. I have a lot of literature to engage with.”

Theme 3. Interaction With Colleagues (11):

  • “Great interactive session.”
  • “Interaction with colleagues.”
  • “Pair share.”
  • “Engaging presentation.”
  • “Discussions to show different points of view.”
  • “I liked the round table discussion.”
  • “Hearing about the process from other institutions.”

Theme 4. Modeling (7):

  • “Modeling how to interview motivational.”
  • “Like seeing an expert doing the interview.”
  • “Great use of demonstration/modeling.”

Theme 5. Relevant Content (6):

  • “It was such a good use of my time and this is what I do for my job.”
  • “Good use of time for interaction overall.”
  • “For me the ending was especially useful where you provide institutional context.”
  • “I think it was so informative and helpful to the attendees.”

Theme 6. Process of Learning (4):

  • “Working through the process quickly.”
  • “Clear step by step plan.”
  • “Test case scenario developed in stages was excellent.”


Overall, this workshop was well received among multiple groups of medical educators from differing regions of the country. Participants valued the interactive nature of the session and thought that the workshop contributed to their own competence in this area. They intended to apply the concepts to their practice. The quality of the session correlated most strongly with the interactive elements of the workshop. Demonstration of motivational interviewing and shared decision-making techniques using a standardized student was a strength. Participants also valued the interactions that they had with one another through think-pair-share activities.

This workshop can serve as a faculty development resource for training faculty from a variety of institutions and backgrounds. The workshop resources are designed to be adaptable in order to meet individual faculty or institutional training needs and can conceivably be adapted to train faculty involved in graduate medical education or even other health professions trainees. Targeting this educational intervention to faculty who value the skills taught in the workshop will increase its applicability and perceived quality. The workshop is not resource intensive in terms of preparation, space, or number of moderators. The standardized student did not require formal training, and this activity can be adapted into a role-play when trained standardized actors are not feasible. Time was limited in a 90-minute workshop format; ideally, 2 hours would allow for more robust discussion and group work.

Despite the workshop’s overall success, there are a number of limitations to its generalizability. Not all institutions have adopted a competency-based medical education approach, and faculty may have variable understandings of and experience with a competency framework. Institutional differences in implementing remediation and coaching programs may limit applicability to certain faculty. Furthermore, institutional differences in remediation programs and their academic consequences (e.g., probation, dismissal) may impact faculty members’ understanding and comfort level in their role coaching struggling medical learners.

In a competency-based medical education approach, all students require coaching and continual remediation of learning deficits. Medical educators who have any role or responsibility for student assessment and feedback may improve their efforts with struggling learners by utilizing a framework to identify and assist them with targeted learning strategies. This faculty development workshop provides an active learning activity that serves as an effective educational intervention to improve faculty self-reported competence in identifying and communicating with struggling medical learners and creating a shared targeted intervention plan.

Author Information

  • Heather Ridinger, MD: Assistant Professor, Department of Internal Medicine, Vanderbilt University School of Medicine; Co-course Director, Foundations of Healthcare Delivery Course, Vanderbilt University School of Medicine
  • Jamie Cvengros, PhD: Associate Professor, Department of Behavioral Sciences, Rush Medical College of Rush University Medical Center; Director of Clinical Communication Training & Research, Rush Medical College of Rush University Medical Center
  • James Gunn, PA-C: Associate Professor, Physician Assistant Program, Midwestern University; Director of Didactic Education, Physician Assistant Program, Midwestern University
  • Pedro Tanaka, MD, PhD: Clinical Professor, Department of Anesthesiology, Stanford University School of Medicine; Associate Program Director, Anesthesiology Residency Program, Stanford University School of Medicine; Director, Teaching Scholars Program, Stanford University School of Medicine
  • Joseph Rencic, MD: Associate Professor, Department of Medicine, Tufts University School of Medicine; Associate Program Director, Internal Medicine Residency Program, Tufts University School of Medicine; Co-course Director, Introduction to Clinical Reasoning Course, Tufts University School of Medicine
  • Ara Tekian, PhD, MHPE: Professor, Department of Medical Education, University of Illinois College of Medicine; Associate Dean for International Affairs, Department of Medical Education, University of Illinois College of Medicine
  • Yoon Soo Park, PhD: Associate Professor, Department of Medical Education, University of Illinois College of Medicine

Disclosures
None to report.

Funding/Support
None to report.

Prior Presentations
Workshop materials were presented at the Association of American Medical Colleges’ Southern Group on Educational Affairs and Central Group on Educational Affairs regional meetings in 2017.

Ethical Approval
The Vanderbilt University Medical Center Institutional Review Board approved this study.


  1. Vaughn LM, Baker RC, DeWitt TG. The problem learner. Teach Learn Med. 1998;10(4):217-222. https://doi.org/10.1207/S15328015TLM1004_4
  2. de Virgilio C, Yaghoubian A, Kaji A, et al. Predicting performance on the American Board of Surgery qualifying and certifying examinations: a multi-institutional study. Arch Surg. 2010;145(9):852-856. https://doi.org/10.1001/archsurg.2010.177
  3. Papadakis MA, Hodgson CS, Teherani A, Kohatsu ND. Unprofessional behavior in medical school is associated with subsequent disciplinary action by a state medical board. Acad Med. 2004;79(3):244-249. https://doi.org/10.1097/00001888-200403000-00011
  4. Hemann BA, Durning SJ, Kelly WF, Dong T, Pangaro LN, Hemmer PA. The association of students requiring remediation in the internal medicine clerkship with poor performance during internship. Mil Med. 2015;180(suppl 4):47-53. https://doi.org/10.7205/MILMED-D-14-00567
  5. Andolsek K, Padmore J, Hauer KE, Edgar L, Holmboe E. Clinical Competency Committees: A Guidebook for Programs. Chicago, IL: Accreditation Council for Graduate Medical Education; 2015.
  6. Nasca TJ, Philibert I, Brigham T, Flynn TC. The Next GME Accreditation System—rationale and benefits. N Engl J Med. 2012;366(11):1051-1056. https://doi.org/10.1056/NEJMsr1200117
  7. Chen HC, van den Broek WE, ten Cate O. The case for use of entrustable professional activities in undergraduate medical education. Acad Med. 2014;90(4):431-436. https://doi.org/10.1097/ACM.0000000000000586
  8. Brown DR, Warren JB, Hyderi A, et al; for AAMC Core Entrustable Professional Activities for Entering Residency Entrustment Concept Group. Finding a path to entrustment in undergraduate medical education: a progress report from the AAMC Core Entrustable Professional Activities for Entering Residency Entrustment Concept Group. Acad Med. 2017;92(6):774-779. https://doi.org/10.1097/ACM.0000000000001544
  9. Lomis KD, Ryan MS, Amiel JM, Cocks PM, Uthman MO, Esposito KF. Core Entrustable Professional Activities for Entering Residency Pilot Group update: considerations for medical science educators. Med Sci Educ. 2016;26(4):797-800. https://doi.org/10.1007/s40670-016-0282-3
  10. Core Entrustable Professional Activities for Entering Residency: Curriculum Developers’ Guide. Washington, DC: Association of American Medical Colleges; 2014.
  11. Guerrasio J. Remediation of the Struggling Medical Learner. Irwin, PA: Association for Hospital Medical Education; 2013.
  12. Frank JR, Snell LS, ten Cate O, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638-645. https://doi.org/10.3109/0142159X.2010.501190
  13. Tekian A, Watling CJ, Roberts TE, Steinert Y, Norcini J. Qualitative and quantitative feedback in the context of competency-based education. Med Teach. 2017;39(12):1245-1249. https://doi.org/10.1080/0142159X.2017.1372564
  14. Ericsson KA. Deliberate practice and acquisition of expert performance: a general overview. Acad Emerg Med. 2008;15(11):988-994. https://doi.org/10.1111/j.1553-2712.2008.00227.x
  15. McGaghie WC, Kristopaitis T. Deliberate practice and mastery learning: origins of expert medical performance. In: Cleland J, Durning SJ, eds. Researching Medical Education. Hoboken, NJ: John Wiley & Sons; 2015:219-230.
  16. Kalet A, Guerrasio J, Chou CL. Twelve tips for developing and maintaining a remediation program in medical education. Med Teach. 2016;38(8):787-792. https://doi.org/10.3109/0142159X.2016.1150983
  17. Ellaway RH, Chou CL, Kalet AL. Situating remediation: accommodating success and failure in medical education systems. Acad Med. 2018;93(3):391-398. https://doi.org/10.1097/ACM.0000000000001855
  18. Kalet A, Chou CL, eds. Remediation in Medical Education: A Mid-Course Correction. New York, NY: Springer; 2014.
  19. Guerrasio J, Garrity MJ, Aagaard EM. Learner deficits and academic outcomes of medical students, residents, fellows, and attending physicians referred to a remediation program, 2006–2012. Acad Med. 2014;89(2):352-358. https://doi.org/10.1097/ACM.0000000000000122
  20. Mezirow J. Transformative learning: theory to practice. New Dir Adult Contin Educ. 1997;(74):5-12. https://doi.org/10.1002/ace.7401
  21. van de Ridder JM, Stokking KM, McGaghie WC, ten Cate OTJ. What is feedback in clinical education? Med Educ. 2008;42(2):189-197. https://doi.org/10.1111/j.1365-2923.2007.02973.x
  22. Milan FB, Parish SJ, Reichgott MJ. A model for educational feedback based on clinical communication skills strategies: beyond the “feedback sandwich.” Teach Learn Med. 2006;18(1):42-47. https://doi.org/10.1207/s15328015tlm1801_9
  23. Rollnick S, Miller WR, Butler CC. Motivational Interviewing in Health Care: Helping Patients Change Behavior. New York, NY: Guilford Press; 2008.
  24. Prochaska JO, Velicer WF. The transtheoretical model of health behavior change. Am J Health Promot. 1997;12(1):38-48. https://doi.org/10.4278/0890-1171-12.1.38


Ridinger H, Cvengros J, Gunn J, et al. Struggling medical learners: a competency-based approach to improving performance. MedEdPORTAL. 2018;14:10739. https://doi.org/10.15766/mep_2374-8265.10739

Received: February 28, 2018

Accepted: July 12, 2018