Introduction: Learners in high-performing contexts such as medical school and residency are presumed to have the study skills needed to be successful. However, for learners in academic difficulty who are identified as having weak study skills and poor test taking skills, faculty need tools to lead these struggling learners to academic success. In coaching learners on study skills, we frequently found that the study skills that helped them get into medical school or residency were no longer sufficient to make them successful in their new program. Given the multiple study strategies available, faculty coaches need mechanisms to first tease out which skills are at issue and then provide targeted strategies specific to each learner. Methods: In meeting with a faculty coach, learners are briefly interviewed, complete a self-assessment to explore all possible root weaknesses in their study skills, and then read strategic solutions and review with faculty how these may be implemented. This tool was offered to 52 students, 76 residents, and 20 fellows and faculty between 2010 and 2015. Results: One hundred forty-eight individuals participated in this innovation, with more than 91% going on to pass the exam that they had either failed or, in the case of the in-training exam, scored below the 30th percentile on. Conclusion: A self-assessment tool is key to individualized insight and action plans for improving study skills. Implementation must be supported with concurrent in-person coaching.
- Review the failing learner’s study habits and test taking patterns.
- Identify and understand which of these habits are maladaptive.
- Provide and implement new study skills and test taking strategies specific to the learner’s deficits.
- Evaluate the outcome of these methods through repeat-exam pass rates.
Medical students, residents, and fellows are presumed to have appropriate study skills to be successful in their programs. Nevertheless, for learners in academic difficulty not resulting from personal life stress or mental or physical health problems, helping them overcome these weaknesses requires identification of the deficits.
In 2006, the University of Colorado School of Medicine developed a centralized remediation program for all medical student, residency, and fellowship training programs on campus, as well as faculty. Learners were referred via a number of different methods. Medical students were referred by their clerkship director because of danger of failing a rotation, having already failed a rotation, or having received repeated negative comments on their rotation evaluations. Residents and fellows were referred by their program director because of danger of failing a rotation or because they had failed a rotation. Individuals who were no longer in good academic standing or had been placed on a letter of warning or focused review were also referred. Learners could also self-refer to the remediation program. Once referred, each learner was interviewed, his or her academic record was reviewed, and assessments were performed to diagnose the areas of learner difficulty. This assessment, combined with data gathered from the program, showed that over one-third of the referred trainees and faculty struggled with medical knowledge, most often assessed through standardized exams.1
Review of the literature, including MedEdPORTAL, located articles reporting how poor exam performance predicted future struggles with exams.2 However, strategies and tools for improving performance on medical school and residency multiple-choice standardized exams were not found. Consequently, the remediation team identified the need for an instructional tool for faculty to support the remediation of learners with suboptimal study habits and test taking skills, one that incorporated basic study skills and adult learning theory.3,4 To be learner friendly, the tool needed to list the most common struggles among learners and specify potential strategies for improvement. More importantly, the tool would be structured to allow for self-assessment and worded to promote exploration of the issues, in contrast to the prescriptive approach of most study skills resources. Starting with these fundamental needs, the tool was built iteratively, with pilot testing among the authors to allow the designer to incorporate the years of experience faculty had accrued interviewing, coaching, and teaching struggling medical learners and observing them learn to achieve on standardized tests.
The target audience for this study skills guidance tool includes faculty coaching medical students, residents, fellows, and faculty peers regardless of specialty. The tool addresses general strategies and is not specific to discipline or medical content. No prerequisite knowledge or skills are required. Those who have access to the learner’s prior exam performance scores and histograms or have insight into struggles with test taking (either through self-assessment or through feedback from faculty) will likely derive more benefit from this tool. Ideally, the tool will be implemented months prior to an anticipated standardized exam and up to a year prior to board certification examinations.
Identification and refinement of study skills list: From 2006-2010 at the University of Colorado School of Medicine, medical students who failed National Board of Medical Examiners (NBME) subject exams or the United States Medical Licensing Exam (USMLE) Step 1 or Step 2 CK exams, residents and fellows who scored below the 30th percentile on their specialty in-training exams, and fellows and faculty who failed their written board certification exams were referred to a remediation specialist. No learners were referred who did not meet these explicit criteria, though some students self-referred after failing an exam. The remediation specialist was a physician with expertise in medical education and remedial teaching. All of the failed exams were multiple-choice.
The referred learners met individually with the remediation specialist for 1 hour and were asked about prior standardized test performance, whether English was their second language, which resources they had used prior to the failed exam, how many practice questions they had completed, and whether any unusual or unexpected events may have impacted their test performance. Each learner was then asked to make observations about his or her study skills and test taking strategies and to identify areas of struggle. Exam score reports and available histograms were reviewed to identify areas of weakness. A list of common struggles was developed, expanded, and then consolidated based on theme (Appendix A).
Matching strategies with identified study and test taking concerns: During the same 5-year period, specific techniques were developed to address each of the above concerns, based on the literature.4-12 These techniques were iteratively adapted to the needs of medical learners in higher education based on their reported experience and ease of implementation. Appendix B matches the list of common struggles in Appendix A with recommended targeted strategies for coaching medical learners who want to improve their study and test taking skills and, therefore, their test scores. For example, if a learner acknowledges the following struggle, “I can’t decide where I should study,” the matched strategy suggests that the learner should take practice tests and complete practice questions in an environment that simulates the testing environment. In this particular example, a practical application of this strategy would be to suggest that the learner study in a coffeehouse because there is minimal background noise and few people moving around, similar to testing environments.
Often, in remediation, learners have multiple issues. Learners were asked to select all struggles that applied so that multiple strategies could be employed to address each contributing issue. All of the strategies are designed to be self-directed, but struggling learners will be best served by having a coach to help them implement the strategies.
Deploying the tool involves a five- to six-step process; steps 2-5 together take approximately 60 minutes in total. If faculty are not available or if the learner chooses, the tool can be used independently by the learner alone. The tool is implemented as follows:
- Identify learners in need of remediation of study skills and test taking strategies. Consider the following criteria: medical students who failed NBME subject exams or the USMLE Step exams, residents and fellows who scored below the 30th percentile on their in-training exams, and fellows and faculty who failed their board certification exams.
- A faculty coach meets with each learner for a brief interview to encourage self-reflection and to assess both learner prior experience and perception of test performance (see Appendix C).
- After the interview, Appendix A is given to the learner. The faculty coach asks the learner to choose the struggles that resulted in exam failure from the list of common problems. The faculty coach can also identify learner-specific struggles that the learner may not have indicated.
- The learner then reads and reviews the learning strategies in Appendix B that correspond to the identified common problems.
- The learner and faculty coach then explore together how to implement these strategies.
- [Optional:] The faculty coach follows up with the learner periodically to assess study progress and subsequent exam performance. Frequency varies based on how often exams are given. For board exams offered yearly, meetings usually occur every 4 months. For monthly exams, meetings occur every 2 weeks.
The tool has been implemented for remediating learners in a variety of settings and across a variety of multiple-choice exams issued by different organizations (e.g., NBME, various medical specialty boards), where passing scores and percentiles vary by exam. For this reason, only pass/fail outcomes on repeat exams were measured. The repeat exams occurred over time spans ranging from 1 month to 2 years after the strategies were implemented, depending on the frequency of exam administration. Additional information collected included stage of training (i.e., student, resident, or fellow/faculty). Fellows and faculty were combined to preserve anonymity. Individuals were asked to email follow-up exam scores and comments, which were verified by the corresponding departments.
Data Collection and Analysis
Data were analyzed by tallying the number of learners at each stage of training and the number of exams taken and then calculating the percentage who passed. The number and training background of the faculty who used this resource were also recorded. Subset analyses were limited by statistical power. Unsolicited individual participant comments were collected via email correspondence and grouped based on theme. This study was determined to be exempt by the Colorado Multiple Institutional Review Board.
At the University of Colorado School of Medicine, from 2010-2015, 148 individuals participated in this innovation: 52 (35.1%) medical students, 76 (51.4%) residents, and 20 (13.5%) fellows and faculty. All individuals had either failed an exam or scored below the 30th percentile prior to participation.1 Faculty spent an average of 10 hours with each learner, with an interquartile range of 2-30 hours. If learners did not demonstrate progress by 10 hours, success was less likely. More than 91% of all individuals went on to pass the exam that they had either failed or, in the case of the in-training exam, scored below the 30th percentile on. For a detailed overview of participant characteristics, please see the Table. A total of five faculty members have implemented this innovation with learners: a physician remediation specialist, an educational psychologist, a physician faculty member with interest and expertise in test taking skills, a physician assistant, and a faculty member interested in supporting learners.
| Level of Training | N (%) | Passed, n (%) | Failed, n (%) |
| --- | --- | --- | --- |
| Medical students | 52 (35.1) | 50 (96.2) | 2 (3.8) |
| Residents | 76 (51.4) | 67 (88.2) | 9 (11.8) |
| Fellows/faculty | 20 (13.5) | 18 (90.0) | 2 (10.0) |
| Total | 148 (100) | 135 (91.2) | 13 (8.8) |
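As a simple arithmetic check, the pass rates reported in the Table can be recomputed from the raw counts. The sketch below (in Python; the variable names and data structure are ours, with counts transcribed from the Table) tallies each cohort and the overall pass rate:

```python
# Pass/fail counts per cohort, transcribed from the Table: (passed, failed).
cohorts = {
    "Medical students": (50, 2),
    "Residents": (67, 9),
    "Fellows/faculty": (18, 2),
}

# Per-cohort pass rates.
for level, (passed, failed) in cohorts.items():
    n = passed + failed
    print(f"{level}: {passed}/{n} = {passed / n:.1%} passed")

# Overall pass rate across all 148 participants.
total_passed = sum(p for p, f in cohorts.values())
total_n = sum(p + f for p, f in cohorts.values())
print(f"Total: {total_passed}/{total_n} = {total_passed / total_n:.1%} passed")
```

Running this reproduces the Table's percentages, including the overall 135/148 (91.2%) pass rate.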
Participants were asked to provide open-ended comments on the tool. Their comments expressed the following themes: appreciation for the attention to their test taking needs, ease of use and accessibility of the tool, appreciation for the strategies, and improved overall study habits and performance on other exams.
The frequency of exam failures and poor medical knowledge in medical school and residency speaks to an unmet need.1 Furthermore, local needs can drive the adaptation of existing resources into newer, more usable formats. The present educational innovation was designed based on a need observed by the remediation team at the University of Colorado School of Medicine, particularly as students, residents, and fellows were being referred to remediation in increasing numbers. At the same time, the team heard from learners that they had assumed they already possessed these skills since they had progressed this far into their careers. As a result, a tool was required that would be accessible to learners, where accessibility also accounted for the psychological or social discomfort of being identified as remedial in a high-performing educational context. Given these concerns, a scalable remediation resource to provide study skills guidance was needed.
Notably, study skills resources are difficult to find in the health professions literature, and data about study skills being taught in medical school and residency are sparse. To be valid and reliable, any study skills tool also needed to draw on existing evidence-informed study skills solutions. As innovation is a developmental process, the present tool was iteratively developed by field-testing evidence-informed study strategies with the target population (learners requiring remediation in a medical education setting). This tool was found to be feasible by faculty members and learners, who expressed appreciation for the strategies and techniques.
The only outcome measured was subsequent repeat-test performance. While a pass rate higher than 90% was demonstrated, no control group was available for comparison. Finding a suitable comparison group presents a challenge, as it seems unethical to randomize failing learners to receive or not receive remedial teaching while strategies are being developed. The number of different exams, the evolution of those exams, and the development of the innovation over a decade limit the use of historical controls. In addition, study strategies and test taking skills were recommended in groups chosen based on learner needs, limiting the ability to study individual interventions. As more resources for remedial teaching emerge, more rigorous comparative studies and the development of more evidence-based techniques may become possible, enabling further revisions of this innovation.
Of note, participants who failed the exam a second time were referred for higher-level study skills and test taking preparation, which included exam preparation courses, neuropsychiatric testing for learning disorders, and practicing questions directly with remediation faculty. In general, several faculty characteristics may have contributed to the success of the program: (1) familiarity with the tool, (2) approachability, (3) a calm and patient demeanor, (4) availability that matched learners' exam schedules, and (5) willingness to work with struggling learners and invest in their success. Content expertise was not required to utilize the tool and coach learners; learner performance did not appear to differ based on faculty knowledge and skill.
None to report.
None to report.
This publication contains data obtained from human subjects and received ethical approval.
- Guerrasio J, Garrity MJ, Aagaard EM. Learner deficits and academic outcomes of medical students, residents, fellows, and attending physicians referred to a remediation program, 2006–2012. Acad Med. 2014;89(2):352-358. https://doi.org/10.1097/ACM.0000000000000122
- Hamdy H, Prasad K, Anderson MB, et al. BEME systematic review: predictive values of measurements obtained in medical schools and future performance in medical practice. Med Teach. 2006;28(2):103-116. https://doi.org/10.1080/01421590600622723
- Kalet A, Chou CL, eds. Remediation in Medical Education: A Mid-Course Correction. New York, NY: Springer; 2014.
- Merriam SB, Caffarella RS, Baumgartner LM. Learning in Adulthood: A Comprehensive Guide. San Francisco, CA: Jossey-Bass; 2007.
- Cleland J, Arnold R, Chesser A. Failing finals is often a surprise for the student but not the teacher: identifying difficulties and supporting students with academic difficulties. Med Teach. 2005;27(6):504-508. https://doi.org/10.1080/01421590500156269
- Mattick K, Knight L. High-quality learning: harder to achieve than we think? Med Educ. 2007;41(7):638-644. https://doi.org/10.1111/j.1365-2923.2007.02783.x
- Hembree R. Correlates, causes, effects, and treatment of test anxiety. Rev Educ Res. 1988;58(1):47-77. https://doi.org/10.3102/00346543058001047
- Laatsch L. Evaluation and treatment of students with difficulties passing the Step examinations. Acad Med. 2009;84(5):677-683. https://doi.org/10.1097/ACM.0b013e31819faae1
- Coumarbatch J, Robinson L, Thomas R, Bridge PD. Strategies for identifying students at risk for USMLE Step 1 failure. Fam Med. 2010;42(2):105-110.
- Karpicke JD. Retrieval-based learning: active retrieval promotes meaningful learning. Curr Dir Psychol Sci. 2012;21(3):157-163. https://doi.org/10.1177/0963721412443552
- Seibel H, Guyer K, Mangum AB, et al. Barron’s How to Prepare for the MCAT Medical College Admission Test. 10th ed. Hauppauge, NY: Barron’s Educational Series; 2006.
- Rothstein R, ed. Kaplan MCAT Comprehensive Review. 7th ed. New York, NY: Kaplan; 2003.
This is an open-access publication distributed under the terms of the Creative Commons Attribution-NonCommercial-Share Alike license.
Received: January 17, 2017
Accepted: May 19, 2017