Original Publication
Open Access

Longitudinal Faculty Development Program to Promote Effective Observation and Feedback Skills in Direct Clinical Observation

Published: October 30, 2017 | 10.15766/mep_2374-8265.10648


Appendices
  • Instructor-Tutor Guide for Session 1.doc
  • Instructor-Tutor Guide for Session 2.docx
  • Instructor-Tutor Guide for Session 3.doc
  • Script for Resident Physician-Patient Demonstration.doc
  • Qualities of Good Feedback Pocket Card.pdf
  • Montefiore DCO Instrument.pdf
  • Didactic Introduction - Session 2.ppt
  • Role-Play Guide.doc
  • Resident Program Evaluation Survey.pdf
  • Postsession Faculty Program Evaluation Survey.doc
  • Session 1 Trigger Video.mp4

All appendices are peer reviewed as integral parts of the Original Publication.



Introduction: We developed a longitudinal faculty development program to maximize faculty training in direct clinical observation (DCO) and feedback, in response to a perceived need for higher-quality DCO and feedback. To achieve this, we created a behaviorally anchored DCO instrument and worked to improve faculty skills in this area. Methods: We describe an innovative, learner-centered model of faculty training that introduces evidence-based principles of effective feedback and reinforces them in every session. The training centers on peer-led observation of, and feedback on, faculty learners’ recorded DCO feedback encounters, guided by our DCO instrument. Residents and faculty completed surveys to assess program impact. Qualitative responses were analyzed for themes. The Wilcoxon signed rank test was used to examine the significance of differences in feedback quality before and after the DCO faculty development sessions. Results: Our faculty development program has been well received and had a significant impact on the quality of faculty feedback, as rated by resident learners. Discussion: Our faculty development model is effective at developing faculty learners’ DCO and feedback skills. Potential strengths of this program include the use of a behaviorally anchored DCO instrument, longitudinal and experiential faculty development, and small-group peer review of recorded faculty feedback encounters. We have found that when their learning needs are attended to, faculty learners cultivate a deep appreciation for principles of effective feedback. Indeed, faculty feedback skills can be enhanced in the eyes of resident learners.

Educational Objectives

After this longitudinal faculty development program, faculty learners will be able to:

  1. Recognize effective communication skills.
  2. Demonstrate behaviorally specific, learner-centered feedback skills.
  3. Guide house staff to accurately self-evaluate their clinical skills.
  4. Give feedback, guided by a shared mental model, after using the direct clinical observation instrument.
  5. Discuss effective strategies for managing emotions that arise in a feedback encounter.


Introduction
Feedback is the cornerstone of professional clinical skill development, but the literature indicates that there are deficits in the quantity and quality of feedback in most medical education programs.1,2 Improving faculty feedback behavior requires significant changes at both faculty and administrative levels in health care education. To ensure effectiveness of such feedback programs, collection of real-time feedback performance data is necessary.2 In addition, it has been shown that reflection on clinical teaching can advance faculty self-awareness of teaching skills and can motivate behavior change.3-7

The feedback literature reveals that faculty training is critical to promoting resident self-assessment and the effective use of the mini-Clinical Evaluation Exercise (mini-CEX),7-11 which was the standard direct clinical observation (DCO) instrument for many years. While the mini-CEX possesses strong psychometric qualities, the traditional mini-CEX instrument has been shown to be poor at facilitating feedback and does not include an action plan.12,13 Furthermore, the use of a DCO instrument with behavioral anchors is key to increasing interrater reliability and facilitating feedback giving.14-17 There is a paucity of published educational outcomes data examining the effectiveness of faculty training in DCO and feedback, and few published descriptions of faculty development programs for feedback skills.10,12,18 In addition, it has been shown that review of recorded teaching encounters can transform feedback practice.19-21 There is a clearly perceived need for faculty training in these skills given medical education’s movement toward the use of milestones and Entrustable Professional Activities.22,23

The authors conducted a search of MedEdPORTAL and found the following: The search term feedback and direct observation yielded one relevant publication describing a DCO instrument for use with observing residents24 and two describing relevant programs for medical students.25,26 A search on direct clinical observation and faculty development yielded no results, as did direct observation and faculty development. Finally, under feedback and faculty development, there were two relevant curricula focusing on feedback skills for residents25,27 and one for students.26 We believe that our curricular materials and instrument add to the MedEdPORTAL literature as the first publication of its kind to describe an outcomes-based longitudinal faculty development program on observation and feedback.

In light of all of these gaps, we first conducted a needs assessment to ascertain what faculty and residents thought about the state of DCO and feedback in our internal medicine residency program at Montefiore Medical Center in Bronx, NY. Based on the responses of 189 internal medicine faculty and 88 residents (surveyed in 2009-2010), we identified several areas that residents and faculty considered important to the quality of feedback in our residency’s DCO program, including the need to: (1) build a collaborative, learner-centered approach to feedback, (2) manage emotions surrounding the feedback conversation, (3) increase faculty’s knowledge base of the behavioral medicine literature, and (4) protect time for feedback in our program.

We created a group process for faculty to develop a shared mental model of the clinical skills expected of resident learners. Using the mini-CEX instrument as a template, the group worked together to create behavioral anchors and domains derived from a literature review of patient-centered care models and decided upon by the consensus of the authors of this publication. Working-group members who could build on lessons learned and coach local faculty were chosen to be leaders at the residents’ ambulatory clinical teaching sites. We see this working-group concept as integral to initial faculty development efforts and as essential to the program’s success. The DCO instrument was then vetted during two sessions with core faculty. Recommended revisions were integrated into the form. Program evaluation and learner assessment strategies were formulated. The program was approved by the local institutional review board (approved June 16, 2010, by Montefiore Medical Center in Bronx, NY).

Next, we designed a longitudinal faculty development program on how to use a revised DCO instrument to observe residents’ clinical practice and give real-time feedback. Our revised DCO instrument aims to facilitate behaviorally specific, learner-centered feedback with an emphasis on creating an individualized learning plan. Our faculty development program aims to improve faculty observation and feedback skills by showcasing self-selected, recorded excerpts of individual faculty members’ feedback chosen based upon their learning goals. The curriculum is designed to address the themes that emerged in the needs assessment. Expert faculty facilitators led these sessions with the goal of promoting faculty teaching skills.

Finally, we evaluated the sessions and assessed the training program’s effectiveness when applied in practice. At the conclusion of each faculty development session, program evaluation surveys were distributed to each faculty learner. Residents who completed DCOs in the months following each training session also received surveys to assess faculty feedback quality.


Methods
The faculty development program consisted of three sessions. Session 1 had a set agenda including didactics and rater training for a DCO instrument. Sessions 2 and 3 were experiential. Following the sessions, we examined the impact of the faculty development program on faculty feedback quality. Of note, further experiential sessions can be added as per individual program needs and resources.

Longitudinal Faculty Development
The longitudinal faculty development program included three sessions at local ambulatory teaching sites. Ideally, all sessions were overseen by the same faculty facilitator(s), preferably with a background in physician-patient communication and in both DCO and feedback skills. Typical sessions included no more than 15 learners to promote rapport and trust among group members. The advised length of each session was around 60 minutes, and an appropriate learning environment was a room conducive to small-group discussion.

Educational methodology: Sessions 1 and 2 focused on rater training to optimize use of our DCO instrument. We used frame-of-reference and rater-error training methodology.11,28 Session 3 used reviews of preselected clips of recorded feedback encounters in the same small-group settings to build relevance and meet faculty learners’ specific needs.

Session 1. Didactic and Rater Training (Appendix A): This session’s learning objectives are to practice skills involving: (1) effective use of the Montefiore DCO instrument to guide feedback, (2) identification of learners’ personal goals and emotional needs, (3) formulation and efficient delivery of learner-centered feedback, and (4) awareness of one’s own feedback-giving style.

The session opened with a didactic introduction to the DCO and feedback literature, and a framework for expectations of the DCO program. The Montefiore DCO instrument (Appendix F) was then introduced and outlined, and information was provided regarding its effective use. Faculty learners completed rater training as per the teaching methodology outlined in Appendix A. They were instructed to use the Montefiore DCO instrument to review and assess a resident’s performance captured on a trigger tape (Appendix K). Session materials included the Montefiore DCO instrument and a computer with projector capabilities. Appendix A describes Session 1 in more detail, and the time line below highlights the session timing and content:

  • Introductions, review of goals and objectives: 10 minutes.
  • Introduce format of training session, orient to video observation, and describe content of Montefiore DCO instrument (Appendix F): 5 minutes.
  • Group viewing of trigger tape (Appendix K): 5 minutes.
  • Faculty participants complete Montefiore DCO instrument: 2 minutes.
  • Facilitated discussion in which individuals discuss ratings, potentially reaching consensus on how to grade: 40 minutes.
  • Wrap-up, plans for future sessions, and questions: 5 minutes.

Overview of Sessions 2 and 3 (Appendices B & C): These sessions’ learning objectives centered on building faculty learners’ confidence in their ability to: (1) demonstrate effective clinical observation skills, (2) choose specific areas on which to focus feedback discussion, and (3) incorporate their own feedback-giving style.

Session 2. Experiential Rater Training via Role-Play (Appendix B): This session opened with a demonstration of a resident physician-patient interaction (Appendix D). The interaction depicted a resident’s ambulatory clinical encounter in which the resident lacked insight into his/her communication skills. This scenario was chosen because it poses a common and significant educational challenge. Of note, other common scenarios could be chosen; for example, a resident who seems resistant to receiving feedback, is inefficient, or offers disorganized presentations lacking in effective clinical reasoning may also be used.

The group then divided into triads (faculty, resident, and observer), and each triad practiced a role-play of faculty feedback to this resident physician on the demonstrated resident physician-patient ambulatory clinical encounter. The triad members then delivered feedback to one another. The role-play was run three times so that each person had a chance to practice receiving and delivering feedback. The Montefiore DCO instrument (Appendix F) was used to guide feedback by prompting the observer to focus his/her feedback on one to two specific skills or behaviors. The Qualities of Good Feedback Pocket Card (Appendix E) prompted learners to practice using advanced communication skills, including empathic skills, while giving feedback. A role-play guide (Appendix H) outlined the faculty, resident, and observer roles, as well as the flow and timing of role-play and feedback. Following the role-plays, the entire group debriefed about lessons learned during the role-play and feedback, and the facilitator briefly provided a summary. As stated above, session materials include the Qualities of Good Feedback Pocket Card, Montefiore DCO instrument, and role-play guide (Appendices E, F, & H, respectively). Appendix B describes Session 2 in more detail, and the time line below highlights the session timing and content:

  • Introduction to feedback and DCO literature (Appendix G): 10 minutes.
  • Demonstration of resident physician-patient interaction and setup for skills training (Appendix D): 2 minutes.
  • Feedback skills training in triads (faculty, resident, observer) for three consecutive rounds of 5 minutes of enactment followed by 2 minutes of feedback: 21 minutes.
  • Large-group debriefing: 15 minutes.
  • Wrap-up: 5 minutes.

Session 3. Experiential Rater Training via Video Review and Peer Feedback (Appendix C): Our approach for this session was to view and reflect on faculty feedback conversations regarding ambulatory clinical encounters, as demonstrated in digital video clips of feedback encounters from individual faculty learners (recorded with resident and patient consent). These sessions could instead center on a predetermined video, a demonstrated feedback encounter, or a small-group role-play or fishbowl exercise. The session’s success depended on faculty learners preparing their learning goals in advance.

In our experience, in advance of Session 3 and all experiential learning sessions thereafter, one volunteer faculty learner and one resident physician volunteer should be recorded. The session facilitators recorded this volunteer faculty learner giving feedback on a DCO to a resident in the ambulatory clinical setting. The session facilitators then elicited the volunteer faculty learner’s assessment of his/her own skills and goals for the upcoming faculty development session. Facilitators also selected short video clips pertaining to the volunteer faculty learner’s learning goals for the session.

Session 3 opened with the facilitator setting ground rules for respectful communication and outlining the volunteer faculty learner’s learning goals. The other faculty learners were given copies of the Montefiore DCO instrument and Qualities of Good Feedback Pocket Card. They were instructed to use the Montefiore DCO instrument to note their observations of the recorded feedback encounter, and prepare feedback for the volunteer faculty learner based on his/her learning goals. The group then watched the video clips and took 2-3 minutes to prepare feedback. A facilitated group discussion about best practices and feedback to the volunteer faculty learner followed, using the Pocket Card.

This session can be replicated on an ongoing basis, as per local needs and as resources permit. In our clinical setting, we conducted these sessions with several volunteer faculty learners’ recorded feedback encounters, and our goal was for all faculty learners to showcase a feedback encounter. These sessions were also conducted at individual ambulatory clinical teaching sites so as to promote maximal rapport, as well as respectful communication and confidentiality. Session materials included a computer and projector and the DVD/digital video recordings of the feedback encounters. Appendix C describes Session 3 in more detail, and the time line below highlights the session timing and content:

  • Introduction: 2 minutes.
  • Setup of video segments and format of session: 5 minutes.
  • Observe resident physician-patient encounter, rate and discuss resident performance: 10 minutes.
  • Observe, rate, and discuss faculty feedback interaction: 30 minutes.
  • Wrap-up: 5 minutes.

Program evaluation: At the conclusion of each faculty development session, program evaluation surveys were distributed to each faculty learner (Appendix J). To assess the impact of the program on residents’ perception of faculty feedback, residents who completed DCOs in the months following each training session were also sent surveys (via surveymonkey.com) following each of their DCOs (Appendix I).

Program evaluation statistical analysis: To assess the impact of faculty development sessions on faculty learners’ subsequent DCO feedback quality, we surveyed residents using a retrospective pre-post design (Appendix I). Chi-square analysis was used to examine associations between faculty and resident baseline characteristics measured on ordinal scales. A Wilcoxon signed rank test was used to determine the significance of difference in intervention feedback quality before and after the DCO education session.
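The paired pre-post comparison described above can be sketched in a few lines. The following is an illustrative example only, not the authors' actual analysis: the Likert ratings below are hypothetical, and SciPy's `wilcoxon` is assumed as the implementation of the signed rank test.

```python
# Illustrative sketch of the Wilcoxon signed rank test on paired pre/post
# Likert ratings of feedback quality. Ratings are HYPOTHETICAL, not study data.
from scipy.stats import wilcoxon

# One pre/post pair per completed DCO survey
# (1 = not able, 2 = somewhat able, 3 = very able, 4 = extremely able).
pre  = [3, 2, 3, 2, 3, 2, 2, 3, 2, 3, 2, 2]
post = [4, 3, 3, 3, 4, 3, 3, 4, 3, 4, 3, 3]

# zero_method="wilcox" (the default) drops tied pre/post pairs before
# ranking, which is common with coarse Likert data.
stat, p = wilcoxon(pre, post, zero_method="wilcox")
print(f"W = {stat}, p = {p:.4f}")
```

Because Likert differences produce many ties and zero differences, the handling of zeros (`zero_method`) can materially affect the result and is worth reporting alongside the p value.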

Program evaluation qualitative analysis: Two of the authors (Sheira Schlair and Lawrence Dyche) analyzed the responses to qualitative questions on the resident and faculty surveys (Appendices I & J) for common themes. We asked one open-ended question on the resident survey assessing lessons learned in the DCO, and two qualitative questions on the faculty program evaluation survey assessing program feedback.


Results
Our faculty development program includes two faculty instructors with backgrounds in educational methods and theory as well as advanced training in communication and facilitation skills. One facilitator is a psychiatric social worker, and the other an academic general internist with a master’s degree in medical education. Both have completed advanced facilitation skills training. Program evaluation data have been analyzed for 18 faculty learners who were junior to mid-career faculty members in ambulatory general internal medicine (mean years in practice = 5). Ninety percent were full time. The faculty found each of the sessions to be helpful and stimulating.

During the first year of the program in which data collection took place (2011-2012), we delivered faculty development to 36 faculty learners, 18 of whom completed all sessions and had DCOs following each session for which resident surveys were successfully completed. Eight faculty learners attended one session, five attended two sessions, and five attended all sessions but did not have the Montefiore DCO instrument and/or resident program evaluation survey completed successfully.

During the first year of the program, 62 residents completed 104 DCOs. Resident survey respondents by year were 33% PGY 1, 45% PGY 2, and 22% PGY 3, all based at an urban academic internal medicine residency program. Feedback behaviors were assessed on a 4-point Likert-type scale (1 = not able, 2 = somewhat able, 3 = very able, 4 = extremely able). Overall, resident learners reported a significant improvement in all areas of faculty feedback (see Table). On DCOs conducted by faculty learners following the faculty development exercise, the most significant improvements were in the focus (presession 2.85 vs. postsession 3.38; p = .006), timeliness (presession 2.77 vs. postsession 3.33; p = .003), learner centeredness (presession 2.70 vs. postsession 3.33; p = .005), and emotional sensitivity of faculty feedback (presession 2.82 vs. postsession 3.18; p = .01). In fact, these were all of the areas of concern cited in the initial Montefiore Medical Center Department of Medicine needs assessment of faculty and residents on the quality of directly observed clinical encounter feedback. The richest qualitative data emerged from the resident survey responses to the question “Please describe what lessons learned in this DCO you will apply to your clinical practice” (Appendix I). The most frequently cited themes included desires to slow down the pace, to prepare for the visit, to set an agenda that was negotiated with the patient, to work on motivational interviewing skills, and to learn how to focus the visit.

Table. Faculty View of Feedback Quality (Mean Scores on a 4-Point Likert-Type Scale, 1 = Not Able, 4 = Extremely Able)

Feedback quality domains rated:
  • Behavioral, not about personal characteristics
  • Balanced between positive and negative
  • Related to residents’ personal goals
  • Acknowledgment and addressing of emotions during the feedback process
  • Beneficial for the receiver
  • Effective knowledge of the doctor-patient communication literature

All ps < .05, except where indicated.
ᵃp < .01.

Faculty program evaluation data (Appendix J) were unanimously positive, with 100% of respondents stating that the goals were clearly stated and the time managed effectively. Faculty learners most frequently cited appreciation for the interactive and learner-centered nature of the content. In particular, they commented on the powerful impact of the role-play feedback skills practice as well as the transformative power of viewing and giving feedback on their colleagues’ recorded feedback conversations. No specific areas for improvement were noted on the faculty program evaluation surveys, other than several faculty learners requesting more of these training sessions. To date, nearly 100 faculty have completed this longitudinal training program, and the Montefiore DCO instrument is now used at all ambulatory teaching clinical sites on a regular basis.


Discussion
This longitudinal faculty development program proved effective in promoting faculty confidence in using the Montefiore DCO instrument and in enhancing faculty feedback delivery in our large, busy, urban academic internal medicine residency program. This has led to greater integration of DCO and feedback into ambulatory internal medicine teaching practices and more faculty and resident initiative to conduct DCOs and solicit feedback. This program and the Montefiore DCO instrument also facilitate documentation of resident DCO and feedback to meet residency review committee and ACGME requirements. This model has been successful because the curriculum is efficient, low maintenance, and portable/deliverable at local clinical sites (eliminating faculty learner commute time), and because the course facilitators were stable and well versed in physician-patient communication and in both DCO and feedback skills. Additionally, despite the study’s limitations and the small number of DCOs in our sample, the results were significant, and the program should be reproducible and similarly impactful in other settings. The curriculum and the Montefiore DCO instrument can be easily adapted to train faculty learners conducting DCO across the undergraduate and graduate medical education continuum.

Although the retrospective pre-post survey design has the advantage of convenience, it carries a significant limitation: recall bias regarding the learner’s perception of the quality of faculty feedback on his/her prior DCO. However, we believe that this was the best method to avoid survey fatigue, mitigate house staff status as vulnerable subjects, and promote the highest response rate given multiple competing demands.

We found that faculty learners’ variable availability and competing responsibilities were often a significant barrier to this longitudinal training program and were certainly the major cause of our study’s high attrition rate. Our study sample size was also limited because faculty who participated in the faculty development series but either did not complete a DCO or did not receive a DCO feedback evaluation could not be included in the analysis. The uniformly significant improvement across all feedback areas may partly reflect our small sample size, and it is unclear whether implementing this faculty development training model with a larger sample would yield different results.

Our project was implemented in one of the largest internal medicine residency programs in the country, located in an urban academic ambulatory care setting. This setting produces many potential challenges. Having implemented it in this complex setting, we believe that this program can likely be implemented anywhere as it is not resource heavy. We think that this model of faculty development not only is generalizable to smaller, community-based residency programs and undergraduate ambulatory clinical teaching settings, but would also be easier to administer and have potentially greater impact in such settings. It remains to be seen whether similar results of this faculty development training program would be seen in learners in other settings. Furthermore, the process and content of our faculty development model and DCO instrument can be adapted across physician specialties where the residents spend time in ambulatory settings. However, the DCO instrument would need revisions for specialties with different workflows, such as surgery or anesthesia.

We avoided major implementation pitfalls by offering faculty development sessions during lunchtime at local clinic venues, partnering with local and administrative champions (i.e., clinic faculty leaders, medical directors, and chief residents) to cofacilitate sessions, and putting sufficient time into clarifying learning goals and editing DVDs/digital videos to find appropriate segments to view during Session 3.

We have learned that limited faculty time for preparation of the videotape reviews (prior to Session 3) presents a significant but surmountable logistical challenge. We also have found that digital recording via cellular telephone or digital recording device greatly facilitates the recording process. Conducting multiple training sessions at individual clinic sites also requires significant time commitments from the session facilitators and faculty learners. Perceived lack of time to deliver feedback1,28 has remained an ongoing barrier to promoting faculty’s initiative in conducting DCOs. Partnering with administration and clinic-based local champions is key to ensuring that faculty complete these observations and that residents request them. Furthermore, our experience in implementing this faculty development program has demonstrated that using the feedback encounter as an observation and teaching tool for faculty learners truly promotes learner centeredness. Buy-in to use this process assumes that a feedback-friendly work culture at least partially exists, and that this culture can be fostered through the intervention itself and with the aid of expert facilitation skills. Finally, to further the field of DCO and feedback, large-scale and outcomes-driven longitudinal educational interventions aimed at defining and promoting national standards of feedback quality are essential.

Author Information

  • Sheira Schlair, MD, MS: Associate Professor of Medicine, Department of Medicine, Montefiore Medical Center; Associate Professor of Medicine, Department of Medicine, Albert Einstein College of Medicine
  • Lawrence Dyche, MSW: Retired Faculty and Associate Professor of Medicine, Department of Family and Social Medicine, Montefiore Medical Center
  • Felise Milan, MD: Professor of Medicine, Department of Medicine, Montefiore Medical Center; Professor of Medicine, Department of Medicine, Albert Einstein College of Medicine

Disclosures
None to report.

Funding/Support
None to report.

Informed Consent
All identifiable persons in this resource have granted their permission.

Ethical Approval
The Montefiore Medical Center Institutional Review Board approved this study.


References
  1. Archer JC. State of the science in health professional education: effective feedback. Med Educ. 2010;44(1):101-108. https://doi.org/10.1111/j.1365-2923.2009.03546.x
  2. Veloski J, Boex JR, Grasberger MJ, Evans A, Wolfson DB. Systematic review of the literature on assessment, feedback and physicians’ clinical performance: BEME Guide No. 7. Med Teach. 2006;28(2):117-128. https://doi.org/10.1080/01421590600622665
  3. Branch WT Jr, Paranjape A. Feedback and reflection: teaching methods for clinical settings. Acad Med. 2002;77(12, pt 1):1185-1188. https://doi.org/10.1097/00001888-200212000-00005
  4. Irby DM, Ramsey PG, Gillmore GM, Schaad D. Characteristics of effective clinical teachers of ambulatory care medicine. Acad Med. 1991;66(1):54-55. https://doi.org/10.1097/00001888-199101000-00017
  5. Sandars J. The use of reflection in medical education: AMEE Guide No. 44. Med Teach. 2009;31(8):685-695. https://doi.org/10.1080/01421590903050374
  6. Freese AR. Reframing one’s teaching: discovering our teacher selves through reflection and inquiry. Teach Teach Educ. 2006;22(1):100-119. https://doi.org/10.1016/j.tate.2005.07.003
  7. Kogan JR, Bellini LM, Shea JA. Have you had your feedback today? Acad Med. 2000;75(10):1041. https://doi.org/10.1097/00001888-200010000-00026
  8. Hauer KE, Holmboe ES, Kogan JR. Twelve tips for implementing tools for direct observation of medical trainees’ clinical skills during patient encounters. Med Teach. 2011;33(1):27-33. https://doi.org/10.3109/0142159X.2010.507710
  9. Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR; for International CBME Collaborators. The role of assessment in competency-based medical education. Med Teach. 2010;32(8):676-682. https://doi.org/10.3109/0142159X.2010.500704
  10. Donato AA, Pangaro L, Smith C, et al. Evaluation of a novel assessment form for observing medical residents: a randomised, controlled trial. Med Educ. 2008;42(12):1234-1242. https://doi.org/10.1111/j.1365-2923.2008.03230.x
  11. Holmboe ES, Ward DS, Reznick RK, et al. Faculty development in assessment: the missing link in competency-based medical education. Acad Med. 2011;86(4):460-467. https://doi.org/10.1097/ACM.0b013e31820cb2a7
  12. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review. JAMA. 2009;302(12):1316-1326. https://doi.org/10.1001/jama.2009.1365
  13. Hawkins RE, Margolis MJ, Durning SJ, Norcini JJ. Constructing a validity argument for the Mini-Clinical Evaluation Exercise: a review of the research. Acad Med. 2010;85(9):1453-1461. https://doi.org/10.1097/ACM.0b013e3181eac3e6
  14. Alves de Lima A, Conde D, Costabel J, Corso J, Van der Vleuten C. A laboratory study on the reliability estimations of the mini-CEX. Adv Health Sci Educ Theory Pract. 2013;18(1):5-13. https://doi.org/10.1007/s10459-011-9343-y
  15. Crossley J, Johnson G, Booth J, Wade W. Good questions, good answers: construct alignment improves the performance of workplace-based assessment scales. Med Educ. 2011;45(6):560-569. https://doi.org/10.1111/j.1365-2923.2010.03913.x
  16. ten Cate O. Trust, competence, and the supervisor’s role in postgraduate training. BMJ. 2006;333(7571):748-751. https://doi.org/10.1136/bmj.38938.407569.94
  17. Halman S, Dudek N, Wood T, et al. Direct Observation of Clinical Skills Feedback Scale: development and validity evidence. Teach Learn Med. 2016;28(4):385-394. https://doi.org/10.1080/10401334.2016.1186552
  18. Steinert Y, Mann K, Centeno A, et al. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide No. 8. Med Teach. 2006;28(6):497-526. https://doi.org/10.1080/01421590600902976
  19. Parish SJ, Weber CM, Steiner-Grossman P, Milan FB, Burton WB, Marantz PR. APPLIED RESEARCH: teaching clinical skills through videotape review: a randomized trial of group versus individual reviews. Teach Learn Med. 2006;18(2):92-98. https://doi.org/10.1207/s15328015tlm1802_1
  20. Wilkerson LA, Irby D. Strategies for effecting change in teaching practices: a review of current models. In: Scherpbier AJJA, van der Vleuten CRM, Rethans JJ, van der Steeg AFW, eds. Advances in Medical Education. Maastricht, the Netherlands: Springer; 1997:23-31.
  21. Skeff KM, Berman J, Stratos G. A review of clinical teaching improvement methods and a theoretical framework for their evaluation. In: Edwards JC, Marier RL, eds. Clinical Teaching for Medical Residents: Roles, Techniques, and Programs. New York, NY: Springer; 1988:92-120.
  22. Hawkins RE, Welcher CM, Holmboe ES, et al. Implementation of competency-based medical education: are we addressing the concerns and challenges? Med Educ. 2015;49(11):1086-1102. https://doi.org/10.1111/medu.12831
  23. Holmboe ES, Yamazaki K, Edgar L, et al. Reflections on the first 2 years of milestone implementation. J Grad Med Educ. 2015;7(3):506-512. https://doi.org/10.4300/JGME-07-03-43
  24. Swan R, Gigante J. Direct observation in an outpatient clinic: a new easier tool. MedEdPORTAL. 2010;6:7901. https://doi.org/10.15766/mep_2374-8265.7901
  25. Lamba S, Nagurka R. Tool for documenting clinical point-of-care direct observation and formative feedback. MedEdPORTAL. 2015;11:10093. https://doi.org/10.15766/mep_2374-8265.10093
  26. Schiller J, Hammoud M, Belmonte D, et al. Systematic direct observation of clinical skills in the clinical year. MedEdPORTAL. 2014;10:9712. https://doi.org/10.15766/mep_2374-8265.9712
  27. Nikels SM, Brandenburg S. Multi-source evaluation of resident physicians. MedEdPORTAL. 2012;8:9249. https://doi.org/10.15766/mep_2374-8265.9249
  28. Kogan JR, Hess BJ, Conforti LN, Holmboe ES. What drives faculty ratings of residents’ clinical skills? The impact of faculty’s own clinical skills. Acad Med. 2010;85(10)(suppl):S25-S28. https://doi.org/10.1097/ACM.0b013e3181ed1aa3


Schlair S, Dyche L, Milan F. Longitudinal faculty development program to promote effective observation and feedback skills in direct clinical observation. MedEdPORTAL. 2017;13:10648. https://doi.org/10.15766/mep_2374-8265.10648

Received: May 27, 2017

Accepted: October 9, 2017