Original Publication
Open Access

Developing Best Clinical Practices Through Outcomes Improvement: An Ongoing Quality Improvement Curriculum for Faculty and Residents

Published: February 6, 2018 | 10.15766/mep_2374-8265.10676

Appendices

  • Introductory Presentation.pptx
  • Proposal Template.docx
  • Oral Presentation Template.docx
  • Proposal Evaluation Template.pptx
  • Live Forum Evaluation Template.docx

All appendices are peer reviewed as integral parts of the Original Publication.

To view all publication components, extract (i.e., unzip) them from the downloaded .zip file.


Abstract

Introduction: Practice patterns in clinical learning environments are an important predictor of the patient care quality that residents will deliver after training. The Accreditation Council for Graduate Medical Education (ACGME) Clinical Learning Environment Review Evaluation Committee reported that from 2012-2015, residents and fellows rarely engaged in quality improvement (QI) activities. A QI curriculum was created for OB-GYN faculty and trainees to develop and implement best practices and study the resulting improvement in patient outcomes. Methods: Educational leadership in the Dell Medical School Department of Women’s Health designed a five-stage curriculum: (1) learning module describing the curriculum’s rationale, (2) clinical practice proposal development, (3) implementation/data analysis for selected proposals, (4) dissemination of proposals and outcomes during a live forum, and (5) evaluation. PGY1 and PGY4 OB-GYN residents collaborated in dyads with selected faculty mentors to draft evidence-based proposals. Dyads identified suggested outcome measures to be analyzed postimplementation. Remaining faculty analyzed outcomes from the previous year’s proposals with PGY2 and PGY3 OB-GYN residents. Results: Forum participants, including faculty, residents, nursing staff, and private obstetrician-gynecologists, evaluated the activity. In 2017, 15 (35%) completed the evaluation. All respondents intended to change their practice based on findings. In addition, the 2016 ACGME survey indicated significant increases in faculty perception of resident QI from 58% in 2014-2015 to 89% in 2015-2016 (p = .01) and in collaboration in scholarly activity from 50% to 85% (p < .01). Discussion: This curriculum was effective in engaging OB-GYN faculty and residents in formalized problem-based learning to address QI.


Educational Objectives

After completing this curriculum, learners will be able to:

  1. Engage faculty, residents, and private practitioners in formalized quality improvement initiatives.
  2. Propose evidence-based standardization for common procedures.
  3. Disseminate best clinical practice proposals to a target audience of key stakeholders.
  4. Evaluate best-practice proposals for integration into practice.


Introduction

Clinical patterns commonly practiced in teaching hospitals are an important predictor of the quality of patient care provided by residents long after they have completed training and as they advance through their careers.1 In 2012, the Accreditation Council for Graduate Medical Education (ACGME) established the Clinical Learning Environment Review (CLER) program to ascertain and address the quality of residency training in hospital sites affiliated with accredited sponsoring institutions.2,3 The ACGME CLER Evaluation Committee reported that from 2012-2015, “many teaching hospitals, medical centers, and ambulatory care practices were not consistently engaging residents and fellows” in opportunities for growth in the areas of health care quality and patient safety.4

The University of Texas at Austin Dell Medical School assumed institutional sponsorship of existing GME programs in Austin, Texas, in January 2015. A core focus of the new medical school is rethinking and redesigning health care that is inclusive and focused on value. In order to address the mission and vision of Dell Medical School and the concerns of the ACGME, educational leadership in the Department of Women’s Health (Obstetrics and Gynecology) identified an opportunity to create a year-round curriculum to engage faculty and trainees in the development of best practices that focus on quality, safety, value, and/or equity and to study the resulting improvement in patient outcomes. The curriculum was designed as a GME and continuing medical education (CME) activity to interactively engage OB-GYN providers, including faculty, residents, and private practitioners. Curricular elements included the development, dissemination, and evaluation of best-practice proposals. In addition, the format was designed to evaluate and discuss patient outcomes using associated statistical data.

This module offered an effective, problem-based approach to incorporating learners into the rapid-cycle quality improvement (QI) process while continuing to deliver evidence-based care. Although other QI curricula are available, none that we are aware of focus on the development, implementation, and patient outcomes of best-practice statements authored by resident/faculty dyads.5-8 Furthermore, this resource incorporated local private practitioners who also provided care at the same inpatient and outpatient facilities and who evaluated each proposal for feasibility of standardized implementation. This curriculum, though scalable, was designed to meet the needs of a surgical specialty, specifically OB-GYN.

Methods

The curriculum was created for OB-GYN faculty and residents using the six-step model.9 For faculty, it was intended to be used as a maintenance of certification (MOC) Part IV activity and a mechanism to update their current knowledge on a specific topic. For residents, it was a mandated resident educational activity and real-world application that supplemented the Institute for Healthcare Improvement Open School modules, which they were required by our institution to complete. The curriculum consisted of five phases: (1) a lecture or asynchronous module (Appendix A); (2) development of a mentored clinical practice proposal using a standard template (Appendix B); (3) implementation, data collection, and analysis for selected proposals; (4) dissemination of proposals and patient outcomes using a presentation template (Appendix C); and (5) evaluation of the proposals and live forum (Appendices D & E, respectively).

Lecture/Asynchronous Module
The rationale for the curriculum was described to the participants during a learning session to differentiate between best-practice statements and clinical guidelines, demonstrate a need to develop standardized practices, and describe the purpose of this QI and outcomes initiative.

Mentored Clinical Practice Proposal Development
PGY1 and PGY4 residents were required to submit project ideas based on the needs of the local health care system. Sources for these ideas included clinical observations, morbidity and mortality conferences, and journal club. Residents were assigned to dyads, with a faculty member acting as a project mentor. Faculty members were paired with residents through a selection process that matched their area of interest or expertise to the residents’ submitted ideas. These groups were provided with an outline for authoring best clinical practice proposals. The template included the following:

  • Proposal title and date.
  • Level of evidence/strength of recommendation.
  • Proposed best practice (500-word limit).
  • Literature review methods.
  • Implementation strategies and barriers to implementation.
  • Suggested outcome measures.
  • Checklist (if applicable).
  • Flow diagram (if applicable).

Proposals were drafted based on the most recent, best-available evidence using a scoring system for quality of evidence and strength of recommendation.10,11 Dyads were also asked to present an implementation strategy that included balancing and process measures and to identify foreseeable barriers such as time, cost, and availability of resources. Furthermore, they were required to suggest patient-outcome measures to be studied immediately following an implementation period of approximately 6 months. Checklists, where applicable, were included in proposals as an implementation tool to ensure that the appropriate steps for new processes or procedures were followed in the correct sequence.

Remaining residents and faculty were assigned to groups for data collection and analysis. These groups were to review and present the change in patient outcomes from the previous year’s implemented proposals.

Data Collection and Analysis for Implemented Proposals
PGY2 and PGY3 residents worked together in assigned teams, each with oversight from one or more faculty mentors, to perform a chart review and measure changes following implementation of the previous year’s proposals. Pre- and postimplementation data were reviewed to determine the degree of improvement in the quality of care and patient outcomes. Quantitative data associated with the use of the best practices were analyzed for defined events that had preexisting data.3,12 Process measures assessed the benefit to patients from the delivery of services following the introduction and implementation of the proposals.12 Outcome measures were also assessed to determine the level of improvement in patients’ health and, when available, patient satisfaction resulting from the modification of practice.12
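
The article does not specify the statistical methods used for these pre-/postimplementation comparisons. As a minimal sketch only, assuming a single outcome event rate tallied from chart review and a simple two-by-two analysis such as Fisher's exact test, a comparison could look like the following; all counts and variable names are hypothetical placeholders rather than data from this curriculum:

    # Minimal sketch (not the authors' analysis): compare one pre- vs.
    # postimplementation event rate from chart-review counts.
    # All counts below are hypothetical placeholders.
    from scipy.stats import fisher_exact

    pre_events, pre_total = 18, 120    # hypothetical events/charts before implementation
    post_events, post_total = 9, 115   # hypothetical events/charts after implementation

    # 2 x 2 table: rows = period, columns = event vs. no event
    table = [
        [pre_events, pre_total - pre_events],
        [post_events, post_total - post_events],
    ]

    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    print(f"Preimplementation rate:  {pre_events / pre_total:.1%}")
    print(f"Postimplementation rate: {post_events / post_total:.1%}")
    print(f"Fisher's exact test: OR = {odds_ratio:.2f}, p = {p_value:.3f}")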

Resident team leaders were selected based on original authorship of the best-practice proposal. Specifically, residents were designated to serve as project leaders of the proposals they had written the previous year in their dyads. One or more faculty mentors were assigned responsibility for guiding the data-collection plan, defining outcome measures, and facilitating statistical analysis.

In the initial year of curriculum implementation, all faculty members and residents were assigned to dyads to develop proposals. This increased the number of presented proposals from which to select for execution and analysis. Since data from previous years’ proposals were not available for review, this arrangement incorporated all OB-GYN residents and faculty into the process.

Dissemination of Proposals and Outcomes
Educational leadership selected local, private attendings who provided OB-GYN care within our clinical learning environments to voluntarily assess the proposals. These evaluators were assigned proposals based on their area of interest and expertise and asked to vet the feasibility of implementation. Using a standard evaluation form (Appendix D), they were invited to submit their ratings and comments to leadership prior to the live forum.

Proposals were disseminated during a live, half-day, department-wide forum. Oral presentations were delivered by the residents and were preceded by a grand rounds–style presentation from an invited keynote speaker. Each proposal presentation was limited to 10 minutes and was followed by a 5-minute interactive question-and-answer session with the audience. Question-and-answer sessions were facilitated by a faculty moderator.

To maximize the potential for interprofessional and multidisciplinary integration into practice, the target audience for the forum included all faculty, residents, nursing staff, and private attendings practicing OB-GYN in our hospital. All attendees at the live forum were permitted to give instant feedback during the question-and-answer sessions and completed a written proposal evaluation immediately following each presentation. Attendees were also invited to submit a written evaluation of the live forum at the end of the session to assess its overall educational value.

Logistical considerations for the live forum included securing a large space for 4 hours with required audiovisual capabilities to accommodate all attendees and presentations. Required audiovisual and computer equipment included a podium with microphone, a laptop computer with PowerPoint capabilities, a projector and screen, and a handheld microphone to facilitate audience interaction.

Evaluation of Proposals and Live Forum
The proposal presentations were judged using a multipronged approach. Selected private practitioners used a standard evaluation tool prior to the live forum (Appendix D). Participants in attendance at the forum used one evaluation tool for the proposal and a separate form to rate the forum (Appendices D & E, respectively). Feedback was also obtained through the interactive question-and-answer session immediately following each presentation. Written evaluations for the proposals included constructs such as quality and application of evidence, likelihood of incorporation of the proposal into clinical practice, and measurability of the outcomes.

Proposal evaluation: As stated previously, prior to the live forum, private practitioners evaluated new proposals using a standardized form to vet the feasibility of each proposal (Appendix D). Evaluators were assigned to proposals based on their area of interest and clinical expertise. Using the same form, attendees were also given the opportunity to provide input during the forum.

Live forum evaluation: Attendees were also invited to evaluate the activity for educational content, development of knowledge, change in competence, and intention to alter practice patterns based on new knowledge gained at the forum (Appendix E). The effectiveness of this activity was reviewed by the curriculum planning committee to inform changes to the format or dissemination methods, if necessary, for the following year.

Selection of Proposals for Future Study
Following the live forum, proposal evaluations from private practitioners were collated with the feedback received on the written evaluations from the live forum and information gathered during the question-and-answer sessions. The curriculum planning committee reviewed the integrated feedback and selected the most appropriate proposals to implement, study, and present the following year. Selected proposals were relayed to the forum participants, and progress was monitored through committee meetings and email communications.

Project Oversight and Other Resources
Other recommended resources for the development and implementation of this ongoing curriculum include a planning committee, faculty mentors, CME credit, MOC credit, and an annual time line.

Planning committee: Our curriculum planning committee provided oversight of the development of best clinical practices. The committee was cochaired by a private practitioner and a full-time faculty member. Administrative support was provided by departmental staff as needed. Committee membership consisted of appointed department faculty with an interest in both QI and education. Specific duties of the committee included the following:

  • Mapping each group’s progress against the time line of project milestones.
  • Facilitating the resolution of any issues arising during the development process.
  • Analyzing the evaluations and making the final selection of proposals to be implemented.
  • Maintaining responsibility for the oversight and planning of the live forum.
  • Overseeing the CME application and approval process and complying with all Accreditation Council for Continuing Medical Education standards and criteria.
  • Pursuing approval for MOC Part IV.

Faculty mentorship: All proposal mentors were faculty members who had participated in TeamSTEPPS training and possessed knowledge of the local institution’s systems and resources. This included familiarity with hospital leadership and administration, the local electronic medical record system, hospital protocols, and mechanisms in place for monitoring quality and safety. All selected mentors of best-practices proposals possessed the motivation, procedural expertise, and availability to actively participate with residents from conception through completion and dissemination of a proposal.

CME and MOC: This activity qualified for multiple CME credits.13 In addition, the activity was approved by the American Board of Obstetrics and Gynecology as an MOC Part IV—Improvement in Medical Practice activity.14

Annual time line: An annual curriculum time line as illustrated in Table 1 was developed for the Department of Women’s Health.

Table 1. Annual Time Line
Month               Task
October             Collect ideas for new proposals for best clinical practices.
October-November    Select proposal topics, mentors, and evaluators.
November            Announce proposal topics, mentors, and evaluators.
November-February   Dyads develop best clinical practice proposals.
                    Collect data for previous years’ implemented proposals.
March-April         Complete analysis of patient-outcomes data.
                    Develop oral presentations.
                    Community practitioners evaluate proposals.
                    Disseminate best clinical practice proposals at annual forum.
                    Disseminate changes in outcomes from implementation of previous year’s proposals.
April-May           Select best clinical practice proposals for implementation.
June                New study/activity period begins.

Results

Participants included 23 obstetrician-gynecologists and 20 OB-GYN residents from the Department of Women’s Health at Dell Medical School. Success of the activity was determined from several assessments. Following the first year of implementation, the ACGME faculty survey was used to gauge faculty perception of resident collaboration in QI. In the 2015-2016 survey, 89% of faculty respondents (n = 20) felt that residents had participated in QI initiatives, compared with 71% (n = 18) in 2013-2014 and 58% (n = 20) in 2014-2015. Statistical analysis indicated that the improvement in faculty perception of resident involvement in QI from 2014-2015 to 2015-2016 was statistically significant (p = .01).12
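
The article does not state which test produced the reported p value. As an illustrative sketch only, a two-proportion comparison of the survey results might be set up as below; the counts are rough approximations of the published percentages (58% and 89% of roughly 20 faculty respondents per year), not the actual survey data, and the choice of test is an assumption:

    # Illustrative only: the authors' actual test and raw counts are not
    # reported. Counts below approximate the published percentages.
    from statsmodels.stats.proportion import proportions_ztest

    agree = [12, 18]        # approx. faculty agreeing: 2014-2015, 2015-2016
    respondents = [20, 20]  # approx. survey respondents each year

    z_stat, p_value = proportions_ztest(agree, respondents)
    print(f"2014-2015 agreement: {agree[0] / respondents[0]:.0%}")
    print(f"2015-2016 agreement: {agree[1] / respondents[1]:.0%}")
    print(f"Two-proportion z-test: z = {z_stat:.2f}, p = {p_value:.3f}")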

The activity was also evaluated by participants upon conclusion of the forum (Appendix E). The evaluation tool was used to assess speakers and topics, the overall program, and the impact of the forum on individual practice patterns. Fifteen evaluations were returned, for a response rate of 35%, and the results were tabulated. Mean evaluation scores for speakers and topics and for the overall 2017 program appear in Table 2 and Table 3, respectively.

Table 2. 2017 Live Forum Speaker Evaluation Scores (N = 15)
Speaker Characteristic            Ma
Quality of presentation skills    3.79
Level of subject knowledge        3.85
Achieved learning objectives      3.83
Topic relevance to practice       3.89
aFour-point Likert scale (1 = Poor, 4 = Excellent).
Table 3. 2017 Live Forum Overall Quality Scores (N = 15)
Item                                                             Ma
The level of material was appropriate for the target audience.  3.93
The amount of material was appropriate for the format.          3.87
Program content was free from commercial bias.                  4.00
Teaching methods were appropriate for the program format.       3.93
There was adequate time for questions/audience interaction.     3.60
Information presented will be useful to my medical career.      3.93
aFour-point Likert scale (1 = Disagree, 4 = Agree).

Attendees at the live forum were also given the opportunity to provide open-ended responses regarding the impact of the forum on practice patterns and intention to change future practice. One hundred percent of respondents reported an intention to change their performance as a result of participation in the activity. Key themes drawn from responses follow. The total number of responses exceeds 15 because some evaluations included multiple answers.

  • Measure patient outcomes (three responses).
  • Utilize experience groups (two responses).
  • Improve patient care through changing practices (three responses).
  • Improve surgical techniques (four responses).
  • Follow recommendations for management or screening of conditions (13 responses).

Discussion

The ACGME mandates resident participation in QI initiatives. Pairing faculty and residents together to develop best-practice statements for OB-GYN was an effective ongoing format to engage both faculty and residents in QI initiatives. While the level of evaluation for the forum and proposals measured only perception and intent to change practice, the proposals selected for implementation provided an opportunity for the department to analyze and disseminate patient outcomes as a result of their efforts. We believe that the curriculum is scalable as a unique resource to build relationships amongst faculty, residents, and other practitioners and to develop accountability for processes through participating in formalized QI activities, proposing best practices, and evaluating the feasibility of proposals for integration into practice.

The curriculum is limited in that it was conducted for a single specialty at a single institution. In addition, it was resource intensive in terms of the time needed both to oversee an ongoing QI project and to conduct a live forum for the proposal presentations. It was also intensive in personnel, and other institutions may find it difficult to identify enough mentors to partner with residents. Furthermore, interprofessional team members, including physician assistants, nursing staff, and other health care providers, were not engaged in the proposal development process.

Lessons learned include encouraging residents to evaluate their mentors to aid in the continuing professional development of the faculty, enlisting the aid of residency coordinators in the planning process to help monitor residents’ progress, and incorporating residents into the initial planning stages of the curriculum to obtain their feedback and support for the process. Furthermore, proposal presentations were approximately 10 minutes each, with interactive question-and-answer sessions immediately following each presentation. Time constraints may dictate the need to hold multiple sessions over an extended period of time as a regularly scheduled didactic series rather than one extended half-day live forum.

Future studies should be conducted to determine the efficiency and effectiveness of the implementation process as well as to measure long-term changes in individual practice habits. As part of the ongoing assessment process, implemented practice proposals will be reevaluated for relevance and revision after 2 years or on an as-needed basis as new scientific evidence becomes available.


Author Information

  • Emily K. Vinas, EdD: Assistant Professor, Department of Women’s Health, University of Texas at Austin Dell Medical School; Director of Educational Strategy and Program Development, Department of Women’s Health, University of Texas at Austin Dell Medical School
  • Amanda B. White, MD: Assistant Professor, Department of Women’s Health, University of Texas at Austin Dell Medical School
  • Rebecca G. Rogers, MD: Professor, Department of Women’s Health, University of Texas at Austin Dell Medical School; Associate Chair of Clinical Integration and Operations, Department of Women’s Health, University of Texas at Austin Dell Medical School
  • Jeffrey J. Ridgeway, MD: Clinical Assistant Professor, Department of Women’s Health, University of Texas at Austin Dell Medical School
  • Amy E. Young, MD: Professor, Department of Women’s Health, University of Texas at Austin Dell Medical School; Chair, Department of Women’s Health, University of Texas at Austin Dell Medical School

Disclosures
None to report.

Funding/Support
None to report.

Prior Presentations
Vinas E, White A, Ridgeway J, Young A. Avoiding the poisoned apple-clinical guidelines forum: a unique strategy to engage residents, faculty, and community providers in continuous quality improvement. Abstract presented at: Council on Resident Education in Obstetrics and Gynecology & Association of Professors of Gynecology and Obstetrics Annual Meeting; March 3-8, 2017; Orlando, FL.

Ethical Approval
Reported as not applicable.


References

  1. Asch DA, Nicholson S, Srinivas S, Herrin J, Epstein AJ. Evaluating obstetrical residency programs using patient outcomes. JAMA. 2009;302(12):1277-1283. https://doi.org/10.1001/jama.2009.1356
  2. Weiss KB, Bagian JP, Nasca TJ. The clinical learning environment: the foundation of graduate medical education. JAMA. 2013;309(16):1687-1688. https://doi.org/10.1001/jama.2013.1931
  3. Weiss KB, Wagner R, Nasca TJ. Development, testing, and implementation of the ACGME Clinical Learning Environment Review (CLER) program. J Grad Med Educ. 2012;4(3):396-398. https://doi.org/10.4300/JGME-04-03-31
  4. Nasca TJ. Introduction to the CLER National Report of Findings 2016. J Grad Med Educ. 2016;8(2)(suppl 1):7-9. https://doi.org/10.4300/1949-8349.8.2s1.7
  5. Keefer P, Orringer K, Vredeveld J, Warrier K, Burrows H. Developing a quality improvement and patient safety toolbox: the curriculum. MedEdPORTAL. 2016;12:10385. https://doi.org/10.15766/mep_2374-8265.10385
  6. Reed D, Wittich C, Drefahl M, McDonald F. A quality improvement curriculum for internal medicine residents. MedEdPORTAL. 2009;5:7733. https://doi.org/10.15766/mep_2374-8265.7733
  7. Tad-y D, Price L, Cumbler E, Levin D, Wald H, Glasheen J. An experiential quality improvement curriculum for the inpatient setting—part 1: design phase of a QI project. MedEdPORTAL. 2014;10:9841. https://doi.org/10.15766/mep_2374-8265.9841
  8. Werner JA. An integrated, multimodal resident curriculum in patient safety and quality improvement. MedEdPORTAL. 2017;13:10641. https://doi.org/10.15766/mep_2374-8265.10641
  9. Thomas PA, Kern DE, Hughes MT, Chen BY, eds. Curriculum Development for Medical Education: A Six-Step Approach. 3rd ed. Baltimore, MD: Johns Hopkins University Press; 2016.
  10. Tricoci P, Allen JM, Kramer JM, Califf RM, Smith SC. Scientific evidence underlying the ACC/AHA clinical practice guidelines. JAMA. 2009;301(8):831-841. https://doi.org/10.1001/jama.2009.205
  11. Women’s Health Care Physicians. Reading the medical literature. American College of Obstetricians and Gynecologists website. http://www.acog.org/Resources-And-Publications/Department-Publications/Reading-the-Medical-Literature. Accessed May 1, 2017.
  12. Vinas E, White A, Ridgeway J, Young A. Avoiding the poisoned apple-clinical guidelines forum: a unique strategy to engage residents, faculty, and community providers in continuous quality improvement. Abstract presented at: Council on Resident Education in Obstetrics and Gynecology & Association of Professors of Gynecology and Obstetrics Annual Meeting; March 3-8, 2017; Orlando, FL.
  13. Performance Improvement Continuing Medical Education (PICME). American Medical Association website. https://www.ama-assn.org/education/performance-improvement-continuing-medical-education-pi-cme. Accessed May 1, 2017.
  14. Part IV—Improvement in Medical Practice. American Board of Obstetrics and Gynecology website. https://www.abog.org/new/information.aspx?cat=moc&id=6. Accessed December 1, 2017.


Citation

Vinas EK, White AB, Rogers RG, Ridgeway JJ, Young AE. Developing best clinical practices through outcomes improvement: an ongoing quality improvement curriculum for faculty and residents. MedEdPORTAL. 2018;14:10676. https://doi.org/10.15766/mep_2374-8265.10676

Received: June 26, 2017

Accepted: January 22, 2018