Original Publication
Open Access

Innovation to Dissemination Workshop: Selecting Outcome Measures to Translate Educational Innovations Into Scholarship

Published: October 9, 2018 | 10.15766/mep_2374-8265.10759

Appendices

  • Workshop Agenda.docx
  • Innovation to Dissemination Presentation.pptx
  • Handout for Participants.docx
  • Evaluation Template.docx

All appendices are peer reviewed as integral parts of the Original Publication.

To view all publication components, extract (i.e., unzip) them from the downloaded .zip file.


Abstract

Introduction: Curricular innovations are invaluable to the improvement of medical education programs, and thus, their dissemination to broader audiences is imperative. However, medical educators often struggle to translate innovative ideas into scholarly pursuits due to a lack of experience or expertise in selecting outcome measures that demonstrate impact. A recent national call for increased focus on outcome measures for medical education research highlights the need for more training in this area. Methods: We developed a 2-hour interactive workshop to improve educators’ ability to identify outcome measures for educational innovations. This workshop was delivered at a national pediatrics educational conference and at three local institutional faculty development sessions. Results: Participants were diverse in terms of experience, expertise, and roles within their educational programs. Participants rated the workshop positively in each setting and identified next steps in developing their own products of educational scholarship. Discussion: This workshop can provide faculty and faculty developers with a template for developing a skill set in identifying outcome measures and pairing them with educational innovations.


Educational Objectives

By the end of this activity, learners will be able to:

  1. Describe the purpose of identifying outcome measures for educational innovations.
  2. Identify an educational innovation that could be disseminated.
  3. Develop a preliminary research question related to the innovation.
  4. Outline the key components of effective outcome measures.
  5. Design a research plan for the innovation.


Introduction

Curricular innovations are invaluable to the improvement of medical education programs, and thus, their dissemination to broader audiences is imperative. The successful dissemination of an educational innovation requires alignment with Glassick’s criteria1 for scholarship through a clear articulation of the problem, consideration of solutions, discussion of outcomes, and reflective critique.2 Outcome measures have attracted increasing interest in medical education,3,4 raising the level of rigor required for the dissemination of educational research.

In recent years, medical schools have developed tracks for clinician-educators. In addition to demonstrating excellent teaching skills, these faculty face increasing demands to produce educational scholarship.5 While clinician-educators often strive to disseminate their work, selecting appropriate outcome measures can be challenging. Linking research questions to frameworks such as Miller’s pyramid of clinical competence or Kirkpatrick’s four-level training evaluation model offers a structured way to measure the outcomes of an educational innovation.6,7

Other authors have described faculty and trainee development workshops aimed at improving the experience and expertise of faculty and trainees in medical education in general,8,9 as well as in developing educational scholarship more specifically.8-12 However, published workshops generally focus globally on educational research methods rather than targeting outcome measures explicitly. One recently published workshop did incorporate the concept of outcome measures: Li, Gusic, Vinci, Szilagyi, and Klein developed a workshop to train faculty to write effectively and incorporated recognition of outcome measures, with illustrative examples, into the workshop.11 While the concepts of Miller’s pyramid and Kirkpatrick’s hierarchy were introduced in that workshop, application of these frameworks to scholarly work was not emphasized.

We developed and implemented an interactive 2-hour faculty development workshop to improve medical educators’ ability to select appropriate outcome measures to assess the impact of their innovative work. The target audience for this workshop was faculty and/or trainees who have innovative ideas but may be unfamiliar with choosing outcomes for their work. The content of the workshop was developed using Miller’s pyramid and Kirkpatrick’s hierarchy,6,7 Glassick’s criteria for describing scholarship,1 and published literature suggesting best practices for disseminating educational innovations.2 Collectively, these resources provided a framework for the organization and content included in the workshop.

Methods

This 2-hour workshop was selected through a rigorous peer-review process and was initially presented at the 2016 Council on Medical Student Education in Pediatrics (COMSEP) annual meeting. Subsequently, over the 2016-2017 and 2017-2018 academic years, the workshop was delivered at three institutions: the Johns Hopkins All Children’s Hospital (JHACH), the University of Nebraska Medical Center (UNMC), and the Virginia Commonwealth University School of Medicine (VCU). The contexts for these presentations were departmental or institutional faculty development conferences.

Workshop facilitators were members of the COMSEP Research Scholarship collaborative who had experience in publishing peer-reviewed educational innovations as well as presenting findings at local, regional, and national meetings. All were medical educators with expertise in educational research design, implementation, and dissemination. At some sites, additional facilitators with similar qualifications were recruited and trained at our home institutions to assist with instruction; facilitators were expected to have some prior experience with medical education research design. Faculty who presented the workshop at COMSEP had all been engaged in the workshop’s development; therefore, no specific training was provided to those facilitators. Additional facilitators recruited for local workshops were provided with the agenda (Appendix A), the slides (Appendix B), and the worksheet (Appendix C). They also received informal face-to-face training with one of the original facilitators prior to conducting the workshop locally.

Although the workshop was initially created for a national conference of medical educators in pediatrics, the content was not specialty specific. Following initial presentation at COMSEP, attendees at subsequent presentations included medical educators and clinical faculty from a variety of disciplines. The target audience for the curriculum was the busy clinical educator who implemented new curricula, innovations, or teaching strategies regularly but did not know how to evaluate the outcomes of this work. No prerequisite knowledge was required to attend the workshop.

To facilitate interaction during the workshop, the room setup should include a screen, a projector, and, if possible, round tables to encourage small-group interaction. At one institution, a small lecture hall was used, and participants worked in small groups of three or four. The workshop agenda (Appendix A) featured a suggested schedule for workshop activities, timing, and corresponding slides to orient the presenter. The slides used in the presentation were provided as well (Appendix B). Finally, a worksheet (Appendix C) was distributed to each attendee to allow for note-taking and to illustrate major take-home points from each component of the presentation.

The workshop centered on five learning objectives. By the conclusion of the workshop, a participant would be able to (1) describe the purpose of identifying outcome measures for educational innovations, (2) identify an educational innovation previously developed that could be disseminated, (3) develop a preliminary research question related to the innovation, (4) outline the key components of effective outcome measures, and (5) design a research plan for the innovation.

The instructional methods combined didactics with individual and small-group work. The workshop began with introductions of the speakers, followed by participants introducing themselves. The participants were asked to indicate their experience with disseminating scholarly work based on innovative educational interventions. This needs assessment gave the presenters a sense of participants’ backgrounds so that they could better assist during the working portions of the session. Then, the first didactic/discussion addressed the reasons medical educators should consider outcomes of their innovations and dissemination of the results. Various examples were discussed, along with how to map an innovation to Glassick’s criteria for scholarship.1 Following this discussion, participants were given time to brainstorm about possible innovations they were considering. Participants were offered the chance to share an idea they had in mind at their table or with the large group.

After this portion, the next didactic focused on contextualizing the proposed innovation relative to other similar innovations. This section highlighted the value of conducting a comprehensive literature search, including a search of nonmedical literature (e.g., other health professions, psychology, education). Findings from the literature search were then suggested as a means to help develop and refine a research question related to the participant’s innovation. Time was devoted to discussing the importance of crafting a research question for the purposes of specificity, data collection, and data analysis. In the second working section, participants drafted a research question. Once a question had been drafted, they were asked to share it with other participants and/or the facilitators for help refining it.

Key considerations for outcomes were addressed in the next section. Examples from the literature were highlighted to illustrate the levels of outcomes addressed for specific innovations. Miller’s pyramid6 and Kirkpatrick’s model7 were used as guiding frameworks to evaluate published innovations. We chose two articles with different outcome levels to highlight that while aiming for higher-level outcomes was desirable, outcomes of knowledge acquisition or increased confidence could also be published, depending on what had already appeared in the literature.
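
For facilitators who would like a compact reference while teaching this section, the short Python sketch below summarizes the levels of the two frameworks. The level names are standard, but the example measure paired with each level is our own illustrative assumption, not a mapping prescribed by the workshop materials.

```python
# Compact reference for the two guiding frameworks (Miller, Kirkpatrick).
# Level names are standard; the paired example measures are illustrative
# assumptions, not an authoritative or exhaustive mapping.

MILLERS_PYRAMID = {  # ordered lowest to highest
    "Knows": "multiple-choice knowledge test",
    "Knows how": "case-based clinical-reasoning exercise",
    "Shows how": "OSCE or simulated encounter",
    "Does": "direct observation in the workplace",
}

KIRKPATRICK_LEVELS = {
    1: ("Reaction", "post-session satisfaction survey"),
    2: ("Learning", "pre/post knowledge or confidence assessment"),
    3: ("Behavior", "observed change in clinical practice"),
    4: ("Results", "patient- or system-level outcomes"),
}

if __name__ == "__main__":
    print("Miller's pyramid (lowest to highest):")
    for level, example in MILLERS_PYRAMID.items():
        print(f"  {level}: e.g., {example}")
    print("Kirkpatrick's model:")
    for num, (name, example) in sorted(KIRKPATRICK_LEVELS.items()):
        print(f"  Level {num} ({name}): e.g., {example}")
```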

The next section provided a list of typical outcome measures (e.g., global rating scales, reflective essays, direct observation) used in medical education research. Illustrative examples were offered for six of these measures. One example was assigned to each small group, and the group was asked to evaluate its measure in terms of use, strengths, weaknesses, and level of outcome (using Miller’s and Kirkpatrick’s models). The small groups then reported their findings to the large group. Next, the facilitator compared and contrasted the measures with the group. Learners were asked to critically reflect on their own innovations and consider which outcome measures might best fit their own work.

To conclude the workshop, we addressed how to identify existing instruments in the literature that could be used or adapted by the learner. We discussed resources such as MedEdPORTAL’s Directory and Repository of Educational Assessment Measures (which has subsequently been incorporated into the MedEdPORTAL collection) and conducted PubMed searches to identify other instruments. Finally, discussion centered on published instruments outside of medicine and how to search for such tools.
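
As a concrete illustration of this step, the sketch below queries PubMed through NCBI’s public E-utilities API (the esearch endpoint). The search term is a hypothetical example of looking for validated instruments; real use should follow NCBI’s usage guidelines (e.g., rate limits).

```python
# Minimal sketch: searching PubMed for assessment instruments via NCBI's
# E-utilities esearch endpoint. The query string is a hypothetical example.
import json
import urllib.parse
import urllib.request

ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"


def search_pubmed(term, retmax=20):
    """Return a list of PubMed IDs (PMIDs) matching the search term."""
    params = urllib.parse.urlencode(
        {"db": "pubmed", "term": term, "retmax": retmax, "retmode": "json"}
    )
    with urllib.request.urlopen(f"{ESEARCH_URL}?{params}") as response:
        result = json.load(response)
    return result["esearchresult"]["idlist"]


if __name__ == "__main__":
    # Hypothetical query for validated instruments assessing teaching skills.
    query = '"teaching skills" AND (instrument OR scale) AND validation'
    pmids = search_pubmed(query)
    print(f"Found {len(pmids)} PMIDs (up to retmax); first few: {pmids[:5]}")
```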

The workshop was evaluated at the COMSEP meeting as well as locally. COMSEP provided its standard evaluation template (Appendix D) for workshops. Locally, continuing education offices used their own evaluation tools. The national and local evaluation tools featured a combination of scaled items and qualitative feedback. Evaluations for each presentation were anonymous and collected at the end of the presentation. Attendee feedback was considered after each workshop, and modifications were made for subsequent iterations of the workshop.

Results

Approximately 25 participants attended the workshop at the COMSEP annual meeting in March 2016. Participants were either pediatric departmental faculty or pediatric clerkship administrators and represented the spectrum from junior faculty to senior faculty, with leadership positions in the department and/or school of medicine. Fifteen participants completed the external evaluations provided by COMSEP. These evaluations asked participants to rate on a 5-point scale (1 = strongly disagree, 5 = strongly agree) their agreement with three statements: “Overall, the workshop was effective,” “The speakers were effective,” and “I achieved my learning goals for this session.” The means and standard deviations for each prompt were the same: 4.30 and 0.49, respectively. In addition to the quantitative prompts, participants were asked to provide narrative comments regarding the most helpful features of the workshop and suggestions for making the workshop more effective in the future. A total of five responses were received to these questions. One suggestion for improvement was the inclusion of a list of target journals for dissemination. Finally, participants were asked to commit to one idea resulting from the workshop. Seven people responded to this prompt, which resulted in a variety of comments ranging from planning a research question to choosing outcome measures to developing other faculty. Minor modifications were made to the structure of the workshop based on participant feedback.

The workshop was subsequently conducted at JHACH (October 2016), VCU (February 2017), and UNMC (October 2017). The total number of participants ranged from 10 at VCU and UNMC to 17 at JHACH. All participants were faculty at the respective institutions. The disciplines represented were diverse (medical, surgical, and basic and clinical sciences), although specific demographic data were not collected. The evaluation forms used at JHACH and VCU were the same and were based on the format used at COMSEP. UNMC used a standard format for all of its faculty development workshops; the UNMC questions therefore differed slightly from those used at the other two sites. In each iteration, the majority of participants thought that the workshop and speakers were effective and that the workshop was a worthwhile investment in their professional development. The majority also reported learning new knowledge and skills, intended to apply them, and found the workshop relevant to their professional roles. The quantitative evaluation data from JHACH and VCU are provided in Table 1. The quantitative data from UNMC are provided in Table 2.

Table 1. Internal Evaluations Obtained at Johns Hopkins All Children’s Hospital and Virginia Commonwealth University (n = 27)

Prompt                                                                       M (SD)a
Overall, the workshop was effective.                                         4.63 (0.49)
Overall, the speakers were effective.                                        4.65 (0.49)
The format of this activity was appropriate for its content.                 4.67 (0.55)
This activity was a worthwhile investment in my professional development.    4.70 (0.47)
I learned new knowledge and skills from this activity.                       4.21 (0.79)
I will apply the knowledge and skills.                                       4.52 (0.58)
This activity is relevant to my professional role.                           4.59 (0.50)

aRated on a 5-point scale (1 = strongly disagree, 5 = strongly agree).

Table 2. Internal Evaluations Obtained at University of Nebraska (n = 10)

Prompt                                                                                                                 M (SD)a
Before this event, my level of knowledge about this topic was adequate.                                                3.0 (0.94)
After this event, my level of knowledge about this topic was enhanced.                                                 4.4 (0.52)
Following the event, I will be better able to identify an educational innovation I have developed which could be disseminated.    4.6 (0.52)
Following the event, I will be better able to develop a preliminary research question which relates to my innovation.             4.8 (0.42)
Following the event, I will be better able to design a research plan for my innovation.                                4.5 (0.53)
The speaker was an effective presenter in this subject area.                                                           4.7 (0.48)

aRated on a 5-point scale (1 = strongly disagree, 5 = strongly agree).
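
For readers unfamiliar with how table entries like these are computed, the brief sketch below derives M (SD) for a single prompt from hypothetical 5-point responses; the values shown are invented for illustration and are not the study’s actual data.

```python
# Minimal sketch: summarizing one Likert-scale evaluation item as M (SD).
# The responses below are hypothetical, not the study's actual data.
from statistics import mean, stdev

# Hypothetical 5-point ratings (1 = strongly disagree, 5 = strongly agree)
# for a single prompt, e.g., "Overall, the workshop was effective."
responses = [5, 4, 5, 5, 4, 5, 4, 5, 5, 4]

m = mean(responses)
sd = stdev(responses)  # sample standard deviation (n - 1 in the denominator)
print(f"M (SD) = {m:.2f} ({sd:.2f})")
```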

All sites’ evaluations also included an opportunity for narrative comments. In their responses, participants noted several concepts they planned to apply in their work, including designing research plans using an innovative approach, using Miller’s and Kirkpatrick’s frameworks to consider existing outcome measures, and improving on outcomes to further patient care. Suggested areas for improvement included the timing/duration of the workshop (e.g., “noon to 2pm is challenging”), the lack of protected time for research, and offering the workshop earlier in the careers of junior faculty.

Discussion

As stated by Chen, Bauchner, and Burstin, “the primary goal of medical education is to produce physicians who deliver high-value care.”3 It is therefore critical to evaluate medical education research based on the outcomes it produces, not only for the development of trainees’ knowledge and skills but ultimately for their patients. While previous work has offered general guidance on developing research questions and disseminating educational scholarship at a global level,8-12 this workshop provides a focused, deliberate effort to help scholars identify outcome measures that align with their innovative research questions. The workshop has been well received at one national and three local presentations to faculty of various disciplines. Thus, it has demonstrated feasibility and adaptability on a multi-institutional level and has been refined based on learner feedback. Furthermore, we have demonstrated that instructors who were not part of the original workshop can be trained to implement it.

While the focus of this workshop is primarily on the selection of outcome measures, we purposely designed it with a broad lens to include some introductory material on scholarship in general and the development of the research question. We made these decisions in recognition of where the workshop would fit within the careers of our attendees. For many of our participants, the workshop was their first exposure to medical education scholarship. Exposure to the tenets of sound medical education scholarship is critical before learners embark on their own research plans and outcome measures. Feedback from our participants resonated with this sentiment, and some asked for even more time to discuss general principles of research question development.

The structure of this workshop can be modified to fit the resource constraints of the setting and context for participants and facilitators. For example, we have presented this workshop with as few as one facilitator and as many as seven. With smaller numbers, the facilitator migrated from table to table during small-group sessions to capture the essence of the conversation and only intermittently interjected. With a larger number, each facilitator was positioned within a small group to help guide the conversation and recap to the larger group during debriefs. In addition, the workshop has been delivered as either a faculty development series or a stand-alone workshop at a national conference. Based on the context of the workshop and the characteristics of the learners, more or less time could be spent on the discussion of fundamental principles of writing research questions. If the workshop were delivered in the context of a larger faculty development series, the content could be adjusted to focus more explicitly on outcome measure selection.

There are several limitations to this work. First, measurement of the workshop’s impact was limited to participant reactions. While reactions represent a lower tier of impact measurement, our results demonstrate the value and feasibility of this learning intervention, and value and feasibility are needed before proceeding to higher-level measures. In addition, we demonstrated feasibility with medical educators of different disciplines. The optimal impact measure would be to determine whether participants are able to apply the selection of outcome measures to their own work, which future studies should consider. Second, our guide best suits quantitative medical education research and quantitative outcome measures for evaluating such scholarship. Further refinement of the workshop could include more discussion of qualitative and/or mixed methodology.

In summary, this workshop and corresponding facilitator’s guide address an important gap in medical education research: specifically, the selection of outcome measures. The guidance provided by the workshop is grounded in sound conceptual frameworks and assessment/learning theory. We suggest that faculty developers consider the use of content from this resource to promote faculty development of clinician-educators who seek guidance in the dissemination of their scholarly innovations.


Author Information

  • Michael S. Ryan, MD, MEHP: Associate Professor, Department of Pediatrics, Virginia Commonwealth University School of Medicine
  • Patricia D. Quigley, MD, MME: Assistant Professor, Department of Pediatrics, Johns Hopkins All Children’s Hospital
  • Clifton C. Lee, MD: Associate Professor, Department of Pediatrics, Virginia Commonwealth University School of Medicine
  • Ian Chua, MD: Fellow, Department of Pediatrics, Children’s National Medical Center
  • Caroline Rose Paul, MD: Assistant Professor (CHS), Department of Pediatrics, University of Wisconsin School of Medicine and Public Health
  • Joseph Gigante, MD: Professor, Department of Pediatrics, Vanderbilt University School of Medicine
  • Gary Beck Dallaghan, PhD: Director of Educational Scholarship, Office of Medical Education, University of North Carolina at Chapel Hill School of Medicine

Disclosures
None to report.

Funding/Support
None to report.

Prior Presentations
Ryan MS, Beck Dallaghan G, Chua I, et al. Innovation to dissemination: selecting outcome measures to translate educational innovations into scholarship. Workshop presented at: Council on Medical Student Education in Pediatrics Annual Meeting; April 2016; St. Louis, MO.

Ethical Approval
Reported as not applicable.


References

  1. Glassick CE. Boyer’s expanded definitions of scholarship, the standards for assessing scholarship, and the elusiveness of the scholarship of teaching. Acad Med. 2000;75(9):877-880.
  2. Kanter SL. Toward better descriptions of innovations. Acad Med. 2008;83(8):703-704. https://doi.org/10.1097/ACM.0b013e3181838a2c
  3. Chen FM, Bauchner H, Burstin H. A call for outcomes research in medical education. Acad Med. 2004;79(10):955-960.
  4. O’Malley PG, Pangaro LN. Research in medical education and patient-centered outcomes: shall ever the twain meet? JAMA Intern Med. 2016;176(2):167-168. https://doi.org/10.1001/jamainternmed.2015.6938
  5. Sherbino J, Frank JR, Snell L. Defining the key roles and competencies of the clinician–educator of the 21st century: a national mixed-methods study. Acad Med. 2014;89(5):783-789. https://doi.org/10.1097/ACM.0000000000000217
  6. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9)(suppl):S63-S67.
  7. Kirkpatrick DL. Evaluation of training. In: Craig RL, Bittel LR, eds. Training and Development Handbook. New York, NY: McGraw-Hill; 1967:87-112.
  8. Martin SK, Ahn J, Farnan JM, Fromme HB. Introduction to curriculum development and medical education scholarship for resident trainees: a webinar series. MedEdPORTAL. 2016;12:10454. https://doi.org/10.15766/mep_2374-8265.10454
  9. Williams R, Holaday L, Lamba S, Soto-Greene M, Sanchez JP. Introducing trainees to medical education activities and opportunities for educational scholarship. MedEdPORTAL. 2017;13:10554. https://doi.org/10.15766/mep_2374-8265.10554
  10. Uijtdehaage S, Kalishman S, O’Sullivan P, Robins L. How to succeed as a medical education scholar: identifying your individual strategy and creating a roadmap for scholarship. MedEdPORTAL. 2013;9:9472. https://doi.org/10.15766/mep_2374-8265.9472
  11. Li S-TT, Gusic ME, Vinci RJ, Szilagyi PG, Klein MD. A structured framework and resources to use to get your medical education work published. MedEdPORTAL. 2018;14:10669. https://doi.org/10.15766/mep_2374-8265.10669
  12. Lebeau R. Guiding educational research projects: activity-based workshops on writing a literature review and developing research questions. MedEdPORTAL. 2015;11:10143. https://doi.org/10.15766/mep_2374-8265.10143


Citation

Ryan MS, Quigley PD, Lee CC, et al. Innovation to dissemination workshop: selecting outcome measures to translate educational innovations into scholarship. MedEdPORTAL. 2018;14:10759. https://doi.org/10.15766/mep_2374-8265.10759

Received: May 18, 2018

Accepted: August 31, 2018