Original Publication
Open Access

Satisfaction Academy: A Novel Residency Curriculum to Improve the Patient Experience in the Emergency Department

Published: August 15, 2018 | 10.15766/mep_2374-8265.10737


  • Satisfaction Academy Curriculum Outline.pdf
  • Faculty Instructional Guide.docx
  • Presurvey.docx
  • Introduction to the Satisfaction Academy.pptx
  • Courtesy and Respect.pptx
  • Quarter 1 Cases.docx
  • Therapeutic Interventions.pptx
  • Quarter 2 Cases.docx
  • Communication.pptx
  • Quarter 3 Cases.docx
  • Quality of Care.pptx
  • Quarter 4 Cases.docx
  • Postsurvey.docx

All appendices are peer reviewed as integral parts of the Original Publication.

To view all publication components, extract (i.e., unzip) them from the downloaded .zip file.


Introduction: Patient satisfaction is a key indicator of health care value and an increasingly important metric used to assess emergency physician performance and often reimbursement. To our knowledge, there is no standardized curriculum within emergency medicine (EM) residency programs that focuses on the patient experience in EM. Methods: Our novel resident curriculum is an organized approach to enhancing patient-centered care by optimizing the patient experience. It spans the academic year, with key topics organized into a quarterly time line. Topics include physician courtesy and respect, pain management, discussion of diagnostic and therapeutic interventions, timely communication, and delivery of quality care. Each quarter has three components: introduction/didactics, an interactive workshop, and stories and reflection. The instructional methods used include didactic lectures, role-playing, and group reflection and storytelling. Results: Of 44 participants, 54.5% completed a preintervention survey, and 45.5% completed a postintervention survey. The surveys consisted of 5-point Likert scales measuring degree of agreement with statements that reflected desired behaviors and/or attitudes. On the postintervention survey, participants gave scores indicating general agreement with desired behaviors including sitting at the bedside, acknowledging all persons in the room, and giving an anticipated disposition, as well as with feeling more knowledgeable about patient satisfaction. Discussion: Our Satisfaction Academy has filled a significant gap related to enhancing the patient experience. This curriculum is generalizable to other EM residency programs, and the interactive peer-to-peer format is both engaging and customizable.

Educational Objectives

By the end of this curriculum, learners will be able to:

  1. Identify areas for improvement in the patient experience within their emergency department.
  2. Implement bedside skills and approaches to overcome barriers to patient satisfaction.
  3. Rehearse and integrate new techniques through group discussion and workshops.
  4. Use stories and reflection as a group to build a collective culture that values patient satisfaction and the desired behaviors described.


Introduction

Patient satisfaction as measured by postdischarge surveys is embraced by the Centers for Medicare and Medicaid Services as a key indicator of health care value, with increasing financial implications for both individual providers and hospitals.1 Studies show that a positive patient experience correlates with patient safety and clinical effectiveness across a wide spectrum of practice environments.2-4 Thus, patient satisfaction and the patient experience are increasingly important metrics used to assess emergency physician performance, and they sometimes affect physician compensation.

While employee training in customer service is common in many industries, the medical profession has only recently embraced training programs geared toward improving provider performance with patient satisfaction. Early studies of training programs for emergency department providers demonstrated improved satisfaction survey scores, fewer patient complaints, and more compliments.5,6 Despite this evidence and the growing importance of patient satisfaction in the practice landscape, there is a paucity of formal training in patient satisfaction for residents in emergency medicine (EM). A recent prospective survey analysis of the Council of Emergency Medicine Residency Directors membership found that only 35% of residencies had an organized patient experience curriculum.7 This suggests a profound skill gap for graduates of EM residencies as they enter independent practice.

In an effort to address these problems, we present a novel curriculum called Satisfaction Academy to educate EM resident physicians on patient-centered care and optimizing patient satisfaction. Our intended audience is EM residents, and our curriculum is specifically designed to be given during weekly resident conference time. Participation of EM faculty, students, and other ancillary staff (e.g., nursing) may enrich the learning experience, but these groups are not the primary audience of this course.

Our multimodal curriculum spans the academic year and incorporates lectures, workshops, and small-group sessions. In addition, the program has been designed to be customizable so as to serve as a model for other residency programs both within and outside EM. Unlike a previous publication that described a 4-week resident course on patient satisfaction,8 our curriculum is more comprehensive and designed to be taught over a 12-month time line (July-June of an entire academic year). It has also been designed with an emphasis on experiential learning. By utilizing spaced repetition of patient experience motifs via email reminders, conference presentations, and group discussion over the course of 12 months, our aim was to entrain the patient experience mind-set among all resident providers. In the initial implementation of this curriculum, we sought to describe and measure resident attitudes and behaviors both before and after the intervention. In the future, we will seek to obtain actual patient satisfaction data or resident ratings based on patient survey data; in the present case, we were limited to department-level data, and there were too many confounders to draw firm conclusions about the impact of our curriculum on patient satisfaction. Currently, resident-specific patient satisfaction data are not collected at our institution.

The purpose of this resource is to educate residents on the importance of patient satisfaction and the patient experience to their practice, as these may have both interpersonal and reimbursement implications. For example, patients who are satisfied with their emergency department experience are usually less likely to file complaints and more likely to engage with their provider than are patients who have a bad emergency department experience. In addition, some practices may tie individual provider satisfaction ratings to compensation, meaning that providers with higher patient satisfaction ratings may have a higher earning potential than providers with lower satisfaction ratings. This resource also serves to improve the doctor-patient relationship by focusing on key aspects of the EM patient encounter. To our knowledge, there are few if any longitudinal curricula in EM residency programs that focus on improving the patient experience.


Methods

We developed a longitudinal curriculum (Appendix A) aimed at educating residents on how to improve the patient experience in our emergency department by modifying resident behaviors and attitudes. Examples of these included bedside tools such as AIDET (acknowledge, introduce, duration, explain, thank), sitting at the bedside while interviewing patients, discussing non-narcotic pain medication options, reviewing discharge instructions with patients, and communicating quality of care delivered. The structure and the longitudinal nature of this curriculum provided repetition and reinforced the mission to improve patient-centered care and patient satisfaction.

The curriculum was inspired by a desire to address departmental patient satisfaction. Patient survey results were analyzed, and a longitudinal course was developed with emphasis on four pillars of patient satisfaction derived from that analysis: courtesy and respect, therapeutic interventions, communication, and quality of care. The curriculum was implemented during the 2016-2017 academic year as part of the EM weekly resident conference, as we felt this venue would be the most efficient and impactful way to deliver the curriculum content and achieve our educational objectives. A total of 12 sessions were held over the 12-month academic year: four introduced each of the four topics covered in the course, four were small-group discussion workshops that included opportunities to role-play using the skills and behaviors taught, and four were devoted to review and reflection. An instructional guide has been included to aid faculty or other facilitators of the content; see Appendix B for further information. Participants included 44 EM residents, EM academic faculty, and rotating Wake Forest University School of Medicine students.

The content was created as a four-part series to be delivered throughout the academic year during the weekly resident conference. Each quarter focused on a single aspect of patient satisfaction. Within each quarter were three distinct sections, one per month. The first section of each quarter introduced the topic for that quarter using a 20-minute didactic format. During the second section, conference attendees (i.e., residents, students, faculty) were divided into groups of four to eight to discuss the cases and scenarios assigned to that topic. Those scenarios were role-played, and feedback was given on the interaction. The final section in each quarter was an audience-led experience during which faculty and residents shared success stories and thoughts about how they had been able to incorporate the newly learned skills into their clinical practice. The curriculum design was structured as follows:

  • Twelve months of the academic year (July-June).
  • Four quarters:
    • Quarter 1: July-September.
    • Quarter 2: October-December.
    • Quarter 3: January-March.
    • Quarter 4: April-June.
  • Three sections within each quarter (each section covering 1 month in the quarter):
    • Section 1: introduction/didactics.
    • Section 2: workshop.
    • Section 3: reflection/discussion.

Each subsequent quarter followed the same pattern, as visualized in the Figure.

Figure. Satisfaction Academy outline.

Beginning in the first month of the resident academic year (i.e., July) and prior to the introductory lecture to the curriculum, we administered a preintervention survey to assess perceptions, attitudes, and behavioral practices regarding patient satisfaction. See Appendix C for the preintervention survey questions. We used the secure web application REDCap to create and administer the survey. We administered our survey only to EM residents in our program. We allowed approximately 2-3 weeks for residents to complete and return the survey. Then, we collected the data and calculated the mean responses to the survey items on a 5-point Likert scale (1 = never, 5 = always), as well as determining the 95% confidence interval (CI) for each survey item.
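The per-item summary statistics described above can be sketched in a few lines of code. This is an illustrative example only, using hypothetical responses rather than our survey data; the normal-approximation (z = 1.96) confidence interval shown here is an assumption, as the CI method is not specified in this report.

```python
import math

def likert_summary(responses, z=1.96):
    """Mean and normal-approximation 95% CI for a list of 1-5 Likert responses."""
    n = len(responses)
    mean = sum(responses) / n
    # Sample variance (n - 1 in the denominator)
    variance = sum((x - mean) ** 2 for x in responses) / (n - 1)
    se = math.sqrt(variance / n)  # standard error of the mean
    return mean, (mean - z * se, mean + z * se)

# Hypothetical responses to a single survey item (1 = never, 5 = always)
responses = [3, 4, 3, 5, 2, 4, 3, 4, 3, 3]
mean, (lo, hi) = likert_summary(responses)
print(f"{mean:.1f} ({lo:.1f}-{hi:.1f})")  # reported in the tables as M (95% CI)
```

For the small samples here (n = 20-24), a t-based interval would be slightly wider than this z-based sketch.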

Quarter 1 of the curriculum began in July. It focused on conveying courtesy and respect to patients and their family and friends while establishing rapport within the first 5 seconds of the emergency department encounter. Following administration of the preintervention survey, we delivered the introductory lecture for the curriculum. See Appendix D for the introductory PowerPoint. The introductory lecture and the first of the four topical introductory lectures could be given within the same 30-minute time block during resident conference in July. The curriculum introduction did not take more than 5-10 minutes and served as a general introduction that stated the vision, goals, and objectives of the course. Then, the introduction to the first of the four core topics, titled “Courtesy, Respect, and the First 5 Seconds,” was delivered. See Appendix E for this PowerPoint.

During the next month (August), a 1-hour block of time during weekly resident conference was scheduled to review the cases in Appendix F during small-group workshops that corresponded to the Quarter 1 topic. Residents, faculty, students, and other attendees were divided into groups of four to eight. Each group discussed each of the five parts of the AIDET tool. We selected five cases—one for each component of the AIDET tool—for residents to learn how to effectively use this tool at the bedside. Faculty or senior residents served as group facilitators, and participants acted out the scenarios using the learned behaviors. We allotted 10-15 minutes at the end of the hour for closing comments or thoughts from the groups. The goal of this closing time was to allow people to discuss any identified awkwardness, successes, or difficulties while going through the cases. These simulated-case discussions were intended to help emulate and mitigate some of the barriers encountered during real patient interactions.

We scheduled a 30-minute block during the next month (September) for stories and reflection on Quarter 1. An open-floor discussion format was felt to be appropriate, as this allowed residents and faculty to share thoughts, successes, failures, and general observations after having implemented the material learned during the previous 2 months. Alternatively, a panel of residents could volunteer or be selected ahead of time to discuss their thoughts and share any anecdotes during this time.

Quarter 2 began in October. As with Quarter 1, a 30-minute time slot was selected during one of the resident conferences to deliver the second topical introductory lecture, “Therapeutic Interventions.” See Appendix G for this PowerPoint. Delivery of the content did not exceed 20-25 minutes in order to allow time for questions or comments.

The following month (November), a 1-hour block was used during resident conference to review the cases in small groups for Quarter 2. See Appendix H for these cases. We selected the cases in this quarter to reflect typical patient encounters revolving around pain management, common emergency department procedures, and navigating patient expectations. Residents could be assigned individual cases ahead of time (i.e., before conference) by PGY level or by groups. For example, PGY-1 residents could answer the first question in each case, PGY-2 residents could answer the second question in each case, and PGY-3/PGY-4 residents could answer the remaining questions in each case. Again, approximately 10-15 minutes were allowed at the end of the workshop for any closing thoughts or comments from the groups.

Quarter 2 ended the next month (December) with a 30-minute time slot during conference for residents and faculty to share stories and reflect on the Quarter 2 content. We again used an informal, open-floor format to allow everyone to reflect candidly on their successes as well as on barriers encountered in their own practice.

Quarter 3 began in January with the introduction to the topic “Communication and Closing the Encounter” during a 30-minute time slot. See Appendix I for this PowerPoint.

During the following month (February), a 1-hour block was scheduled during resident conference to discuss the cases in Quarter 3. See Appendix J for these cases. They were chosen to allow residents to practice closing various patient encounters, including how to discuss admission, discharge instructions, and return precautions with patients. Each group discussed all the cases as time allowed. We allotted time at the end for closing comments or thoughts from the groups.

The next month (March) closed Quarter 3 with a 30-minute period during resident conference for reflection and stories from the previous 2 months about communication and closing encounters.

Quarter 4 began in April with the topic “Recap and Quality Care,” delivered during a 30-minute conference time slot. See Appendix K for this PowerPoint. This topic allowed for a recap of the curriculum to date and touched on key points from the previous topics, including AIDET, how to discuss pain management with patients, and how to communicate with patients and close the emergency department encounter. This topic also served as an introduction to communicating to patients the quality of care being delivered to them.

In May, the cases for Quarter 4 were discussed and practiced in small groups during a 1-hour block during resident conference. See Appendix L for the Quarter 4 cases. These cases allowed for role-playing and were intended to highlight various ways to communicate quality care to patients and families in the emergency department in commonly encountered situations. PGY-1 residents were assigned the doctor role, PGY-2 residents were assigned the patient role, and PGY-3 residents were assigned the facilitator role. (For 4-year EM programs, PGY-4 residents could serve in the facilitator role. Faculty could also facilitate or be assigned to any other role.) Time was allotted at the end for final thoughts from the group.

Quarter 4 ended in June with a 30-minute time for reflection and stories from the previous 2 months regarding quality of care. This was also the final meeting of the curriculum, and the time was used to discuss practice changes or attitude changes among residents regarding the patient experience from the beginning of the curriculum up to this point.

Prior to the end of the academic year (at the beginning of Quarter 4), we administered the postintervention survey to the residents. See Appendix M for the postintervention survey. The postintervention survey reflected many of the same attitudes and behaviors as the preintervention survey. We sent this survey to the same individuals who received the preintervention survey: our EM residents, excluding faculty and students. Though the questions for the postintervention survey differed somewhat from the preintervention survey, this was only because we were asked to address certain questions by an institutional task force working on this issue. We collected the survey data and calculated the mean responses to the survey items on a 5-point Likert scale (1 = completely disagree, 5 = completely agree), as well as determining the 95% CI.

Our intent was to generate a better understanding of our efforts and assess behavioral changes in residents’ approach to patient satisfaction and the patient experience. This was a quality improvement endeavor rather than primarily a research endeavor. We were not able to evaluate individual resident-specific satisfaction data or patient survey results for the reasons mentioned in the Introduction.


Results

The pre- and postintervention surveys assessed residents’ attitudes and behaviors involving four areas of the patient experience that were highlighted in the curriculum: courtesy and respect, communication, therapeutic interventions, and quality care. Survey questions were selected to focus on these key areas of the patient experience. Twenty-four of 44 invited participants returned completed preintervention surveys, for a response rate of 54.5%; 20 of 44 returned completed postintervention surveys, for a response rate of 45.5%. As noted, the surveys themselves can be found in Appendices C and M, and the overall results are listed in Table 1 and Table 2. As an example, on the postintervention survey, residents were asked to provide their level of agreement on a 5-point scale (1 = completely disagree, 5 = completely agree) with the following statement: “I am more knowledgeable about how to improve patient satisfaction in the ED.” The mean response to this item was 4.1, with a 95% CI of 3.7-4.4.

Table 1. Preintervention Survey Item Mean Responses With 95% CIs
Survey Item: M (95% CI)a
I sit at the bedside. 3.4 (3.0-3.7)
I discuss medication side effects. 2.6 (2.4-2.9)
I discuss wait time expectations. 3.2 (2.8-3.5)
I thank my patients. 2.5 (2.1-2.9)
I acknowledge all in room. 4.0 (3.7-4.4)
I give anticipated disposition. 3.5 (3.2-3.8)
Ninety seconds to tell story. 3.6 (3.2-4.0)
I explain exam findings. 3.0 (2.6-3.3)
I convey concern for patient comfort. 3.3 (3.1-3.6)
I convey concern for patient safety. 2.9 (2.6-3.2)
I give regular, timely updates. 3.3 (3.0-3.5)
Abbreviation: CI, confidence interval.
aOn a 5-point Likert scale (1 = never, 5 = always).
Table 2. Postintervention Survey Item Mean Responses With 95% CIs
Survey Item: M (95% CI)a
I ask about medication allergies. 3.7 (3.3-4.2)
My appearance affects patient satisfaction. 4.1 (3.8-4.4)
My attitude toward patient satisfaction has changed. 3.4 (2.9-3.8)
I assess patient expectations. 3.8 (3.5-4.0)
I discuss non-narcotic alternatives for pain. 4.3 (4.0-4.7)
I seek patient understanding of discharge instructions. 3.1 (2.7-3.5)
I speak highly of the medical team. 4.1 (3.8-4.4)
I communicate quality of care. 3.3 (2.8-3.7)
I am more knowledgeable about how to improve patient satisfaction in the ED. 4.1 (3.7-4.4)
I sit at the bedside. 3.9 (3.4-4.3)
I discuss medication side effects. 3.2 (2.7-3.6)
I discuss wait time expectations. 3.7 (3.3-4.1)
I thank my patients. 3.4 (2.8-3.9)
I acknowledge all in room. 4.4 (4.1-4.7)
I give anticipated disposition. 4.1 (3.8-4.4)
Ninety seconds to tell story. 3.8 (3.3-4.3)
I explain exam findings. 3.6 (3.1-4.1)
I convey concern for patient comfort. 3.3 (2.8-3.7)
I convey concern for patient safety. 3.3 (2.8-3.7)
I give regular, timely updates. 3.8 (3.5-4.1)
Abbreviations: CI, confidence interval; ED, emergency department.
aOn a 5-point Likert scale (1 = completely disagree, 5 = completely agree).


Discussion

Our endeavor to improve the patient experience via improved provider interactions was relatively straightforward. We began with four topics: conveying courtesy and respect toward patients and their loved ones, education on therapeutic interventions, education on discharge instructions, and discussion of quality of care. Through our monthly sessions, we were able to introduce each topic, work through case-based scenarios in small-group settings, and engage in feedback and storytelling about applying the tools and concepts learned in the curriculum to daily patient care. Resident feedback suggested that our final discussion sessions were the most useful for sharing stories and discussing different methods of applying the materials.

The differences in the pre- and postintervention surveys limited our ability to compare (please see the Limitations and Reflections section below for further discussion of this). Within those limitations, the postintervention data show that residents provided generally positive responses for many of the most highly targeted behaviors, including sitting at the bedside, speaking highly of the medical team, providing an anticipated disposition, acknowledging all in the room, and discussing non-narcotic alternatives for pain.

This curriculum was primarily led by our residents, although it involved several of our faculty as well. Residents reported a general improvement in their overall fund of knowledge regarding improving patient satisfaction after the implementation of the curriculum. Our postimplementation experience is that frequent, real-time reinforcement is required to maintain a patient experience skill set. This is consistent with prior studies showing that refresher training after a structured curriculum improves patient satisfaction performance.5 Implementation barriers included clinical duties that prevented full participation in resident conferences. Anecdotally, retention and application of the curriculum tools were best among the more senior residents, who had longer exposure to the curriculum. Resident engagement was also improved by recent institutional and legislative efforts to curb opiate prescribing, as residents reported heavy reliance on the curriculum tools when discussing nonopiate alternatives for pain control. Moreover, summative and formative tools to assess resident performance are limited.

Going forward, we hope to improve on the curriculum by better capturing individual resident patient satisfaction data to improve summative feedback. Because patient interactions with nonphysician care providers are major determinants of the overall patient experience, we plan to include participation from key stakeholders such as nursing, technician, and administrative staff. Finally, future iterations of our curriculum may serve as a model for other training programs at our institution outside EM.

Limitations and Reflections
This was a single-institution intervention, and results elsewhere may vary based on regional and institutional practice idiosyncrasies. The curriculum has not been externally validated at another institution. The survey results give only our residents’ own impressions of their behavior and attitudes.

Our survey data were limited to our program’s residents and did not include our faculty; the data were also limited to those residents submitting a completed survey. The low response rate for each survey does raise concern about sampling bias, and it is possible that more motivated learners are overrepresented in each sample. The data were not paired, and thus, we have not been able to make individual comparisons from before and after the intervention. We did not stratify by resident PGY status during data analysis to determine variability of intervention effectiveness. We do believe that patient-centered data are important to determining the overall effectiveness of this educational initiative, and there will be opportunities in the future to collect such data. Preintervention and postintervention survey formats and question items differed, limiting our ability to interpret the data. The reason for this difference was that during the delivery of the curriculum, our institution implemented an effort to improve patient satisfaction in the emergency department; as a result, different domains of response were desired. For further implementation, using a single survey design with matching pre- and postintervention surveys would allow for more useful comparisons. We were also limited by not having access to resident-level patient satisfaction data. Though we could have compared before-and-after department-level patient satisfaction data, there were simply too many confounders at the time, including other interventions and changes to staffing models, for us to be able to attribute any change in patient satisfaction to our educational intervention. We also did not test knowledge acquisition by the participants, as our main goal was to effect change in behaviors and attitudes.

Regarding delivery of content, speakers for the introductory sessions for each quarter need to become familiar with the PowerPoint slides and the information to be presented. This may require some additional review of the topics to become comfortable teaching the material. It can be difficult to have all residents and faculty present for the curriculum sessions due to the nature of residency scheduling. However, other modalities for delivering content (e.g., posting the introduction sessions online for asynchronous learning by all providers) are options that may be explored in the future. We do feel strongly that the discussion sessions at the end of each quarter are incredibly valuable for sharing thoughts, ideas, and challenges in improving the patient experience in the emergency department.

Author Information

  • Jonah Gunalda, MD: Assistant Professor, Department of Emergency Medicine, University of Mississippi School of Medicine
  • Kathleen Hosmer, MD: Assistant Professor, Department of Emergency Medicine, Wake Forest Baptist Medical Center
  • Nicholas Hartman, MD: Assistant Professor, Department of Emergency Medicine, Wake Forest Baptist Medical Center
  • Lane Smith, MD, PhD: Assistant Professor, Department of Emergency Medicine, Wake Forest Baptist Medical Center
  • Bradley Chapman, MD: Resident, Department of Emergency Medicine, Wake Forest Baptist Medical Center
  • Warren Jones, DO: Resident, Department of Emergency Medicine, Wake Forest Baptist Medical Center
  • Michael Irick, MD: Resident, Department of Emergency Medicine, Wake Forest Baptist Medical Center
  • Manoj Pariyadath, MD: Assistant Professor, Department of Emergency Medicine, Wake Forest Baptist Medical Center

None to report.

None to report.

Ethical Approval
Reported as not applicable.


  1. Farley H, Enguidanos ER, Coletti CM, et al. Patient satisfaction surveys and quality of care: an information paper. Ann Emerg Med. 2014;64(4):351-357. https://doi.org/10.1016/j.annemergmed.2014.02.021
  2. Aiken LH, Sermeus W, Van den Heede K, et al. Patient safety, satisfaction, and quality of hospital care: cross sectional surveys of nurses and patients in 12 countries in Europe and the United States. BMJ. 2012;344:e1717. https://doi.org/10.1136/bmj.e1717
  3. Fullam F, Garman AN, Johnson TJ, Hedberg EC. The use of patient satisfaction surveys and alternative coding procedures to predict malpractice risk. Med Care. 2009;47(5):553-559. https://doi.org/10.1097/MLR.0b013e3181923fd7
  4. Manary MP, Boulding W, Staelin R, Glickman SW. The patient experience and health outcomes. N Engl J Med. 2013;368(3):201-203. https://doi.org/10.1056/NEJMp1211775
  5. Mayer TA, Cates RJ, Mastorovich MJ, Royalty DL. Emergency department patient satisfaction: customer service training improves patient satisfaction and ratings in physician and nurse skill. J Healthc Manag. 1998;43(5):427-441. https://doi.org/10.1097/00115514-199809000-00009
  6. Lau FL. Can communication skills workshops for emergency department doctors improve patient satisfaction? J Accid Emerg Med. 2000;17(4):251-253. https://doi.org/10.1136/emj.17.4.251
  7. London KS, Druck J, Silver M, Finefrock D. Teaching the emergency department patient experience: needs assessment from the CORD-EM task force. West J Emerg Med. 2017;18(1):56-59. https://doi.org/10.5811/westjem.2016.9.30667
  8. Niedermier J. Understanding patient experience: a course for residents. MedEdPORTAL. 2017;13:10558. https://doi.org/10.15766/mep_2374-8265.10558


Gunalda J, Hosmer K, Hartman N, et al. Satisfaction Academy: a novel residency curriculum to improve the patient experience in the emergency department. MedEdPORTAL. 2018;14:10737. https://doi.org/10.15766/mep_2374-8265.10737

Received: February 8, 2018

Accepted: July 13, 2018