Original Publication
Open Access

Learning to Beat the Shock Clock: A Low-Fidelity Simulation Board Game for Pediatric and Emergency Medicine Residents

Published: February 11, 2019 | 10.15766/mep_2374-8265.10804


  • Beat the Shock Clock! Game.docx
  • Vital Signs Cards.pptx
  • Stanley Labs.pptx
  • Sarah Labs.pptx
  • Labels.docx
  • Stanley (Cold Shock) Case.docx
  • Sarah (Warm Shock) Case.docx
  • Stanley (Cold Shock) Checklist.docx
  • Sarah (Warm Shock) Checklist.docx
  • Beat the Shock Clock! Quiz.docx
  • Stanley Debrief.doc
  • Sarah Debrief.doc
  • Resource Guide.docx

All appendices are peer reviewed as integral parts of the Original Publication.

To view all publication components, extract (i.e., unzip) them from the downloaded .zip file.


Introduction: Resident physicians may have difficulty identifying and managing pediatric septic shock due to limited patient encounters. Simulation-based interventions can enhance competency. We developed a low-fidelity tabletop simulation game to teach pediatric septic shock and compared residents’ knowledge of and comfort with recognition and management of septic shock before and after the session. Methods: Pediatric and emergency medicine residents participated in an education session involving a low-fidelity, tabletop simulation in which they managed two simulated pediatric patients with septic shock. The two patients were a 12-year-old healthy male with cold shock due to a urinary tract infection and a 5-year-old female with a history of leukemia who developed warm shock due to pneumonia. Because this session was presented as a board game rather than high-fidelity simulation, learners focused on decision making rather than the mechanics of procedures. Residents completed a survey and a knowledge-based test before and after this session. Results: Twenty-three pediatric and nine emergency medicine residents participated. Correct responses for the preintervention test were 71%, compared with 83% postintervention. The improvement was 12 percentage points (95% confidence interval for the pre − post difference, −0.17 to −0.07; p < .0001). Residents rated this modality as being more useful than lectures or reading and as equivalent to bedside teaching and high-fidelity simulation. Discussion: Our pilot low-fidelity simulation improved resident knowledge and comfort with pediatric septic shock care. Further studies are needed to address the impact of low-fidelity simulations on patient outcomes.

Educational Objectives

By the end of this activity, learners will be able to:

  1. Identify pediatric septic shock.
  2. Distinguish between warm and cold septic shock and describe the management considerations specific to each.
  3. Interpret vital signs and lab values as they relate to pediatric septic shock.
  4. Manage simulated cases of pediatric septic shock.


Pediatric severe sepsis and septic shock constitute a complex, dynamic disease process with high morbidity and mortality.1 The condition is relatively uncommon in the United States, with an estimated prevalence of 0.56-0.89 cases per 1,000 children and a mortality rate of 8.9%.2 Pediatric septic shock may progress quickly, and rapid recognition and intervention are critical to effective treatment.1 Because of limited patient encounters, many resident physicians may have difficulty recognizing and managing pediatric severe sepsis and shock. Consequently, academic institutions are challenged to teach correct diagnosis and medical management of this relatively rare condition effectively.

Simulation is a popular, well-accepted training modality for teaching residents.3 Although there are various definitions, simulation can be broadly defined as “any educational activity that utilizes simulation aides to replicate clinical scenarios.”4,5 Simulation-based teaching sessions can serve as effective training methods to enhance medical knowledge, improve competency, and generate diagnostic confidence.4,5 Simulation is an ideal adjunct for resident physicians to learn management of low-frequency but high-intensity conditions that they might not sufficiently encounter during training. Thus, simulation can be an ideal teaching method to improve residents’ ability to identify and competently manage pediatric severe sepsis and shock.

Although high-fidelity simulation is the most common type of simulation used in medical education, low-fidelity simulation can also be highly effective. It is less lifelike but requires significantly less investment of money and time. Low-fidelity scenarios are less detailed and use simple mannequins or prompts, allowing greater focus on medical management and cognitive decision making rather than procedural practice. Low-fidelity simulation can be performed in a variety of physical settings with only a single academic facilitator, yet it provides more structure and demands more action from the learner than simple discussion of cases in a small-group setting. The low-fidelity method may therefore be more practical in many situations for physicians in training. One such approach is tabletop simulation, in which cases are presented in a board-game format that emphasizes decision making and thought process over procedures; prior studies have shown this to be effective for teaching the management of rare events.6

In devising this tabletop simulation game, our goal was to create a learning resource that was highly interactive and focused on critical decisions in pediatric septic shock. Additionally, given the expense and time requirements of high-fidelity simulation, we wanted to create a low-cost resource that could be used multiple times with different groups of learners. In reviewing available resources, we found several high-quality simulation scenarios focusing on pediatric septic shock.7-9 These resources are targeted at high-fidelity simulation, however, and would require updates to the case flow to be adapted to a low-fidelity simulation scenario. Here, we provide all information and resources needed to create low-cost tabletop simulation games that can be used many times and do not require access to a well-equipped simulation center. This represents a novel approach to teaching this important topic that is both interactive and effective and requires fewer resources than traditional simulation.

Although the use of tabletop simulation in teaching pediatric septic shock is novel, this type of educational session has been used effectively in the past. At our institution, low-fidelity simulation was used effectively for teaching management of pediatric patients in an earthquake disaster scenario.6 That simulation also represented an uncommon but high-stakes scenario with a focus on decision making. This positive prior experience was also important in our decision to create a tabletop game focusing on the management of pediatric septic shock.

As this tabletop simulation represents a novel approach to teaching pediatric septic shock, we were interested in the effectiveness of the teaching method, as well as resident perception of low-fidelity simulation. Our study objectives, therefore, were to measure pediatric and emergency medicine residents’ knowledge of and comfort with recognition and management of septic shock before and after two novel low-fidelity simulation sessions, as well as residents’ perceptions of this educational session.


The simulation included interactive group management of two patient cases, one cold shock case and one warm shock case. The primary learning objectives for each case were differentiating compensated from decompensated shock, understanding the key features of warm and cold shock, and appropriately applying fluid resuscitation, antibiotic therapy, and vasopressor support. The simulation exercise was designed as a two-dimensional board game, similar to a felt-sticker board game. Our game included a picture of a patient placed centrally on the board, with spaces alongside the patient where the simulation team could place labels for the tests and interventions they chose. The side panels of the game held a word bank of possible labels for choices of laboratory studies, imaging, procedures, intravenous fluids, antibiotics, and vasopressors. An example of the game board during a simulation session can be seen in Appendix A.

Prior to the start of the simulation, the residents were oriented to the simulation game. Participants were asked to work collaboratively in making management decisions as opposed to assigning care team roles to each individual. During the simulation, participants were able to select interventions, such as starting an IV, administering a fluid bolus, or administering oxygen, among other options. As a result of these interventions, or of a failure to perform critical interventions, participants would see a change in patient status and vital signs. For example, in one patient, administration of a fluid bolus resulted in a decrease in heart rate and an increase in blood pressure. Detailed directions for how to create and play this board game can be found in the Beat the Shock Clock! Game document (Appendix A).

Our board game used the following items:

  • Two miniature three-part corrugated display boards (22 inches × 14 inches).
  • Hook and loop (Velcro) strips and dots.
  • Copy paper in two to three colors.
  • Laminator (or lamination can be done at a commercial copy/print center).
  • Construction or backing paper to decorate board (optional).
  • Card files (vital signs: Appendix B; Stanley labs: Appendix C; Sarah labs: Appendix D).
  • Labels (Appendix E).

Directions for assembly of the game board are available in Appendix A.

This tabletop simulation can be run with a single facilitator. In implementing the game, we typically had a facilitator and a faculty observer to assist as needed and ensure consistent teaching across facilitators.

We developed two simulation scenarios that presented a case of cold shock (Stanley: Appendix F) and a case of warm shock (Sarah: Appendix G). The simulations were first performed with four groups of five medical students (N = 20) from an emergency medicine interest group. After these pilot sessions, medical students were immediately asked for feedback on clarity and ease of understanding of the scenarios. In the actual study, group sizes were limited to fewer than five residents. Residents were split into small groups during usual conference time (emergency medicine) or by signing up for afternoon sessions shortly after usual noon conference (pediatrics). The order of the two scenarios was alternated between groups. Each session was facilitated by one of the three resident physician investigators and was additionally observed by one of two Pediatric Emergency Department (PED) attending faculty members. The three resident investigators and two PED faculty members attended a study orientation training session and then the pilot sessions prior to participating with the actual study groups.

We used checklists specific to each case to assist in evaluation of the management of the case (Stanley: Appendix H and Sarah: Appendix I). We assessed resident knowledge of key steps for delivery of early goal-directed therapy for pediatric severe sepsis and shock using a knowledge-based test administered before and after the simulations. Test questions were adapted from previously published board review textbooks and websites.10,11 A panel of pediatric intensive care unit (PICU) and PED faculty then pilot-tested, refined, and approved these questions for use. The test itself consisted of 11 vignette-based multiple-choice questions. The questions highlighted the key steps for delivery of early goal-directed therapy for pediatric septic shock according to American College of Critical Care Medicine (ACCM) guidelines.1 Pre- and posttest questions were identical, and participants were blinded to their pretest scores. The questions are available in the Beat the Shock Clock! Quiz document (Appendix J).

We held debriefing sessions immediately following the simulations. Debriefings were facilitated by one of the resident physician investigators and observed by one of the PED faculty facilitators. The debriefing sessions included a scripted lesson on septic shock and reviewed the guidelines and recommendations for the management of pediatric septic shock presented by the ACCM.1 The scripted portion of the debriefing reemphasized the learning objectives for each case. After each case, its checklist was also reviewed, with particular attention paid to critical actions. In addition to the scripted component, there was also time allowed for feedback from the observers regarding participants’ management of the simulated patients and for questions from the learners. Structured debriefings are available for the case of cold shock (Stanley) in Appendix K and for the case of warm shock (Sarah) in Appendix L. A resource guide (Appendix M) is provided for learners who wish to learn more about management of pediatric septic shock.


We administered the pre- and postintervention knowledge-based tests and surveys and stored results via the REDCap online survey tool (Nashville, TN). Test and survey responses were linked to individual participants, but investigators were blinded to the results and responses for individual participants. Through REDCap, we emailed requests to complete the preintervention test and survey at 2 weeks and 1 week prior to the simulation sessions. Requests to complete the postintervention test and survey were made at similar intervals after the simulation session. All simulation sessions took place during a 2-week period. There were no other resident education interventions pertaining to septic shock during this time frame or the month afterward.

Sixty-nine residents were invited to participate (see the Figure). Forty-four residents attended the low-fidelity teaching session, and 32 (73%) completed the postteaching survey and test. This included 23 pediatric and nine emergency medicine residents. The majority of residents had managed two or fewer pediatric shock patients, had completed one or fewer pediatric intensive care rotations, and had completed two or fewer pediatric emergency medicine rotations (Table 1).

Figure. Participant inclusion flow diagram.

Table 1. Pediatric Shock and Critical Care Experience of Residents (rows: patients treated for shock, pediatric intensive care rotations, and pediatric emergency medicine rotations).

On the preteaching survey, of the seven survey questions regarding resident comfort level with recognizing and managing shock, two questions had a mean score of 3.0 or higher (comfortable), and five had a mean score of less than 3.0 (uncomfortable). For the postteaching survey, all seven questions had a score greater than 3.0. The mean score for the seven questions combined was 2.9 preteaching versus 3.7 postteaching (difference of the means = 0.8; 95% confidence interval [CI], 0.6-0.9; p < .001; see Table 2).

Table 2. Resident Comfort Level Before and After Intervention (Preintervention M [SD]ᵃ vs. Postintervention M [SD]ᵃ)
  • How comfortable are you with identifying the early stage of septic shock (compensated) in a pediatric patient?
  • How comfortable are you with identifying the late stages of septic shock (decompensated) in a pediatric patient?
  • How comfortable are you with identifying warm shock vs. cold shock in a pediatric patient?
  • How comfortable are you with identifying/recognizing septic shock in a pediatric patient based on physical exam and vital signs?
  • How comfortable are you with the early management of pediatric patients in septic shock, including fluid resuscitation and antibiotic administration?
  • How comfortable are you with initiating inotrope/vasopressor support for pediatric patients in warm shock?
  • How comfortable are you with initiating inotrope/vasopressor support for pediatric patients in cold shock?
ᵃResponses are based on a 5-point Likert scale (1 = not at all comfortable, 5 = extremely comfortable).

The overall rate of correct answers for the pretest questions was 71%, compared with a posttest rate of 83%. A paired t test was used to compare scores before and after the intervention. The mean improvement was 12 percentage points (95% CI for the pre − post difference, −0.17 to −0.07; p < .0001). Although the Kirkpatrick level of evaluation for this intervention is low, the result does demonstrate improvement in knowledge following the intervention.
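
For readers less familiar with this analysis, the paired comparison can be illustrated with a short computational sketch. The individual resident scores were not published, so the paired values below are invented purely to show the mechanics of a paired t test and its 95% confidence interval; they are not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical paired proportions correct on an 11-item quiz for 8 learners.
# These values are illustrative only, NOT the study's actual data.
pre = np.array([0.64, 0.73, 0.55, 0.82, 0.73, 0.64, 0.73, 0.82])
post = np.array([0.82, 0.82, 0.73, 0.91, 0.82, 0.73, 0.91, 0.91])

# Differences are taken as pre - post, matching the direction of the
# paper's reported CI (a negative interval indicates improvement).
diff = pre - post

# Paired t test: tests whether the mean within-subject difference is zero.
t_stat, p_value = stats.ttest_rel(pre, post)

# 95% confidence interval for the mean paired difference,
# using the t distribution with n - 1 degrees of freedom.
n = len(diff)
se = diff.std(ddof=1) / np.sqrt(n)
ci = stats.t.interval(0.95, df=n - 1, loc=diff.mean(), scale=se)

print(f"mean difference (pre - post): {diff.mean():.3f}")
print(f"95% CI: ({ci[0]:.3f}, {ci[1]:.3f}); p = {p_value:.5f}")
```

Note that a CI lying entirely below zero for pre − post is the same finding, stated in the opposite direction, as a positive gain from pretest to posttest.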

The 12 residents who did not complete the postteaching survey or test questions had mean comfort scores not significantly different from the 32 residents who completed the pre- and postteaching questions (2.8 vs. 2.9, respectively; mean difference = 0.1; 95% CI, −0.09 to 0.30; p = .2).

We also compared participants’ perception of the teaching effectiveness of the low-fidelity method used in this study to that of didactic, reading, bedside, online learning, and high-fidelity methods. The majority of residents rated the low-fidelity option as being more effective than didactic, reading, and online methods. The majority rated low-fidelity simulation as being equal to bedside teaching and high-fidelity simulation (Table 3).

Table 3. Low-Fidelity Teaching Compared to Other Methods, No. (%)

Teaching Method             More Effective   Equally Effective   Less Effective
Didactic (lecture)          27 (84%)         5 (16%)             0 (0%)
Readingᵃ                    24 (80%)         6 (20%)             0 (0%)
Bedside teaching            7 (22%)          18 (56%)            7 (22%)
Online learningᵇ            22 (71%)         9 (29%)             0 (0%)
High-fidelity simulation    5 (16%)          22 (69%)            5 (16%)

Columns indicate whether residents rated low-fidelity simulation as more, equally, or less effective than the listed method.
ᵃNo response from two participants.
ᵇNo response from one participant.


In developing the Beat the Shock Clock! game, we hoped to create an educational resource that had many of the benefits of traditional simulation while requiring significantly fewer resources. Based on the results of the pre- and postintervention quizzes and resident surveys, our low-fidelity simulation was effective in achieving this goal.

Pediatric septic shock is an uncommon but critical illness, and in our experience, residents generally do not encounter this disease process frequently during their residency. Resident responses to the preintervention survey confirmed that many of them had only limited exposure to pediatric patients in septic shock. It is not surprising, therefore, that they also reported low levels of confidence in the management of this disease. For this reason, it is critical that residents receive education in this area beyond what they experience in the PED and PICU. In our comparison of pre- and postintervention scores on the knowledge-based test, participants demonstrated significant improvement following the intervention. In addition, residents reported increased confidence in their ability to manage pediatric septic shock and rated this teaching method as effective when compared to traditional methods of learning in residency.

In addition to improving resident knowledge, this low-fidelity simulation was interactive and fun for the learners. The style of our simulation was unique in that it looked like a board game, and resident participants were asked to treat the sessions as such and to work collaboratively as a team in making decisions. This format seemed to focus attention on decision making and core content. Based on the predominantly favorable survey reviews of this teaching, we believe we created a comfortable learning environment. Such simplicity has been cited in the past as a potential advantage of low-fidelity simulation, and that has been our experience in creating and implementing this resource as well.4,5,12

Our study has several important limitations. First, there was a high dropout rate among participants. Although the baseline characteristics of the residents who failed to complete the posttest were similar to those who did complete the study, it is possible that the former group did not have similar levels of improvement in their knowledge or confidence. With regard to the survey questions and knowledge test, the questions were developed by pediatric emergency medicine faculty with content expertise and pilot-tested with pediatric emergency medicine and PICU faculty. A formal assessment of survey reliability and validity was not performed, however. Additionally, only a single posttest was given following the intervention, so we were unable to assess knowledge retention following this intervention. Finally, we were unable to assess impact on clinical practice due to the low frequency of pediatric septic shock presentation and attending physician impact on clinical management of these patients in the clinical setting.

In this study, we looked specifically at the use of a low-fidelity simulation game in teaching the management of pediatric septic shock to residents in pediatrics and emergency medicine. We hope to run the cases and debriefing for residents in the future with minimal change, although the cases may be updated as guidelines regarding the management of septic shock continue to be refined. In the future, we believe this model could be expanded to other groups of learners and other disease processes. The cases can be easily modified to accommodate learners of different levels without significantly changing the game board or requiring additional resources. For example, in the warm shock case, the patient requires intubation. We expect residents in emergency medicine to articulate what medications and doses should be used. A medical student on a clinical rotation, in contrast, might be expected only to know that intubation is required. Additionally, the basic structure of the game board is very versatile and could be used in the future as a setting for a variety of cases that represent different disease processes.

In conclusion, this pilot low-fidelity tabletop simulation resulted in a modest increase in overall knowledge of pediatric septic shock diagnosis and care. Residents reported significant improvement in their comfort with diagnosis and management of pediatric septic shock. They also rated the tabletop method of teaching as being as effective as bedside teaching and high-fidelity simulation and more effective than lectures or reading. This degree of resident acceptance strongly argues for the incorporation of low-fidelity simulations into educational curricula. Further investigation is needed to determine the most applicable setting and frequency for use of low-fidelity simulation.

Author Information

  • E. Page Bridges, MD: Assistant Professor, Department of Emergency Medicine, University of South Carolina School of Medicine Greenville; Assistant Clerkship Director, Department of Emergency Medicine, University of South Carolina School of Medicine Greenville
  • Catherine E. Foster, MD: Assistant Professor, Department of Pediatrics, Section of Infectious Diseases, Baylor College of Medicine; Assistant Professor, Department of Pediatrics, Section of Infectious Diseases, Texas Children’s Hospital
  • Dan B. Park, MD: Assistant Professor, Department of Pediatrics, Division of Pediatric Emergency Medicine, University of North Carolina Children’s Hospital; Director of Pediatric Emergency Ultrasound, Department of Pediatrics, Division of Pediatric Emergency Medicine, University of North Carolina Children’s Hospital; Associate Medical Director for Pediatric Emergency Medicine, Department of Pediatrics, Division of Pediatric Emergency Medicine, University of North Carolina Children’s Hospital
  • Kathy L. Lehman-Huskamp, MD: Associate Professor, Department of Pediatrics, Medical University of South Carolina College of Medicine; Director of Emergency Management, Department of Pediatrics, Division of Pediatric Emergency Medicine, Medical University of South Carolina College of Medicine
  • Dan W. Mark, MD: Assistant Professor, Department of Pediatrics, University of South Dakota, Sanford School of Medicine
  • Rachel E. Tuuri, MD: Associate Professor, University of New Mexico School of Medicine; Clinical Director, Pediatric Emergency Department, University of New Mexico School of Medicine; Division Chief, Pediatric Emergency Medicine, Department of Emergency Medicine, University of New Mexico School of Medicine

Acknowledgments
We acknowledge Dr. Joseph D. Losek for assistance in data analysis and manuscript preparation.

Disclosures
None to report.

Funding/Support
None to report.

Prior Presentations
Foster CE, Mark DW, Park DB, Bridges EP, Lehman-Huskamp KL, Tuuri RE. Learning to beat the shock clock: a low-fidelity simulation intervention for pediatric and emergency medicine residents. Poster presented at: Section on Emergency Medicine, 2014 American Academy of Pediatrics National Conference and Exhibition; October 11-14, 2014; San Diego, CA.

Ethical Approval
The Institutional Review Board at the Medical University of South Carolina approved this study.


  1. Brierley J, Carcillo JA, Choong K, et al. Clinical practice parameters for hemodynamic support of pediatric and neonatal septic shock: 2007 update from the American College of Critical Care Medicine. Crit Care Med. 2009;37(2):666-688. https://doi.org/10.1097/CCM.0b013e31819323c6
  2. Hartman ME, Linde-Zwirble WT, Angus DC, Watson RS. Trends in the epidemiology of pediatric severe sepsis. Pediatr Crit Care Med. 2013;14(7):686-693. https://doi.org/10.1097/PCC.0b013e3182917fad
  3. Perkins GD. Simulation in resuscitation training. Resuscitation. 2007;73(2):202-211. https://doi.org/10.1016/j.resuscitation.2007.01.005
  4. Al-Elq AH. Simulation-based medical teaching and learning. J Family Community Med. 2010;17(1):35-40. https://doi.org/10.4103/1319-1683.68787
  5. Lopreiato JO, Sawyer T. Simulation-based medical education in pediatrics. Acad Pediatr. 2015;15(2):134-142. https://doi.org/10.1016/j.acap.2014.10.010
  6. Whitney RE, Burke RV, Lehman-Huskamp K, Arora G, Park DB, Cicero MX. On shaky ground: learner response and confidence after tabletop earthquake simulation. Pediatr Emerg Care. 2016;32(8):520-524. https://doi.org/10.1097/PEC.0000000000000681
  7. Adler M, Trainor J, Siddall V, McGaghie W. Pediatric acute care simulator cases - shock (sepsis). MedEdPORTAL. 2008;4:821. https://doi.org/10.15766/mep_2374-8265.821
  8. Heitz C. Pediatric medical resuscitation: a simulation curriculum. MedEdPORTAL. 2010;6:7829. https://doi.org/10.15766/mep_2374-8265.7829
  9. Sagalowsky S, Boyle T, Winn A, et al. Four core cases: a simulation curriculum for pediatrics residents. MedEdPORTAL. 2014;10:9943. https://doi.org/10.15766/mep_2374-8265.9943
  10. The Pediatrics Review and Education Program (PREP). Pediatr Rev. 1990;11(7):195-196. https://doi.org/10.1542/pir.11-7-195-a
  11. Wang VJ, Flood RG, Sharma S, eds. Pediatric Emergency Medicine Question Review Book 2013. 2nd ed. Raleigh, NC: PEMQBook; 2012.
  12. Maran NJ, Glavin RJ. Low- to high-fidelity simulation—a continuum of medical education? Med Educ. 2003;37(suppl 1):22-28. https://doi.org/10.1046/j.1365-2923.37.s1.9.x


Bridges EP, Foster CE, Park DB, Lehman-Huskamp KL, Mark DW, Tuuri RE. Learning to beat the shock clock: a low-fidelity simulation board game for pediatric and emergency medicine residents. MedEdPORTAL. 2019;15:10804. https://doi.org/10.15766/mep_2374-8265.10804

Received: August 15, 2018

Accepted: December 25, 2018