Original Publication
Open Access

A Global Rating Scale and Checklist Instrument for Pediatric Laceration Repair

Published: February 27, 2019 | 10.15766/mep_2374-8265.10806


  • Laceration Checklist.docx
  • Global Rating Scale.doc
  • Laceration Supplies Checklist.docx
  • GRS Evaluation Form.docx
  • Clinical Scenario.docx
  • Laceration Repair GRS Training Video.mp4

All appendices are peer reviewed as integral parts of the Original Publication.

To view all publication components, extract (i.e., unzip) them from the downloaded .zip file.


Introduction: Laceration repair is a core procedural skill in which pediatric residents are expected to attain proficiency per the Accreditation Council for Graduate Medical Education. Restricted trainee work hours have decreased clinical opportunities for laceration repair, and simulation may be a modality to fill that clinical gap. There is therefore a need for objective measures of pediatric resident competence in laceration repair. Methods: We created a global rating scale and checklist to assess laceration repair in the pediatric emergency department. We adapted the global rating scale from the Objective Structured Assessment of Technical Skills tool used to evaluate surgical residents’ technical skills and adapted the checklist from a mastery training checklist related to infant lumbar puncture. We tested both tools in the pediatric emergency department. Eight supervising physicians used the tools to evaluate 30 residents’ technical skills in laceration repair. We performed validation testing of both tools in the simulation environment. Based on formal evaluation, we developed a video to train future evaluators on the use of the global rating scale. Results: The global rating scale and checklist showed fair concordance across reviewers. Both tools received positive feedback from supervising physicians who used them. Discussion: We found that the global rating scale and checklist are more applicable to formative, rather than summative, training for resident laceration repair. We recommend using these educational tools with trainees in the simulation environment prior to trainees performing laceration repairs on actual patients.

Educational Objectives

By using these tools, facilitators will be able to evaluate learners in the following areas:

  1. Preparing for a laceration repair procedure, including assembling necessary and appropriate equipment and properly positioning patients.
  2. Executing the laceration repair procedure completely and efficiently, demonstrating knowledge of each step in the procedure.
  3. Exhibiting technical proficiency in the laceration repair procedure.


The Accreditation Council for Graduate Medical Education has developed milestones for trainees that include procedural or technical skills under the core competency of patient care.1 The Pediatric Resident Review Committee recommends that laceration repair is a procedure pediatric residents should receive training in and that progress in competency should be monitored by pediatric residency programs.2 In a survey of pediatric residency graduates from a large primary and tertiary care teaching hospital, 92% of graduates responded that they were “adequately trained” to perform suturing for laceration repair. However, 39% of respondents reported they had performed less than one laceration repair with suturing, and 27% reported they had never performed a laceration repair.3 Additionally, procedural experience, the traditional evidence of procedural competence provided through procedure logs, has not been shown to ensure competence.4 Thus, there is an increasing need for residency programs to objectively evaluate residents’ technical skills in laceration repair.

This mandate to evaluate procedural competency arrives in a climate of decreasing opportunities for trainees to practice skills in the clinical setting. Work hour restrictions and changes in medicine have greatly decreased learners’ opportunities to learn by doing.5 Surveyed pediatric residency directors believe that many residents fail to achieve competence in nine of 13 “very important” procedures, with 36% of directors stating that not all their residents achieve competence in laceration repair.6

This has led to the development of alternative modalities for procedural education, such as simulation training and just-in-time procedural practice. To this end, we developed a novel global rating scale and checklist for laceration repair. The global rating scale is based on the Objective Structured Assessment of Technical Skills (OSATS) tool used to evaluate surgical residents’ technical skills. Modified OSATS tools have been designed and validated to evaluate trainee performance in neonatal lumbar punctures and obstetrical procedures.7-9 The modified OSATS tool evaluating competency at neonatal lumbar puncture was validated for “content, response process, and interrater reliability.”7 As residents progressed in their training, the OSATS tool reflected their increased competence in lumbar puncture. Similarly, our global rating scale may be useful for formative assessment of pediatric residents’ technical skills in laceration repair, particularly for just-in-time training.

The training checklist is used to determine whether the trainee properly prepares for and performs laceration repair by checking off whether or not the trainee independently and correctly performs each specified task. The global rating scale, on the other hand, allows the observer to evaluate how well, on a scale from 1 to 5, the trainee performs various steps in laceration repair. While some researchers have concluded that using global rating scales is preferable to using a checklist,10 others have demonstrated the value of using both.11,12 Because of its binary nature, a checklist provides “specific, concrete feedback” to the learner.13 We propose the use of the checklist in addition to the global rating scale for formative assessment; however, if time is limited, the global rating scale can be used alone. Procedural checklists have been developed to assess learners’ skills in various procedures, but to our knowledge, no procedural checklist for laceration repair currently exists in the MedEdPORTAL literature.14-27


As pediatric emergency medicine (PEM) providers at an academic institution with multiple visiting residents, we know that it can be difficult to assess trainees’ procedural skills. We therefore aimed to create a way for supervising PEM physicians (attendings and fellows) to train and assess trainees’ technical skills. To do so, we developed a modified OSATS.

While we adapted the present tools from others developed to evaluate surgical skills, the items on our tools were grounded in skills appropriate for the pediatric emergency department. We applied the framework from previously published OSATS tools to develop our tools for laceration repair. The global rating scale was primarily adapted from a previously published and validated surgical global rating scale.10 Common concepts between surgical procedures and laceration repair procedures, such as efficiency in motion and procedural knowledge, were retained, while concepts unique to surgery were discarded and replaced with skills specific to laceration repair. We adapted the checklist from a previously validated checklist for pediatric lumbar puncture technique.10 The two checklists were similar in their detailed descriptions of procedural steps as well as their emphasis on the entirety of the procedure, including field preparation and the discarding of sharps. We based specific content points for the laceration repair procedure, such as how to hold and use equipment, skin entry technique, and suture tail length, on the New England Journal of Medicine’s video on basic laceration repair.28 We also consulted other sources on pediatric laceration repair to validate the use of specific techniques and add additional content, such as volume of water for wound irrigation.29,30 Once the tools had been developed, independent PEM physicians reviewed them for content validity and then piloted them in the clinical setting. We used the resulting feedback to further refine the tools.

As in the surgical OSATS, the two assessment tools serve different, complementary purposes. For the checklist, the emphasis is on performing the correct steps of the procedure in the correct sequence. The global rating scale, on the other hand, emphasizes technique. To achieve a high score, a procedure needs to be performed not only correctly but also adeptly. We adapted both the 5-point scale and many of the descriptors used in each item from the original OSATS tool. We based individual items, particularly pre- and postsuture placement, on the basic laceration repair resources described above.

Prior to implementation, we performed a formal assessment of the validity of both the global rating scale and the checklist through video recordings of residents performing simulated laceration repair procedures. We recruited eight emergency and PEM attendings who had an interest in simulation or medical education and who regularly performed and supervised laceration repair in pediatric patients to participate in the validation study as video reviewers. Between January 2016 and November 2016, we filmed 29 pediatric residents and one family medicine resident at a tertiary care center performing simulated laceration repairs on Simulab Tissue Suture Pads (SKU:TSP-10). The residents were of different training levels, including 14 interns, four second-year residents, and 12 third-year residents. Irrigation supplies, suturing tools, and suture material came from repurposed clinical materials. Each video was evaluated by five of the eight emergency or PEM attending physicians using the checklist and the global rating scale for laceration repair. Evaluators were blinded to the identity of the proceduralist. To estimate agreement between the evaluators, we calculated concordance correlation coefficients (CCCs) for both the global rating scale and checklist scores. We calculated average scores for each resident and compared scores across levels of training and laceration repair experience. For each item, we calculated score ranges for each proceduralist and then calculated the median range for each item.
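The interrater-agreement statistic described above can be illustrated in a few lines. This sketch computes Lin's concordance correlation coefficient for two raters; the study's analysis pooled five raters per video, which requires a multirater generalization of the same idea, and the scores below are made-up illustrative data, not the study's.

```python
# Sketch of Lin's concordance correlation coefficient (CCC) for two raters
# scoring the same set of videos. Two-rater form shown for illustration only.
import statistics

def lins_ccc(x, y):
    """Lin's CCC: agreement between two paired lists of scores."""
    n = len(x)
    mx, my = statistics.fmean(x), statistics.fmean(y)
    # Population (biased) variances and covariance, per Lin's original formula.
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical checklist totals (maximum 18) from two raters for six videos.
rater_a = [12, 14, 15, 11, 16, 13]
rater_b = [13, 14, 14, 12, 15, 12]
print(round(lins_ccc(rater_a, rater_b), 2))  # → 0.8
```

Unlike a plain Pearson correlation, the CCC penalizes systematic differences between raters (the `(mx - my) ** 2` term), so a rater who is consistently harsher than another lowers concordance even if the rank ordering agrees.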

We emailed instructions to supervising physicians regarding how to use the checklist (Appendix A) and the global rating scale (Appendix B) and encouraged them to contact us if questions arose. A list of necessary materials for practicing laceration repair is included in Appendix C. We sent an evaluation form for the global rating scale (Appendix D) to supervising physicians who used the scale. Instructions for the learner, including a clinical scenario, are included in Appendix E. The expectation is that the supervising physician should have knowledge of laceration repair and be considered a technical expert. While evaluators should be familiar with laceration repair, they may need to be oriented to specific criteria of the checklist and global rating scale used to evaluate procedural competency. This can be done via the training video, which was developed based on evaluations of the global rating scale from supervising physicians who used it (Appendix F).


Eight pediatric emergency department and emergency department supervising physicians (attendings and fellows) used the tools to assess video recordings of 30 resident trainees performing laceration repair procedures in the simulation environment. Fourteen interns, four second-year residents, and 12 third-year residents participated in the study. All participants were pediatric residents, with the exception of one family medicine resident.

Our calculated CCCs showed fair concordance across reviewers for both the checklist (0.55; 95% confidence interval [CI], 0.38-0.69) and the global rating scale (0.53; 95% CI, 0.36-0.67). There was no statistically significant difference in either global rating scale or checklist scores by years of training or procedural experience. However, the effect size of years of training as calculated by Cohen’s d was moderate for both the global rating scale and the checklist (Tables 1 and 2). We also noted that the poorest performers (Remedial tier as per Table 3) tended to have low ratings across reviewers.

Table 1. Mean Resident Score for Each Component of the Modified Objective Structured Assessment of Technical Skills by Level of Training

                                            Interns (n = 15)   Seniors (n = 15)
                                            M (SD)             M (SD)             p     Cohen’s d
Global rating scale (maximum score = 60)    45.3 (7.2)         48.5 (4.0)         .15   0.56
Checklist (maximum score = 18)              12.8 (2.3)         14.2 (1.9)
Table 2. Change in Resident Score on Each Component of the Modified Objective Structured Assessment of Technical Skills by Procedural Experience

                                                                   Score Increase   95% Confidence Interval
Increase in score for every 10 previous laceration repair procedures performed:
    Global rating scale (maximum score = 60)                                        −0.1, 6.1
    Checklist (maximum score = 18)                                                  −0.4, 1.9
Table 3. Suggested Tiering of Individual Global Rating Scale Scores

Tier (by Mean Question Score at Cutoff)    Description
High Performance                           High performance on most items
Good Performance                           High performance on some items
                                           Understands the basics but needs oversight on most items
Remedial                                   Needs oversight/instruction on most items
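The effect sizes in Table 1 can be checked against the standard pooled-SD formula for Cohen's d. This sketch plugs in the published global rating scale summary statistics (interns 45.3, SD 7.2; seniors 48.5, SD 4.0; n = 15 per group); it yields roughly 0.55, close to the reported 0.56, with the small difference attributable to rounding of the published means and SDs.

```python
# Cohen's d with pooled standard deviation, applied to the Table 1
# global rating scale summary statistics as a rounding-level check.
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Effect size (m2 - m1) divided by the pooled standard deviation."""
    pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m2 - m1) / pooled

print(round(cohens_d(45.3, 7.2, 15, 48.5, 4.0, 15), 2))  # → 0.55
```

By conventional benchmarks (0.2 small, 0.5 medium, 0.8 large), this supports the paper's characterization of the training-level effect as moderate despite its lack of statistical significance at this sample size.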

Supervising users evaluated the tools using a 5-point Likert scale (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree), with six of 10 users responding. Most users found the tools to be useful, with a median score of 5.0 for “The rating scales assessed areas that I typically assess when determining trainee level of proficiency with laceration repair.” The median score was 4.0 for determining trainee proficiency and for use of the tools for summative assessment, and 4.5 for ease of using the tools for formative assessment. In general, most users felt the checklist was useful as a framework but wanted more training on use of the tools. Specific examples of participant responses to our prompts included the following:

  • List/describe one or more ways the rating scales will change how you do your job:
    • “Liked the order of the OSATS checklist and will help me with my mental checklist when teaching trainees.”
    • “Helpful to have a summary of steps to assess the trainee.”
    • “Provides a nice framework for discussing proper technique with trainees.”
    • “Helpful rubric to have steps for performance and for prep.”
    • “Nice to break down prep and procedure.”
    • “Helpful with newer trainees.”
  • How we could improve the rating scales:
    • “OSATS is more user friendly and intuitive.”
    • “Suggest training the trainer on the scales.”
    • “Consider including the new EPA approach.”
    • “Technique of holding instruments may vary with similar outcomes.”
    • “Few items (can’t recall which) seem institution specific.”
    • “Consider combining the tools into one.”
  • Comments:
    • “Although the 5-point checklist is a little more work to use, it allows for more refined feedback.”
    • “Overall, I preferred the global ratings scale.”


We created a novel resource, including a checklist and a global rating scale, to evaluate trainees’ competence in laceration repair in the emergency department setting. We adapted an original OSATS grounded in widely accepted techniques for performing simple laceration repair. We chose the OSATS model because of its use with surgical trainees’ technical skills and because similar checklists have been utilized with pediatric lumbar puncture.7 Once developed, the tools were refined based on the input of various PEM content experts who used them in clinical practice. Originally, we intended the tools to be used for summative assessment; based on feedback and our validity process, we now recommend them for formative rather than summative assessment.

These tools were piloted in the clinical environment and evaluated in the simulation environment. The advantage of using the global rating scale in the clinical setting is that the proceduralist performs the actual procedure, whereas the simulation environment only approximates it. However, there are barriers to using these tools in the clinical environment, as it may be awkward to complete a checklist in front of a patient and family. We therefore encourage evaluators using the tools in the clinical setting to complete the forms after the procedure, outside of the patient room. Completing the tools postprocedure has its own disadvantages, including the inability to give real-time feedback and delays in form completion that lead to inaccurate assessment. The benefits of using the assessment in the simulation environment include the ability to complete the tools in real time, the opportunity to utilize formative assessment and give feedback during the procedure, and the opportunity for learners to practice the skills discussed.

The global rating scale and checklist are useful for providing a formative assessment of trainees’ technical skills for laceration repair. They not only help identify which trainees have not yet achieved competence in laceration repair but also reveal specific areas of laceration repair in which a trainee needs further instruction and experience (e.g., learners who scored in the Remedial tier for one reviewer tended to receive similar scores from other reviewers). After using these assessment tools, supervising physicians can then target different steps in laceration repair to maximize learning for individual trainees.

Limitations we found in using these tools included recruitment of supervising physicians to use them, difficulty finding time to engage in just-in-time assessments, and limitations in the validity of the scores as described above. Recruiting supervising physicians may be difficult initially, especially when introducing the new tools. However, we found that discussing them at various administrative meetings was helpful in encouraging their use. It can be difficult to find time to engage in just-in-time formative assessment, especially in a busy emergency department; thus, it may be useful to utilize the tools for formative assessment during an emergency department orientation.

A consistent theme of the feedback we received from evaluators was that they desired more training in tool use. When we tested the tools, we did not formally train our trainers, as we hoped the tools were intuitive enough to use in a stand-alone way. It is unclear if training evaluators would result in greater concordance in scores across raters. Nevertheless, to address these concerns, we developed a training video for the global rating scale. We hope that orientation to the scale using this video will allow for more precise evaluation of proceduralists.

In summary, the present tools have varying levels of validity across multiple domains. We believe the tools have a high level of content validity, as they were developed using established sources for procedural evaluation and technique. The tools were reviewed and subjected to modification based on the feedback of content experts. The tools showed less strength in other domains of validity. While learners understood the presented task (response process), there was only fair concordance between raters in scores on formal assessment. There were at least moderate correlations between levels of training and performance on both the global rating scale and checklist, although we did not find these differences to be statistically significant. For these reasons, we recommend the tools be used only for formative feedback, as a way to structure feedback to learners who wish to improve their procedural skills. For the global rating scale, scores were placed into descriptive tiers based on average scores per item (Table 3). If a trainee performs poorly, the skills to perform well can be learned via formative assessment using the tools as a guide.

Prior to the development of this resource, there existed no validated method to evaluate resident laceration repair performance in the emergency department. This resource represents the first iteration of a process to develop valid tools for summative evaluation of laceration repair techniques. Future attempts at tool refinement should look at more formal training for evaluators and at revising the individual items on both the global rating scale and the checklist to allow for more precise evaluation and more consistent scoring across evaluators. However, we believe this resource is a good starting point for more objective evaluation of laceration repair performance and can serve as a basis for directed formative feedback for trainees.

Author Information

  • Suzanne Seo, MD: Pediatric Emergency Medicine Fellow, Seattle Children's Hospital; Pediatric Emergency Medicine Fellow, University of Washington School of Medicine
  • Anita Thomas, MD, MPH: Assistant Professor, Department of Pediatrics, Division of Emergency Medicine, University of Washington School of Medicine; Assistant Professor, Department of Pediatrics, Division of Emergency Medicine, Seattle Children’s Hospital
  • Neil G. Uspal, MD: Associate Professor, Department of Pediatrics, Division of Emergency Medicine, University of Washington School of Medicine; Associate Professor, Department of Pediatrics, Division of Emergency Medicine, Seattle Children’s Hospital

The authors would like to thank Rebekah A. Burns, MD, Maya A. Jones, MD, Isabel T. Gross, MD, Ryan D. Kearney, MD, Rachel E. Whitney, MD, Julie Uspal, MD, Nancy Gove, PhD, and Jennifer R. Reid, MD, for their assistance in data collection and analysis. The authors would also like to acknowledge the Center for Leadership in Medical Education at the University of Washington for its support of this work through a Small Grant Program grant.

Disclosures
None to report.

Funding/Support
None to report.

Prior Presentations
Uspal N, Thomas A, Burns R, et al. The validity of a global ratings scale and checklist for evaluation of pediatric laceration repair. Poster presented at: 9th International Pediatric Simulation Symposia and Workshops; June 1-3, 2017; Boston, MA.

Ethical Approval
Seattle Children’s Institutional Review Board approved this study.


  1. Beeson MS, Vozenilek JA. Specialty milestones and the Next Accreditation System: an opportunity for the simulation community. Simul Healthc. 2014;9(3):184-191. https://doi.org/10.1097/SIH.0000000000000006
  2. Accreditation Council for Graduate Medical Education. ACGME Program Requirements for Graduate Medical Education in Pediatrics. Accreditation Council for Graduate Medical Education website. https://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/320pediatrics2016.pdf. Published September 30, 2012. Revised July 1, 2016. Accessed May 17, 2017.
  3. Ben-Isaac E, Keefer M, Thompson M, Wang VJ. Assessing the utility of procedural training for pediatrics residents in general pediatric practice. J Grad Med Educ. 2013;5(1):88-92. https://doi.org/10.4300/JGME-D-11-00255.1
  4. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Residents’ procedural experience does not ensure competence: a research synthesis. J Grad Med Educ. 2017;9(2):201-208. https://doi.org/10.4300/JGME-D-16-00426.1
  5. Aggarwal R, Darzi A. Technical-skills training in the 21st century. N Engl J Med. 2006;355(25):2695-2696. https://doi.org/10.1056/NEJMe068179
  6. Gaies MG, Landrigan CP, Halfer JP, Sandora TJ. Assessing procedural skills training in pediatric residency programs. Pediatrics. 2007;120(4):715-722. https://doi.org/10.1542/peds.2007-0325
  7. Iyer MS, Santen SA, Nypaver M, et al. Assessing the validity evidence of an objective structured assessment tool of technical skills for neonatal lumbar punctures. Acad Emerg Med. 2013;20(3):321-324. https://doi.org/10.1111/acem.12093
  8. Goff BA, Nielsen PE, Lentz GM, et al. Surgical skills assessment: a blinded examination of obstetrics and gynecology residents. Am J Obstet Gynecol. 2002;186(4):613-617. https://doi.org/10.1067/mob.2002.122145
  9. Siddiqui NY, Stepp KJ, Lasch SJ, Mangel JM, Wu JM. Objective structured assessment of technical skills for repair of fourth-degree perineal lacerations. Am J Obstet Gynecol. 2008;199(6):676.e1-676.e6. https://doi.org/10.1016/j.ajog.2008.07.054
  10. Gerard JM, Kessler DO, Braun C, Mehta R, Scalzo AJ, Auerbach M. Validation of global rating scale and checklist instruments for the infant lumbar puncture procedure. Simul Healthc. 2013;8(3):148-154. https://doi.org/10.1097/SIH.0b013e3182802d34
  11. Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative “bench station” examination. Am J Surg. 1997;173(3):226-230. https://doi.org/10.1016/S0002-9610(97)89597-9
  12. van Hove PD, Tuijthof GJM, Verdaasdonk EGG, Stassen LPS, Dankelman J. Objective assessment of technical surgical skills. Br J Surg. 2010;97(7):972-987. https://doi.org/10.1002/bjs.7115
  13. Chipman JG, Schmitz CC. Using objective structured assessment of technical skills to evaluate a basic skills simulation curriculum for first-year surgical residents. J Am Coll Surg. 2009;209(3);364-370. https://doi.org/10.1016/j.jamcollsurg.2009.05.005
  14. Sall D, Wigger GW, Kinnear B, Kelleher M, Warm E, O’Toole JK. Paracentesis simulation: a comprehensive approach to procedural education. MedEdPORTAL. 2018;14:10747. https://doi.org/10.15766/mep_2374-8265.10747
  15. Lypson M, Buckler S, Poszywak K. Aseptic technique. MedEdPORTAL. 2015;11:10237. https://doi.org/10.15766/mep_2374-8265.10237
  16. Messina F, Wilbur L, Bartkus E, Cooper D, Huffman G. A fresh frozen cadaver procedure laboratory. MedEdPORTAL. 2008;4:794. https://doi.org/10.15766/mep_2374-8265.794
  17. Cousar J, Bohanske M, Hill J. Transvenous cardiac pacemaker educational resource. MedEdPORTAL. 2015;11:10107. https://doi.org/10.15766/mep_2374-8265.10107
  18. Sullivan M, Sullivan M, Baker C, Talving P, Inaba K. A cognitive-task-analysis informed central venous catheter placement curriculum. MedEdPORTAL. 2012;8:9135. https://doi.org/10.15766/mep_2374-8265.9135
  19. Freeman M, Wathen P, Williams J, Zhang M. Teaching incision and drainage of abscess. MedEdPORTAL. 2014;10:9736. https://doi.org/10.15766/mep_2374-8265.9736
  20. Sheets L, Bretl D. The SA pocket tool. MedEdPORTAL. 2012;8:9165. https://doi.org/10.15766/mep_2374-8265.9165
  21. Auerbach M, Chang T, Fein D, et al. A comprehensive infant lumber puncture novice procedural skills training package: an INSPIRE simulation-based procedural skills training package. MedEdPORTAL. 2014;10:9724. https://doi.org/10.15766/mep_2374-8265.9724
  22. Keim Janssen S, VanderMeulen SP, Brown D. Hands-on lightly embalmed cadaver lab for teaching knee aspiration/injection. MedEdPORTAL. 2012;8:9187. https://doi.org/10.15766/mep_2374-8265.9187
  23. Judy K, Webster K. A checklist to document sedation competency. MedEdPORTAL. 2008;4:754. https://doi.org/10.15766/mep_2374-8265.754
  24. Sawyer T, Creamer K, Puntel R, et al. Pediatric procedural skills training curriculum. MedEdPORTAL. 2010;6:8094. https://doi.org/10.15766/mep_2374-8265.8094
  25. Kobayashi L, Overly F, Gosbee J. Emergency department procedural sedation simulation package (SLIPSTREAM program scenarios A+B). MedEdPORTAL. 2012;8:8220. https://doi.org/10.15766/mep_2374-8265.8220
  26. Butler-O’Hara M, Reininger A, Dadiz R. Training in placement of peripherally inserted central catheters in the neonate. MedEdPORTAL. 2014;10:9780. https://doi.org/10.15766/mep_2374-8265.9780
  27. Acton R, Schmitz C, Chipman J, et al. University of Minnesota surgical clerkship simulation skills curriculum and instructor guide. MedEdPORTAL. 2010;6:7948. https://doi.org/10.15766/mep_2374-8265.7948
  28. Thomsen TW, Barclay DA, Setnik GS. Basic laceration repair [video]. N Engl J Med. 2006;355(17):e18. https://doi.org/10.1056/NEJMvcm064238
  29. Cimpello LB, Deutsch RJ, Dixon C, et al. Illustrated techniques of pediatric emergency procedures. In: Fleisher GR, Ludwig S, eds. Textbook of Pediatric Emergency Medicine. 6th ed. Philadelphia, PA: Lippincott Williams & Wilkins; 2010:1744-1842.
30. McNamara R, DeAngelis M. Laceration repair with sutures, staples, and wound closure tape. In: King C, Henretig FM, eds. Textbook of Pediatric Emergency Procedures. 2nd ed. Philadelphia, PA: Lippincott Williams & Wilkins; 2008:1018-1044.


Seo S, Thomas A, Uspal NG. A global rating scale and checklist instrument for pediatric laceration repair. MedEdPORTAL. 2019;15:10806. https://doi.org/10.15766/mep_2374-8265.10806

Received: August 15, 2018

Accepted: January 16, 2019