Original Publication
Open Access

Standardized Checklist for Otoscopy Performance Evaluation: A Validation Study of a Tool to Assess Pediatric Otoscopy Skills

Published: August 5, 2016 | 10.15766/mep_2374-8265.10432

Appendices

  • SCOPE Checklist.docx
  • SCOPE Checklist & Instructions.docx

All appendices are peer reviewed as integral parts of the Original Publication.

To view all publication components, extract (i.e., unzip) them from the downloaded .zip file.


Abstract

Introduction: Acute otitis media (AOM) is the most frequently diagnosed pediatric illness in the United States and the most common indication for antibiotic prescription. Skill in pediatric otoscopy is essential to correctly identify children with AOM. However, studies have found diagnostic inconsistency and significant overdiagnosis among practitioners. Given the potential public and individual health consequences, there has been a call for improved education regarding the diagnostic certainty of AOM. Yet educational efforts continue to be limited, particularly in regard to competency assessment. The lack of a validated tool to assess otoscopy skill attainment objectively diminishes the instructor’s ability to provide useful feedback and direction to the learner. Methods: We have undertaken an educational intervention with the goal of developing a validated Standardized Checklist for Otoscopy Performance Evaluation (SCOPE), building on key principles of the general pediatric ear exam. The SCOPE was developed with the input of process and content experts, with attention to specific domains of validity. Results: Our analysis provides important validity evidence for the SCOPE assessment tool. The instrument was piloted and successfully implemented with medical students and varying levels of residents in pediatrics and emergency medicine over a 5-year period in varied settings: urgent care, large and small pediatric clinics, and the emergency departments of two institutions. It has been used for both instruction and assessment purposes. Discussion: Because the SCOPE can be used for teaching and demonstration purposes, in formative and summative assessment settings, and across the continuum of learners, it offers the potential to expand educational efforts in assessment during direct patient care. We anticipate that the SCOPE will foster an environment of efficient yet high-yield review and discussion of otoscopy and diagnostic competency.


Educational Objectives

With appropriate use of the Standardized Checklist for Otoscopy Performance Evaluation, learners will be able to:

  1. Demonstrate a competent technique for the general ear exam for a pediatric patient.
  2. Demonstrate a competent technique for cerumen removal for a pediatric patient.
  3. Demonstrate a competent technique for pneumatic otoscopy for a pediatric patient.

Introduction

Acute otitis media (AOM) is the most frequently diagnosed pediatric illness in the United States and the most common indication for antibiotic prescription.1,2 Skill in pediatric otoscopy is essential to correctly identify children with AOM. However, studies have found diagnostic inconsistency among practitioners and significant overdiagnosis of AOM. This has resulted in an increased incidence of antimicrobial resistance and higher health care costs due to unnecessary antibiotic prescriptions and surgical referrals.3

Given the potential public and individual health consequences, there has been a call for improved education regarding the diagnostic certainty of AOM.4 Revised clinical guidelines from the American Academy of Pediatrics (AAP) specify that “educational and dissemination methods both at the practicing physician level and especially at the resident level need to be examined.”3 Furthermore, the AAP’s recommendations reinforce the importance of ongoing education, with instruction beginning early in medical school and continuing throughout postgraduate training.3,5 In response to this call for improved competency in the evaluation and diagnosis of AOM, otoscopy curricula are emerging in the literature. Yet educational efforts continue to be limited, particularly in regard to competency assessment.2,6-18

The lack of a validated tool to assess otoscopy skill attainment objectively diminishes the instructor’s ability to provide useful feedback and direction to the learner. We undertook an educational intervention with the goal of developing a validated Standardized Checklist for Otoscopy Performance Evaluation (SCOPE), building on key principles of the general pediatric ear exam. This assessment tool enables instructors to determine a learner’s level of competency across a range of clinical and simulated learning environments, covering the continuum of trainee levels and specialties.

The checklist was developed within the framework of Glassick’s scholarship criteria and Kirkpatrick’s levels of evaluation, and it can be disseminated for other sound scholarship purposes.19,20 The checklist provides defined and clear goals for learner assessment. It has been tested in, and can be used with appropriate scholarship in, at least the first three tiers of Kirkpatrick’s levels of evaluation.20 Content experts and curriculum process experts contributed to the itemization of the instrument. The SCOPE is found in Appendix A; a version of the checklist containing facilitator instructions is provided in Appendix B.

Methods

An extensive search of the pediatrics, otolaryngology, simulation, international, and ancillary (nursing) literature was performed and updated over the course of 2 years to identify sources of expert content on which to base the checklist instrument. Two key resources containing expert content emerged from the review: Kaleida et al.’s “Mastering Diagnostic Skills: Enhancing Proficiency in Otitis Media, a Model for Diagnostic Skills Training,”10 and Shaikh, Hoberman, Kaleida, Ploof, and Paradise’s “Diagnosing Otitis Media—Otoscopy and Cerumen Removal.”11 From these resources, as well as from content experts in pediatric otoscopy, general pediatrics, and otolaryngology, points in the continuum of proficiency for the following three domains were identified and deliberated upon: (1) general approach to the pediatric ear exam, (2) cerumen removal, and (3) pneumatic otoscopy.

The first instrument subsection, on the general examination, was initially developed with a target audience of third-year pediatric clerkship medical students. This subsection was designed to serve as the essential foundation for becoming proficient in pediatric otoscopy and should be used with all levels of learners, since proficiency and competency cannot be assumed.

The instrument was piloted for a period of 6 months over four clerkship rotations prior to actual implementation. The instrument was then utilized in various direct patient care settings with third-year medical students for a period of 2 years. For over 4 years, the instrument has been used in pediatric otoscopy student workshop sessions during demonstrations and as students practiced their otoscopy skills on both manikins and each other.

The second and third subsections of the instrument, on cerumen removal and pneumatic otoscopy, respectively, were developed as more advanced skills in the longitudinal learning of pediatric otoscopy. Again, input from the aforementioned content and process experts was used to develop these two subsections of the checklist. Specifically for pneumatic otoscopy, a validated novel ear simulator was used for development of the checklist and then for teaching, evaluation, and validity purposes.21

General pediatricians from community settings and academic centers also contributed to the final phases of checklist development. The instrument was pilot tested over a 1-year period with intern and senior residents in urgent care and emergency department settings and in ambulatory general pediatric settings of various sizes and patient populations.

Implementation

  1. The intent of the checklist is for integration as the assessment component in the instructor’s current otoscopy skills curriculum. There is an assumed level of competency for the assessor. However, if assessors wish to gain additional competence in this area, we favor review of Shaikh et al.’s otoscopy videos11 and Kaleida et al.’s images10 as excellent resources for curriculum content. Additional resources may be used for this purpose if needed. The time needed to review this material depends on the assessor’s prior experience and has ranged from 2 to 8 hours.
  2. The examinees should have undergone training in the pediatric ear exam and otoscopy as per the instructor’s/institution’s curriculum. The assessment may be performed at any point along the learner continuum. However, we recommend initial evaluation immediately following administration of the curriculum. Repeat assessments may be timed either upon completion of a clinical rotation (e.g., the medical student pediatric clerkship) or prior to the subsequent postgraduate year or graduation.
  3. The assessment may be used in a variety of settings. We have utilized it to evaluate pediatric ear exam and otoscopy skills on simulated manikins, on OSCE patients, and during actual patient encounters. In all instances, as a direct observation tool, it requires no significant additional resources.
  4. The assessment tool requires approximately 5 to 10 minutes for direct observation of the learner and completion of the checklist. One evaluator is necessary for each trainee. To maintain maximal reliability during formative/summative evaluations, the same evaluator should be utilized for reassessments of a single trainee.
  5. The assessment instrument is completed via a combination of yes/no check boxes and brief fill-in-the-blank questions. The final component of the assessment requires the instructor to perform her/his own examination to confirm the diagnosis.
  6. The components of the assessment include (a) discussion with caregiver, (b) selection of equipment, (c) positioning of the patient, (d) distraction techniques, (e) otoscopy exam technique (including insertion and stabilization techniques, technique of cerumen removal if indicated, and pneumatic otoscopy), and (f) diagnosis.
  7. Review of the assessment results between instructor and trainee may be performed at an appropriately scheduled time. The review is based on checklist findings and provides concrete feedback on areas of competency and skill deficiency. Deficient areas may be addressed through review of the instructor’s curriculum or provision of additional resources, such as Shaikh et al.’s otoscopy videos11 or other suitable materials.
  8. Examples of use: Videos of pediatric patient ear exams were developed to assess inter- and intrarater reliability for validity purposes (see the Results section). These videos are not needed to teach use of the checklist.

Results

Content Validity
Content experts, pediatric otoscopy experts, and general pediatricians from diverse clinical practices contributed to the development of the checklist, providing for evidence of content validity.10,11 The checklist focuses on the general examination, pneumatic otoscopy, and cerumen removal. Pertinent well-established and peer-reviewed principles of pediatric otoscopy were gleaned from the expert content sources and incorporated into the checklist’s final form.

The checklist was developed with facilitated feedback collected over a 2-year period from stakeholders such as faculty, clinicians, and learners. During this period, the checklist was shown to match the intended construct and to be feasible to use.

The instrument was demonstrated to be feasible with various groups of learners: medical students (n = 83), pediatric interns at the beginning and end of their intern year across two institutions (n = 40), pediatric senior residents (n = 14), and emergency medicine residents (n = 12) over a 3- to 4-year period at two separate institutions. The instrument was also demonstrated to be feasible in a variety of clinical settings, including general pediatric clinics, pediatric urgent care clinics, and a pediatric emergency medicine department.

Furthermore, the instrument was found to be useful among a variety of general pediatric clinic preceptors across a number of different clinic sizes and settings (e.g., community vs. academic). Preceptors at two separate pediatric institutions who worked with emergency medicine faculty also found the instrument useful. An institutional review board–approved survey of the general pediatrics preceptors demonstrated feasibility of the instrument with their students. Of surveyed faculty, 100% reported that the instrument did not interfere with patient care or with teaching their students, 88% reported ease of checklist implementation in their teaching clinics, and 75% reported that the instrument improved their direct observation of their students.22

Internal Structure
Reproducibility of the checklist: The checklist was successfully implemented in different clinical settings, such as general pediatric clinics, pediatric urgent care clinics, and a pediatric emergency medicine department, over a 2-year period. Clinics varied in number of preceptors on site (ranging between two and eight), setting (community vs. academic), and patient population (socioeconomic status, ethnicity, etc.).

The scores on the general examination subsection of the checklist significantly and appropriately increased with advancing levels of learners (see Tables 1 and 2). The scores on the general examination, pneumatic otoscopy, and cerumen removal subsections also significantly increased between the first and second years for both pediatric and emergency medicine residents (see Table 3).23

Table 1. General Examination Scores by Learner Group as Percentages of the Maximum Score

Learner Group^a                       Score, M (SD)
Medical student (n = 83)              45.9% (17.8%)
Preinternship year (n = 10)           40.4% (17.8%)
Postinternship year (n = 9)           75.9% (17.4%)
Postpediatric level 2 year (n = 9)    88.9% (11.8%)

^a Samples include carefully tracked cohorts with similar exposure.
Table 2. Comparison of General Examination Scores Between Learner Groups

Comparison of Learner Groups                          Effect Size^a    p^b
Preinternship year vs. postinternship year            1.7              <.001
Postinternship year vs. postpediatric level 2 year    0.9              .002
Medical student vs. postinternship year               1.7              <.001

^a Cohen's d.
^b Student's t test.
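For readers interpreting the effect sizes in Table 2, Cohen's d expresses the difference between two group means in units of their pooled standard deviation. The article does not state which variant was computed; the standard pooled-SD form is

d = \frac{M_1 - M_2}{s_p}, \qquad s_p = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}.

By common convention, d values of approximately 0.8 or greater indicate a large effect, so the reported values of 0.9 and 1.7 represent large to very large differences between learner groups.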

Reliability of the checklist: Ten faculty preceptors utilized the checklist to assess standardized ear exam skills demonstrated with real pediatric patients in videos developed specifically for this purpose; correct answers for each video were established a priori. Preceptors were asked to assess the examiner's otoscopy skills according to the checklist, and their scores were analyzed for accuracy against the a priori answers. The checklist demonstrated high accuracy, with faculty preceptors achieving 95%-97% correct responses when using the checklist to assess the standardized ear exams.24

Table 3. Subsection Scores as Percentages of Total Correct Compared Across First and Second Years of Each Residency

Subsection and Residency Group    Year 1 Score, M (SD)    Year 2 Score, M (SD)    p^a
General exam
  Combined                        41.2% (23.2%)           94.8% (7.7%)            <.001
  Pediatrics                      21.7% (6.1%)            98.3% (5.3%)            <.001
  Emergency medicine              57.4% (19.2%)           91.7% (8.3%)            <.001
Pneumatic otoscopy
  Combined                        42.4% (33.9%)           79.4% (15.7%)           <.001
  Pediatrics                      35.6% (10.7%)           76.7% (14.1%)           <.001
  Emergency medicine              48.1% (44.6%)           81.8% (17.4%)           .025
Cerumen removal
  Combined                        3.3% (14.1%)            83.6% (15.0%)           <.001
  Pediatrics                      2.0% (4.1%)             79.0% (14.5%)           <.001
  Emergency medicine              4.4% (18.9%)            95.0% (10.0%)           <.001

^a Student's t test.
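The p values in Tables 2 and 3 derive from Student's t test. The article does not state whether independent-samples or paired forms were applied; for two independent groups, the familiar statistic is

t = \frac{M_1 - M_2}{s_p \sqrt{1/n_1 + 1/n_2}},

with n_1 + n_2 - 2 degrees of freedom, where s_p is the pooled standard deviation given above.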

An intraclass correlation statistic was used to assess the interobserver reliability of the checklist among the five preceptors who evaluated students in the real-time patient setting using the standardized ear exams. By standard criteria, the intraclass correlation of 0.8 indicated an excellent level of interobserver reliability for the checklist.23
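The article does not report which intraclass correlation (ICC) model was used. As a point of reference only, a common one-way random-effects form, computed from analysis-of-variance mean squares when k raters score the same set of exams, is

\mathrm{ICC} = \frac{MS_B - MS_W}{MS_B + (k - 1)\,MS_W},

where MS_B is the between-subjects mean square and MS_W is the within-subjects mean square. By widely used criteria, values above 0.75 indicate excellent agreement, consistent with the 0.8 reported here.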

Response Process
Faculty can become familiar with the checklist’s content through assessor training (see the Implementation subsection of the Methods section above), and learners can do so through various teaching venues. The checklist has been successfully demonstrated in teaching settings, such as student labs and resident seminars at multiple institutions. A think-aloud approach was used during the development process among academic pediatricians from different clinical practices, pediatric otoscopy experts, and learners. Assessors have been shown to be consistent in how they assess their learners (see the Reliability of the checklist subsection above).

Relationships to Other Variables
One weakness in the development of this instrument is the inability to fully assess its relationship to other variables. The appropriate and significant increase in scores with advancing levels of learners suggests some validity evidence for relationships to other variables. Further work is needed in this domain.

Consequences
The checklist was designed primarily for formative assessment, but it may be adapted for summative assessment. The construct is remediable, and the skills can be learned. The checklist facilitates feedback tied to specific content items, making that feedback more effective. Peer-reviewed resources may be consulted to aid in feedback.10,11 The checklist has been used as a teaching/demonstration/practice tool in curriculum studies that demonstrated gains in learners’ clinical skills.25

Discussion

The need to learn and become proficient at the pediatric ear exam has been critically linked to accurate diagnosis of AOM, judicious use of antibiotics, and control of health care costs, including subspecialty care.3 The literature is beginning to describe curricula for various learners, but there are no validated assessment instruments for pediatric otoscopy skills.

The SCOPE helps fill this gap regarding standardized assessment of pediatric otoscopy skills including cerumen removal and pneumatic otoscopy. That being said, use of the checklist in a clinical setting is not without certain limitations. Specifically, limitations inherent to checklists include the need to appropriately determine the relevant skills to be observed, potential recorder bias, and the need for clear observation of the skill by the instructor. Furthermore, the instructor must be familiar with the constructs of proper otoscopy skills and appropriately budget additional time to perform, review, and discuss results. Finally, parental agreement and cooperation are needed when the checklist is used in a clinical environment.

We have attempted to address these limitations through the validation of observed skills and diagnosis. To address instructor expertise, we offered standardized resources to improve the instructor’s own competence. The checklist was specifically created with time constraints in mind and is not overly detailed, resulting in very little additional time being needed to perform the assessment.

The SCOPE has undergone extensive critique with regard to content and other domains of validity, as well as clinical relevance, across varying learner levels (medical students, pediatric interns, pediatric senior residents, and emergency medicine residents) and varied clinical settings (small and large general pediatric clinics, a pediatric urgent care clinic, and a pediatric emergency room) in two medical institutions over a 5-year period. Because it can be used for teaching and demonstration purposes, in formative and summative assessment settings, and across the continuum of learners, this instrument offers the potential to expand educational efforts in assessment during direct patient care. We anticipate that the SCOPE will foster an environment of efficient yet high-yield review and discussion of otoscopy and diagnostic competency.


Author Information

  • Caroline R. Paul, MD: Assistant Professor (CHS), Department of Pediatrics, University of Wisconsin School of Medicine and Public Health
  • Meg G. Keeley, MD: Professor, Department of Pediatrics, University of Virginia School of Medicine
  • Gregory Rebella, MD: Assistant Professor (Clinical), Department of Emergency Medicine, University of Wisconsin School of Medicine and Public Health
  • John G. Frohna, MD, MPH: Professor, Departments of Pediatrics and Medicine, University of Wisconsin School of Medicine and Public Health

Disclosures
None to report.

Funding/Support
None to report.

Ethical Approval
This publication contains data obtained from human subjects and received ethical approval.


References

  1. Hoberman A, Paradise JL, Rockette HE, et al. Treatment of acute otitis media in children under 2 years of age. N Engl J Med. 2011;364(2):105-115. http://dx.doi.org/10.1056/NEJMoa0912254
  2. Block SL. Improving the diagnosis of acute otitis media: “seeing is believing.” Pediatr Ann. 2013;42(12):485-490. http://dx.doi.org/10.3928/00904481-20131122-05
  3. Lieberthal AS, Carroll AE, Chonmaitree T, et al. The diagnosis and management of acute otitis media. Pediatrics. 2013;131(3):e964-e999. http://dx.doi.org/10.1542/peds.2012-3488
  4. Shaikh N, Hoberman A. Update: acute otitis media. Pediatr Ann. 2010;39(1):28-33. http://dx.doi.org/10.3928/00904481-20091222-03
  5. Pichichero ME, Poole MD. Comparison of performance by otolaryngologists, pediatricians, and general practitioners on an otoendoscopic diagnostic video examination. Int J Pediatr Otorhinolaryngol. 2005;69(3):361-366. http://dx.doi.org/10.1016/j.ijporl.2004.10.013
  6. Rosenfeld RM. Diagnostic certainty for acute otitis media. Int J Pediatr Otorhinolaryngol. 2002;64(2):89-95. http://dx.doi.org/10.1016/S0165-5876(02)00073-3
  7. Varrasso DA. Otitis media: the need for a new paradigm in medical education. Pediatrics. 2006;118(4):1731-1733. http://dx.doi.org/10.1542/peds.2005-2794
  8. Yudkowsky R, Park YS, Lineberry M, Knox A, Ritter EM. Setting mastery learning standards. Acad Med. 2015;90(11):1495-1500. http://dx.doi.org/10.1097/ACM.0000000000000887
  9. Pichichero ME. Diagnostic accuracy, tympanocentesis training performance, and antibiotic selection by pediatric residents in management of otitis media. Pediatrics. 2002;110(6):1064-1070.
  10. Kaleida PH, Ploof DL, Kurs-Lasky M, et al. Mastering diagnostic skills: Enhancing Proficiency in Otitis Media, a model for diagnostic skills training. Pediatrics. 2009;124(4):e714-e720. http://dx.doi.org/10.1542/peds.2008-2838
  11. Shaikh N, Hoberman A, Kaleida PH, Ploof DL, Paradise JL. Diagnosing otitis media—otoscopy and cerumen removal. N Engl J Med. 2010;362(20):e62. http://dx.doi.org/10.1056/NEJMvcm0904397
  12. Shaikh N, Hoberman A, Kaleida PH, et al. Otoscopic signs of otitis media. Pediatr Infect Dis J. 2011;30(10):822-826. http://dx.doi.org/10.1097/INF.0b013e31822e6637
  13. Rosenkranz S, Abbott P, Reath J, Gunasekera H, Hu W. Promoting diagnostic accuracy in general practitioner management of otitis media in children: findings from a multimodal, interactive workshop on tympanometry and pneumatic otoscopy. Qual Prim Care. 2012;20(4):275-285.
  14. Al-Khatib T, Fanous A, Al-Saab F, Sewitch M, Razack S, Nguyen LH. Pneumatic video-otoscopy teaching improves the diagnostic accuracy of otitis media with effusion: results of a randomized controlled trial. J Otolaryngol Head Neck Surg. 2010;39(6):631-634.
  15. Campisi P, Asaria J, Brown D. Undergraduate otolaryngology education in Canadian medical schools. Laryngoscope. 2008;118(11):1941-1950. http://dx.doi.org/10.1097/MLG.0b013e31818208e7
  16. Error ME, Wilson KF, Ward PD, Gale DC, Meier JD. Assessment of otolaryngic knowledge in primary care residents. Otolaryngol Head Neck Surg. 2013;148(3):420-424. http://dx.doi.org/10.1177/0194599812472314
  17. Glicksman JT, Brandt MG, Parr J, Fung K. Needs assessment of undergraduate education in otolaryngology among family medicine residents. J Otolaryngol Head Neck Surg. 2008;37(5):668-675.
  18. MacClements JE, Parchman M, Passmore C. Otitis media in children: use of diagnostic tools by family practice residents. Fam Med. 2002;34(8):598-603.
  19. Glassick CE, Huber MT, Maeroff GI. Scholarship Assessed: Evaluation of the Professoriate. San Francisco, CA: Jossey-Bass; 1997.
  20. Kirkpatrick D. Evaluation of training. In: Craig RL, ed. Training and Development Handbook: A Guide to Human Resource Development. 2nd ed. New York, NY: McGraw Hill; 1976.
  21. Morris E, Kesser BW, Peirce-Cottler S, Keeley M. Development and validation of a novel ear simulator to teach pneumatic otoscopy. Simul Healthc. 2012;7(1):22-26. http://dx.doi.org/10.1097/SIH.0b013e31822eac39
  22. Paul C, Gjerde C, McIntosh G. Preceptors’ perceptions of use of an evaluation tool for medical students in routine clinical encounters with real patients. In: Proceedings of the Annual Meeting of the Pediatric Academic Societies; April 30-May 3, 2011; Denver, CO. Abstract 4514.
  23. Paul C, Keeley M. Development of an otoscopy checklist with evidence of validity. In: Proceedings of the Annual Meeting of the Pediatric Academic Societies; April 25-28, 2015; San Diego, CA. Abstract 752257.
  24. Paul C, McIntosh G, Ellis R. Reliability of a checklist used for assessment of pediatric otoscopy skills. In: Proceedings of the Annual Meeting of the Council on Medical Student Education in Pediatrics; April 10-13, 2013; Nashville, TN. Abstract 1143.
  25. Keeley MG, Paul CR, Rebella GS. Competency-based, multimodal assessment of pediatric otoscopy skills: a multi-institutional, cross-specialty study. In: Proceedings of the Association of American Medical Colleges Medical Education Meeting; November 6-7, 2014; Chicago, IL.


Citation

Paul CR, Keeley MG, Rebella G, Frohna JG. Standardized Checklist for Otoscopy Performance Evaluation: a validation study of a tool to assess pediatric otoscopy skills. MedEdPORTAL. 2016;12:10432. https://doi.org/10.15766/mep_2374-8265.10432

Received: February 8, 2016

Accepted: July 5, 2016