Original Publication
Open Access

An Introductory, Computer-Based Learning Module for Interpreting Noncontrast Head Computed Tomography

Published: June 1, 2018 | 10.15766/mep_2374-8265.10721

Appendices

  • Head CT Module.pptx
  • Test Version 1.docx
  • Test Version 1 Answer Key.docx
  • Test Version 2.docx
  • Test Version 2 Answer Key.docx
  • Survey.docx

All appendices are peer reviewed as integral parts of the Original Publication.

To view all publication components, extract (i.e., unzip) them from the downloaded .zip file.


Abstract

Introduction: New radiology and other residents must quickly assimilate a vast amount of anatomic and pathologic information when learning to interpret noncontrast head computed tomography (CT). No interactive, computer-based module using a search-pattern approach to provide new residents with the groundwork for interpretation of noncontrast head CT previously existed. Methods: We developed such a learning module using PowerPoint. First-year radiology residents completed the module prior to their neuroradiology rotation, and neurology residents completed it during orientation. Residents took 20-question pre- and posttests to assess knowledge and a postmodule survey. Each resident was randomized to one of two pretests and took the opposite as the posttest. Scores were collected over 5 years for radiology residents and 4 years for neurology residents. Statistical analysis of scores was performed using t tests. Results: Forty-seven first-year radiology residents and 31 neurology residents completed the module and the pre- and posttests. Scores for all residents either stayed the same or increased, regardless of the order of the versions of the pre- or posttests; the mean score increase was 4 out of 20 (p < .0001). Radiology residents had higher mean scores than neurology residents on both the pre- and posttests, differences that were statistically significant (p < .04 and p < .0004, respectively). Feedback on the survey was overwhelmingly positive. Discussion: This computerized learning module is effective for teaching basic interpretation skills to new radiology and neurology residents. The module allows for asynchronous, programmed learning and the use of a step-by-step search-pattern approach.


Educational Objectives

By the end of this activity, learners will be able to:

  1. Utilize a search pattern when interpreting noncontrast head computed tomography (CT).
  2. Identify normal anatomy on a noncontrast head CT.
  3. Recognize common emergent pathology on a noncontrast head CT.

Introduction

Most graduates of U.S. medical schools have limited background in image interpretation.1 This is particularly troublesome for new radiology residents, who transition from undergraduate to postgraduate training with little knowledge of their field, are immediately thrust into their jobs in the reading room, and within a year will be taking independent call and interpreting complicated cross-sectional imaging studies. On the other hand, while residents in other fields do not have the same level of focus on imaging, they will eventually be met with the challenge of reviewing imaging studies for their patients. Similar to radiology residents, they must build their skills from the ground up with limited time and often no formal training. In both specialties, these skills are assessed during residency by clinical competency committees, which evaluate every resident in predetermined milestone domains at least twice a year.2

It is not surprising that those with less formal training in image interpretation do not perform as well as those with more experience. Numerous studies have evaluated discrepancy rates between radiology residents and faculty, and the rates are significantly higher for first-year (postgraduate year two, or PGY-2) radiology residents than for more senior residents.3-6 Similarly, studies have shown that other clinicians (e.g., emergency room physicians and neurosurgical trainees) have a high discrepancy rate when compared with radiologists’ reports.7-9 Some studies have shown that these discrepancies are reduced after clinicians complete formalized tutorials.10,11 Along these lines, some groups have attempted to implement more formalized introductory training for their radiology or other residents. One study showed that an introductory curriculum is helpful in preparing new radiology residents to take call.12 Another group organized orientation lecture sessions for first-year radiology residents.13 A third study showed that a checklist approach is useful for otolaryngology residents who are learning to interpret sinus computed tomography (CT) studies.14

At our institution, radiology residents begin previewing radiologic studies and dictating radiologic reports within the first week of their radiology training, and thus, residents rely on their own background knowledge or on self-directed reading to prepare for the beginning of each rotation. Residents have often requested short, introductory learning resources that could be completed independently and immediately before their various rotations. These learning tools could provide the residents with a foundation in radiologic anatomy and pathology, as well as a search pattern, with which to approach their work from the first day. Similar computerized modules used to teach radiologic anatomy and pathology have been previously published in MedEdPORTAL.15-17 We sought to build a computerized module for the neuroradiology rotation, which entails particularly complex anatomy as well as advanced imaging modalities, such as CT, from the first day of training.

Our module is a stand-alone, image-rich, interactive learning module for interpreting emergent noncontrast head CTs for radiology residents and residents in other fields in which head CTs are common (neurology, neurosurgery, emergency medicine, etc.). This module is designed to give residents a basic search pattern for interpreting head CT and to introduce them to common anatomy and pathology. The module uses programmed learning so that the resident is forced to answer questions before advancing through the slides, as this has been shown to increase retention of material.18 In addition, the module can be used asynchronously, so that residents may view the module immediately before their own rotations begin. To our knowledge, this type of module has not been previously published.

Methods

Our learning module (Appendix A) was built in Microsoft PowerPoint by two of the coauthors with the input of two neuroradiologists, who helped provide the search pattern and identify the anatomy and pathology to be included. Material was targeted for radiology residents who would be interpreting emergent noncontrast head CTs. Images were taken from personal collections and were deidentified.

The module was instituted as a mandatory task for first-year radiology residents at our institution. There was no prerequisite knowledge. Our first-year radiology residents have 1-month rotations on neuroradiology, and we reserved the first morning (8:00 a.m.-12:00 p.m.) of the first day of the neuroradiology rotation for the new resident to complete the module without any other clinical duties (although most residents reported that the module took only 2 to 2.5 hours). We also made the module available to residents in other departments. The neurology and child neurology residency programs at our institution used this module for their incoming neurology and child neurology (PGY-2) residents as part of a 1-month, nonclinical, boot-camp rotation immediately preceding their first year of dedicated neurology training. We refer to this group as neurology residents from here forward.

As part of our assessment of the module and for the purpose of resident self-assessment, residents in this study each took a 20-question multiple-choice test before the module and then a second, different 20-question multiple-choice test after the module (Appendices B and C are the first test version and answer key, respectively, and Appendices D and E are the second test version and answer key, respectively). Each resident was randomized to take either version 1 or version 2 as the pretest and then complete the other version as the posttest. We wrote the test questions, taken directly from material in the module, and reviewed them with two neuroradiologists. We also recorded raw scores for each resident for research purposes. These scores were not used for milestone assessment, evaluation, or promotion.
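For programs that wish to reproduce the crossover test assignment, the brief Python sketch below illustrates one way to randomize each resident to pretest version 1 or 2, with the posttest set to the opposite version. The roster of anonymized identification numbers is hypothetical, and the script is illustrative only; it is not part of the published materials.

    import random

    def assign_test_versions(resident_ids, seed=None):
        """Randomly assign each resident a pretest version (1 or 2);
        the posttest is always the opposite version (crossover design)."""
        rng = random.Random(seed)
        assignments = {}
        for resident_id in resident_ids:
            pretest = rng.choice([1, 2])
            posttest = 2 if pretest == 1 else 1
            assignments[resident_id] = {"pretest": pretest, "posttest": posttest}
        return assignments

    # Example with hypothetical anonymized resident IDs.
    roster = ["R01", "R02", "R03", "R04"]
    for rid, versions in assign_test_versions(roster, seed=42).items():
        print(rid, versions)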

We assigned the residents anonymized identification numbers with which to answer the pre- and postmodule tests. A facilitator assigned identification numbers, sent the file location of the module to the participating residents, and sent reminder emails to the residents and faculty that the resident was excused from clinical duties on the first morning of his or her rotation.

A voluntary survey (Appendix F) was given to radiology and neurology residents who completed this module; the survey was taken after the posttest. The survey contained several questions about the module and asked respondents to rate their experience on a 5-point scale (1 = strongly disagree, 5 = strongly agree). In addition, neurology residents were asked by their program to rate each of the educational activities they attended during the neurology boot-camp rotation on a 6-point scale (1 = very poor, 6 = excellent). Using the same 6-point scale, neurology residents also took a self-assessment survey for their program about knowledge and skills in different domains, including image interpretation, before and after the boot-camp rotation.

After 5 years of using the module and pre- and posttests in radiology and 4 years in neurology, we collated the pre- and posttest scores and the survey data for both groups. Statistical analysis was performed using t tests, and power calculations were performed.
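As an illustration of the analysis described above, the Python sketch below runs a paired t test on pre/post scores, an independent-samples t test comparing two resident groups, and a power calculation. The score lists are hypothetical placeholders, and the scipy/statsmodels calls are one possible way to carry out these tests; they are not the authors' original analysis code.

    # Minimal sketch of the score analysis; requires scipy and statsmodels.
    from scipy import stats
    from statsmodels.stats.power import TTestIndPower

    # Hypothetical paired pre/post scores (out of 20) for a few residents.
    pre_scores = [14, 13, 15, 12, 16, 14]
    post_scores = [18, 17, 19, 16, 20, 18]

    # Paired t test for the pre-to-post change within residents.
    t_paired, p_paired = stats.ttest_rel(post_scores, pre_scores)
    print(f"Paired t test: t = {t_paired:.2f}, p = {p_paired:.4f}")

    # Independent-samples t test comparing two groups
    # (e.g., radiology vs. neurology posttest scores).
    radiology_post = [18, 19, 20, 18, 19]
    neurology_post = [17, 16, 18, 17, 17]
    t_ind, p_ind = stats.ttest_ind(radiology_post, neurology_post)
    print(f"Independent t test: t = {t_ind:.2f}, p = {p_ind:.4f}")

    # Power calculation: smallest standardized effect size (Cohen's d) detectable
    # with group sizes of 47 and 31 at alpha = .05 and power = .95.
    effect = TTestIndPower().solve_power(nobs1=47, ratio=31 / 47, alpha=0.05, power=0.95)
    print(f"Detectable effect size: {effect:.2f}")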

Results

Forty-seven radiology residents (over 5 years) and 31 neurology residents (over 4 years) successfully completed the pre- and posttests.

For the entire group, including both radiology and neurology residents, every resident’s score on the posttest either stayed the same or increased compared to the pretest, regardless of the order in which residents took version 1 or version 2. The mean increase in score on the posttest was 4.0, which was statistically significant (p < .0001). The cohort size was sufficient to detect a 4.00-unit increase in score (alpha = 0.0001, beta = 0.0001). The median score increase was 4.0, and individual increases ranged from 0.0 to 9.0.

Radiology residents had slightly better scores on the pretest and the posttest than neurology residents: Radiology residents had a mean score of 14.5 on the pretest and 18.6 on the posttest, while neurology residents had a mean score of 13.2 on the pretest and 17.0 on the posttest. The pre- and posttest differences between radiology and neurology residents were statistically significant (p < .04 for the pretest, p < .0004 for the posttest). For the pretest, the sample size was sufficient to detect a 1.28-unit difference in pretest scores between radiology and neurology residents (alpha = 0.2, beta = 0.2); for the posttest, the sample size was sufficient to detect a 1.53-unit difference in posttest scores between radiology and neurology residents (alpha = 0.05, beta = 0.05). The mean increase in scores for radiology and neurology residents, respectively, was 4.1 and 3.8, but the difference was not statistically significant.

On the survey given to radiology and neurology residents, the mean ratings for the four questions, respectively, were 4.8, 4.8, 4.9, and 4.9 out of 5 (5 = strongly agree), and the median was 5 for all questions.

On the survey of neurology residents regarding their boot-camp activities, the mean rating for the head CT module was 5.2, while the mean rating of the other 51 educational sessions was 4.0. Residents were also given the opportunity to write free-text comments. One wrote, “The CT workshop was one of the most high-yield topics. I was very uncomfortable with radiology prior to this session and felt it helped a lot.” Another wrote, “CT module was great and very helpful in getting [the] basics down.” No suggestions for improvement were noted.

On the neurology residents’ self-assessment survey, the mean rating of skills in neuroimaging was 2.5 before the rotation and 3.4 after the rotation.

During the evaluation period that most closely followed the neurology boot camp, all 23 adult neurology and child neurology residents who participated in this activity between 2014 and 2016 achieved a level 2 or higher in the neuroimaging domain. This means that their respective clinical competency committees felt that the residents were able to identify basic neuroanatomy and recognize emergent imaging findings on brain CT. These data were not available for radiology residents.

Discussion

Our project aimed to create an interactive, computer-based module to teach the learner the basic principles of noncontrast head CT interpretation. Our results show that the learning module increased the residents’ knowledge in almost all cases, with no resident scoring lower on the posttest than on the pretest. The module thus achieved its purpose of providing a useful introductory resource and has become a requirement in our program for radiology residents prior to starting their neuroradiology rotation. The module is generalizable to all radiology residency programs seeking an interactive learning resource for new radiology residents and furthermore can provide similar foundational imaging knowledge to residents in related fields (neurology, neurosurgery, emergency medicine, etc.).

By analyzing test results from the module, we learned that radiology residents had slightly, but significantly, higher scores than neurology residents on both the pre- and posttests. This might be partly due to radiology residents having increased prior exposure to and interest in image interpretation, either during medical school or during their internships. Also, because the neurology residents completed their module and tests at the very beginning of their PGY-2 year but the radiology residents completed them as they started their neuroradiology rotations throughout the year, many of the radiology residents had more imaging experience by the time they completed the module and tests.

Similarly, we have learned the value of having an asynchronous learning resource, such that radiology residents are able to use the module immediately before beginning their neuroradiology rotation, whenever that happens to occur during the year. This prevents residents from seeing the module during their first month and then forgetting the information should they happen to rotate in neuroradiology several (or many) months later.

We found that giving the residents dedicated time off their clinical service to complete the module encouraged participation and was a way to ensure a certain foundation of knowledge before beginning clinical work. Having a facilitator (in our case, an upperclassman resident) helped to ensure that the module was completed and that faculty were aware the resident would be absent from morning clinical duties.

This project also showed how useful this resource can be in related medical specialties, where residents usually have limited or no education in imaging. Surveys of neurology residents who participated in this activity suggest that they viewed it favorably compared to other educational sessions and that their self-assessed skills in neuroimaging improved following these sessions. All neurology residents who participated in this program achieved level-appropriate milestones in neuroimaging several months after completing the activity.

One challenge in creating this module was that it was difficult to find faculty consensus on an appropriate search pattern for reviewing head CT images. We expected the most senior radiologists to have a detailed search pattern but instead found that they tended to have a relatively simple search pattern or, in some cases, no search pattern at all. We speculate that with more experience, radiologists abandon their search pattern and use more of a gestalt approach.

There are several limitations of the module. The module cannot possibly teach the entire scope of noncontrast head CT interpretation; a future direction may be to broaden the content to include the interpretation of postoperative head CT. The search-pattern approach is taken from certain individuals at our institution, making it somewhat subjective, and other useful search patterns are not discussed. We currently have no threshold for passing or failing the pre- or posttests, so the scores are not used for resident feedback. In addition, we do not monitor the residents while they view the module or take the tests, so we have no true way of knowing how much of the module they actually complete. A future direction might be mapping the test questions to particular slides in the module so that residents could immediately review material they missed on the test.

Our interactive, computer-based learning module for noncontrast head CT has proved useful in teaching new radiology and neurology residents about the basics of image interpretation. The module allows for asynchronous education and has been a useful tool for providing a search pattern and teaching basic imaging anatomy and pathology.


Author Information

  • Kara Gaetke-Udager, MD: Assistant Professor, Department of Radiology, University of Michigan Medical School; Residency Program Director, Department of Radiology, University of Michigan Medical School
  • Zachary N London, MD: Associate Professor, Department of Neurology, University of Michigan Medical School; Residency Program Director, Department of Neurology, University of Michigan Medical School
  • Sean Woolen, MD: Radiology Resident, Department of Radiology, University of Michigan Medical School
  • Hemant Parmar, MD: Professor, Department of Radiology, University of Michigan Medical School
  • Janet E. Bailey, MD: Professor, Department of Radiology, University of Michigan Medical School; Associate Chair of Education, Department of Radiology, University of Michigan Medical School
  • Daniel C. Barr, MD: Radiologist, Veterans Administration Medical Center, Salisbury, NC; Assistant Chief of Imaging, Veterans Administration Medical Center, Salisbury, NC

Disclosures
Dr. Gaetke-Udager reports personal fees from Novodynamics, Inc., outside the submitted work.

Funding/Support
None to report.

Ethical Approval
This project received “Not Regulated” status from the University of Michigan Institutional Review Board.


References

  1. Prezzia C, Vorona G, Greenspan R. Fourth-year medical student opinions and basic knowledge regarding the field of radiology. Acad Radiol. 2013;20(3):272-283. https://doi.org/10.1016/j.acra.2012.10.004
  2. Nasca TJ, Philibert I, Brigham T, Flynn TC. The Next GME Accreditation System—rationale and benefits. N Engl J Med. 2012;366(11):1051-1056. https://doi.org/10.1056/NEJMsr1200117
  3. Bruni SG, Bartlett E, Yu E. Factors involved in discrepant preliminary radiology resident interpretations of neuroradiological imaging studies: a retrospective analysis. AJR Am J Roentgenol. 2012;198(6):1367-1374. https://doi.org/10.2214/AJR.11.7525
  4. Erly WK, Berger WG, Krupinski E, Seeger JF, Guisto JA. Radiology resident evaluation of head CT scan orders in the emergency department. AJNR Am J Neuroradiol. 2002;23(1):103-107.
  5. Walls J, Hunter N, Brasher PMA, Ho SGF. The DePICTORS Study: discrepancies in preliminary interpretation of CT scans between on-call residents and staff. Emerg Radiol. 2009;16(4):303-308. https://doi.org/10.1007/s10140-009-0795-9
  6. Wysoki MG, Nassar CJ, Koenigsberg RA, Novelline RA, Faro SH, Faerber EN. Head trauma: CT scan interpretation by radiology residents versus staff radiologists. Radiology. 1998;208(1):125-128. https://doi.org/10.1148/radiology.208.1.9646802
  7. Alfaro D, Levitt MA, English DK, Williams V, Eisenberg R. Accuracy of interpretation of cranial computed tomography scans in an emergency medicine residency program. Ann Emerg Med. 1995;25(2):169-174. https://doi.org/10.1016/S0196-0644(95)70319-5
  8. Boyle A, Staniciu D, Lewis S, et al. Can middle grade and consultant emergency physicians accurately interpret computed tomography scans performed for head trauma? Cross-sectional study. Emerg Med J. 2009;26(8):583-585. https://doi.org/10.1136/emj.2008.067074
  9. Mukerji N, Cahill J, Paluzzi A, Holliman D, Dambatta S, Kane PJ. Emergency head CT scans: can neurosurgical registrars be relied upon to interpret them? Br J Neurosurg. 2009;23(2):158-161. https://doi.org/10.1080/02688690902730723
  10. Jamal K, Mandel L, Jamal L, Gilani S. “Out of hours” adult CT head interpretation by senior emergency department staff following an intensive teaching session: a prospective blinded pilot study of 405 patients. Emerg Med J. 2014;31(6):467-470. https://doi.org/10.1136/emermed-2012-202005
  11. Levitt MA, Dawkins R, Williams V, Bullock S. Abbreviated educational session improves cranial computed tomography scan interpretations by emergency physicians. Ann Emerg Med. 1997;30(5):616-621. https://doi.org/10.1016/S0196-0644(97)70079-9
  12. Ganguli S, Camacho M, Yam C-S, Pedrosa I. Preparing first-year radiology residents and assessing their readiness for on-call responsibilities: results over 5 years. AJR Am J Roentgenol. 2009;192(2):539-544. https://doi.org/10.2214/AJR.08.1631
  13. Gaetke-Udager K, Maturen KE, Barr DC, Watcharotone K, Bailey JE. Benefits of a resident-run orientation for new radiology trainees. J Educ Eval Health Prof. 2015;12:24. https://doi.org/10.3352/jeehp.2015.12.24
  14. Yao CM, Fernandes VT, Palmer JN, Lee JM. Educational value of a preoperative CT sinus checklist: a resident’s perspective. J Surg Educ. 2013;70(5):585-587. https://doi.org/10.1016/j.jsurg.2013.02.009
  15. Phillips A, Thurber B, Teven C, Wortman J, Soneru A, Straus C. Self-guided study module for head and neck radiological anatomy. MedEdPORTAL. 2014;10:9891. https://doi.org/10.15766/mep_2374-8265.9891
  16. Lazarus M, Stanley A, Smith L, Brian P. Thoracic anatomy tutorial using an imaging platform. MedEdPORTAL. 2014;10:9828. https://doi.org/10.15766/mep_2374-8265.9828
  17. Klinkhachorn P, Dey R, Klinkhachorn P, et al. Radiological images of the abdomen and pelvis. MedEdPORTAL. 2009;5:1708. https://doi.org/10.15766/mep_2374-8265.1708
  18. Ross SE. Programmed instruction and medical education. JAMA. 1962;182(9):938-939. https://doi.org/10.1001/jama.1962.03050480044011


Citation

Gaetke-Udager K, London ZN, Woolen S, Parmar H, Bailey JE, Barr DC. An introductory, computer-based learning module for interpreting noncontrast head computed tomography. MedEdPORTAL. 2018;14:10721. https://doi.org/10.15766/mep_2374-8265.10721

Received: December 18, 2017

Accepted: May 8, 2018