Original Publication
Open Access

Development and Evaluation of a Web-Based Dermatology Teaching Tool for Preclinical Medical Students

Published: August 12, 2017 | 10.15766/mep_2374-8265.10619

Appendices

  • SLE Case
  • BCC Case
  • Psoriasis Case
  • Erythema Multiforme Case
  • Melanoma Case
  • SLE Case Animation folder
  • BCC Case Animation folder
  • Psoriasis Case Animation folder
  • Erythema Multiforme Case Animation folder
  • Melanoma Case Animation folder
  • Module Posttest.docx
  • Postmodule Survey.docx
  • Case Animation Viewing Instructions.txt

All appendices are peer reviewed as integral parts of the Original Publication.

To view all publication components, extract (i.e., unzip) them from the downloaded .zip file.


Abstract

Introduction: There is growing interest in, and emphasis on, electronic teaching tools in medicine. Despite relevant testing on the United States Medical Licensing Examination (USMLE), American medical schools offer limited training in skin disorders. Teaching visual topics like dermatology in classroom formats is challenging. We hypothesized that an electronic module would enhance students’ dermatology competency. Methods: A self-directed, case-based module was created. To test its efficacy, 40 medical students were randomized to have module access (interventional group) or none (conventional group). Learning outcomes were compared using a multiple-choice exam that included questions both relevant and irrelevant to the module. Outcomes included proportions of correctly answered module questions (module scores) and nonmodule questions (nonmodule scores). Difference scores were calculated as (module score) − (nonmodule score). Positive values indicated that knowledge of module questions surpassed that of nonmodule questions. If there were a training effect, the interventional group’s difference score should exceed that of the conventional group. Results: The interventional group scored significantly higher than the conventional group on module questions, 75% (interquartile range [IQR], 69-88) versus 50% (IQR, 38-63), p < .001, and on nonmodule questions, 85% (IQR, 69-92) versus 69% (IQR, 54-77), p = .02. The Hodges-Lehmann median difference estimate of the training effect was 13.0 (95% confidence interval, 0.5-25.5). Discussion: This e-module is effective at enhancing students’ competency in dermatology while emphasizing detailed pathophysiology that prepares them for USMLE Step 1. A module-based curriculum may enhance learning as a supplement to traditional teaching modalities.


Educational Objectives

By the end of this module, learners will be able to:

  1. Recognize the dermatologic and immunologic features of psoriasis, melanoma, basal cell carcinoma, erythema multiforme, and systemic lupus erythematosus (malar rash).
  2. Identify the most appropriate diagnostic approaches used in each disease.
  3. Describe the appropriate first-line management for each included condition.

Introduction

Medical education increasingly employs electronic learning, or e-learning. Each subsequent generation of medical students is more familiar with technology than the last. In a study of 48 medical schools across several countries, one group found that every school incorporated e-learning into its curriculum.1 Most of these e-learning programs were voluntary components of the schools’ curricula and most frequently covered internal medicine, emergency medicine, and anatomy.1 Beyond these formally implemented programs, e-learning also occurs in a less organized fashion in essentially every medical school program.

Educators strive to optimize learning by making it as efficient as possible for their students. Part of this strategy necessitates identifying students’ learning preferences. Some of the benefits of e-learning include its easy accessibility, self-assessment tools, and flexibility.2 Although the preclinical years of medical school make it possible to gather medical students in one place for didactic teaching, attending lectures may not be a preferred method of learning for all students. Creating easily accessible, online tools caters to students’ preferences, allowing them to pick and choose what works best for their learning style. E-learning platforms, in addition, allow students to tailor their learning to their own recognized strengths and weaknesses.3 In an online setting, learners can glide through material that they have already mastered and spend more time with material that they have yet to effectively learn.3 In one study, 17 of 20 OB/GYN resident physicians preferred a website teaching tool over a lecture on the topic of Pap smear management.3 In another study, first-year medical students using a virtual reality simulator of the paranasal sinuses mastered the anatomy better than did their peers, who used only textbooks to learn the anatomy.4 Another report revealed that a majority of medical students believed that introducing tablets into their curriculum would positively contribute to their education.5 Given its ease of use and access, e-learning is sometimes preferable to other methods of learning.

Up to 7% of office visits concern dermatologic issues, and primary care providers are often the first to be exposed to a patient’s skin findings.6,7 However, most medical schools offer less than 18 hours of dermatology teaching.6 One study found that more than 80% of responding students felt neutral or uncomfortable diagnosing and treating dermatologic disorders; the students scored an average of only 49.9% on a multiple-choice quiz covering 15 common dermatologic diagnoses.7 Interestingly, the authors proposed that most of the errors their students made could have been remedied with only 1-2 hours of formal dermatology instruction.7 Gaps in dermatology teaching might be easy to fill if students are provided with an effective curricular intervention. We therefore developed an electronic tool to improve the effectiveness of dermatology instruction. Our central hypothesis was that students’ competency in dermatology would be significantly greater when the proposed e-module was used for studying alongside conventional approaches (such as books and lecture notes) than when conventional approaches were used alone.

There are other e-tools that focus on dermatology-related content. For example, one video module covers skin cancer.8 While that video module is helpful for learning the pathophysiology of skin cancers, its main goal differs from ours: it was targeted at a broader group of health care professionals, including nursing students and allied health professionals in addition to medical students. While the tool was successfully incorporated into teaching for first-year medical and dental students, its efficacy was not specifically tested, and its questions were not formatted like those of the United States Medical Licensing Examination (USMLE). The tool also did not include case presentations, as its teaching goals were different from ours. Other dermatology-related modules are available, but they address different needs, such as teaching how to take skin biopsies or providing standardized patient exercises that cover the dermatologic manifestations of tropical diseases.9,10

The purpose of this project was to build an electronic curriculum to specifically supplement visual learning in skin conditions for first- and second-year (preclinical) students. Our module stands out from the aforementioned ones because it focuses on educating students on the pathophysiology of dermatologic conditions in a way that prepares them for the higher-order thinking required on the USMLE Step 1.

Methods

In creating this module, one of our main goals was to help students prepare for dermatology topics frequently tested on the USMLE Step 1 and covered in the preclinical curriculum. We selected five conditions for inclusion in the educational module based on topics likely to be tested and commonly regarded as high yield. These were psoriasis, melanoma, basal cell carcinoma (BCC), erythema multiforme, and systemic lupus erythematosus (SLE, malar rash).

Each condition was built into a case. Cases consisted of a patient scenario, a gross image of the skin manifestation, several multiple-choice questions, and a cartoon animation illustrating the disease’s pathophysiology. A multiple-choice test and a feedback survey were also administered to students following completion of the module. The module is laid out in the appendices as follows:

  • SLE Case (Appendix A): includes a clinical scenario about a patient presenting with malar rash and several characteristic features of SLE. Several multiple-choice questions follow, which address the diagnosis, pathophysiology, and treatment of SLE and malar rash. Viewers are prompted to view a cartoon animation of the pathophysiology of the malar rash (see Appendix F).
  • BCC Case (Appendix B): includes a clinical scenario about a patient presenting with a BCC lesion on his nose. The lesion is described with several characteristic findings in BCC, and a gross image is included. Several multiple-choice questions follow, which address the diagnosis, pathophysiology, and treatment of BCC. Viewers are prompted to view a cartoon animation of the pathophysiology of BCC, highlighting the characteristic palisading nuclei (see Appendix G).
  • Psoriasis Case (Appendix C): includes a clinical scenario about a patient presenting with psoriatic rashes on the lower extremities. There is a gross image of the lesion. Several multiple-choice questions follow, which address the diagnosis, pathophysiology, and treatment of psoriasis. Viewers are prompted to view a cartoon animation of the pathophysiology of the psoriatic skin changes (see Appendix H).
  • Erythema Multiforme Case (Appendix D): includes a clinical scenario about a patient presenting with erythema multiforme in the setting of likely herpes simplex virus coinfection. The lesion is described with its characteristic clinical features, and a gross image is included. Several multiple-choice questions follow, which address the diagnosis, pathophysiology, and treatment of erythema multiforme. Viewers are prompted to view a cartoon animation of the pathophysiology of erythema multiforme (see Appendix I).
  • Melanoma Case (Appendix E): includes a clinical scenario about a patient presenting with a skin lesion that has progressed into melanoma. The progression of the lesion is described, and a gross image is included. Several multiple-choice questions follow, which address the diagnosis, pathophysiology, and treatment of melanoma. Viewers are prompted to view a cartoon animation of the pathophysiology of this malignant change (see Appendix J).
  • SLE Case Animation (Appendix F): provides a standardized cartoon image of the epidermal and dermal layers. The animation leads the viewer through the most significant pathophysiologic steps involved in the malar rash, including ultraviolet B radiation, the resulting keratinocyte apoptosis, necrosis, and lysis, and autoimmune contributions.
  • BCC Case Animation (Appendix G): provides a standardized cartoon image of the epidermal and dermal layers. The animation leads the viewer through the most significant pathophysiologic steps involved in the development of BCC lesions, particularly involving the replication of basaloid cells in a palisading pattern, and subsequent tumor growth.
  • Psoriasis Case Animation (Appendix H): provides a standardized cartoon image of the epidermal and dermal layers. The animation leads the viewer through the most significant proinflammatory cytokines and pathways involved in psoriatic skin changes, culminating in epidermal hyperplasia.
  • Erythema Multiforme Case Animation (Appendix I): provides a standardized cartoon image of the epidermal and dermal layers. The animation leads the viewer through the pathophysiologic changes that occur in erythema multiforme in the context of herpes simplex virus infection, particularly the formation of target lesions on the skin.
  • Melanoma Case Animation (Appendix J): provides a standardized cartoon image of the epidermal and dermal layers. The animation leads the viewer through the disruption of the intracellular signaling pathway that results in the development of melanoma.
  • Module Posttest (Appendix K): a 29-question multiple-choice test covering the diagnosis and pathophysiology of common dermatologic and immunologic disorders. This test was administered to students after their completion of the module.
  • Postmodule Survey (Appendix L): survey assessing students’ perceptions of dermatology teaching in medical school, their preferences for study tools, and their experience with the module (for those in the interventional group).
  • Case Animation Viewing Instructions (Appendix M): text file containing instructions for viewing the animations (Appendices F-J) on a web browser.

Multiple-choice questions addressed the pathophysiology, diagnosis, and treatment of each condition. We obtained the majority of gross images and histologic slides from publicly accessible sites on the internet or from the Virtual Microscopy Lab at our institution. We created animations detailing the pathophysiology of each disease in Adobe Edge Animate, using images from a cell glossary to ensure standardization across cases. The complete cases were uploaded to our institution’s online educational platform, eMED, where students also have access to their required course material. Students and board-certified physicians in the Departments of Allergy & Immunology, Dermatology, Rheumatology, Infectious Disease, and Oncology wrote and edited this work. The study was approved by our Institutional Review Board.

We tested the module’s efficacy on volunteer second-year medical students who were recruited via email in March 2014. The study took place approximately 3 months before these second-year students sat for the USMLE Step 1. Forty students volunteered and were enrolled in the study. The students had received no formal dermatology instruction at their medical school prior to this study. We recognize, however, that students may have had exposure to dermatology materials, in addition to other medical information, during their private study time; these exposures were not specifically explored. Nevertheless, in the posttest survey, 89% of participating students reported only 0-2 hours of total dermatology studying when asked, “Approximately how many hours did you spend on studying dermatology topics?” All participants’ only experience with e-learning in medical school was an experimental, self-directed online curriculum covering gastrointestinal physiology 1 month prior to this study. Students involved in the design and testing of the module were excluded from participation. There were no other exclusion criteria.

We randomly assigned the students to the interventional group or the conventional group (Figure). The interventional group had electronic access to the module for 24 hours a day during a 10-day study period. The conventional group was given a link to the exam. Students in both groups were instructed to complete the 40-minute, timed exam by the end of the study period. We permitted additional studying for the exam (beyond use of the module for those in the interventional group) in both groups.

Figure. Experimental design: Forty second-year medical students were recruited and randomized into interventional and conventional groups. Only those in the interventional group had access to the module. All involved students took the same exam and postexam survey.

A 29-question multiple-choice exam tested the students’ knowledge of pathophysiology, diagnosis, and treatment of common dermatologic and immunologic disorders. It consisted of 16 questions related to the module material (module questions: 4-5, 7-8, 10-12, 15, 17-18, 21-25, 29) and 13 unrelated to the module cases (nonmodule questions: 1-3, 6, 9, 13-14, 16, 19-20, 26-28). The questions were written and edited by authors M. Scaperotti, E. Jerschow, and J. D. Nosanchuk to be formatted similarly to those found on USMLE Step 1 and were based on the dermatology material included in the official USMLE Content Outline. The questions were reviewed by several faculty members with relevant specialties (dermatology, rheumatology, oncology, immunology, allergy, and infectious diseases), who confirmed their relevance. The questions were then tested for use in a student population by administering them to the following authors: N. Gil, I. Downs, A. Jeyakumar, A. Liu, and J. Chan. The questions continue to be validated as increasing numbers of students take the module. We have not placed them on tests within larger courses, as doing so would expose them and reduce their usefulness. After the exam, students completed a survey that assessed their perceptions of dermatology teaching in medical school, their preferences for study tools, and their experience with the module (for those in the interventional group). Once the study was completed, we made the module available to all volunteer students for the remainder of the academic year.

The outcomes analyzed included the module and nonmodule scores in each group, as well as the difference scores comparing performance on the two question types. We recorded exam responses by group assignment, with no personal identifiers. Correct answers were tabulated as 1 and incorrect answers as 0. We calculated module and nonmodule scores for each individual as the proportion of correct answers out of the total number of respective questions. To evaluate the module’s potential training effect, we also calculated a difference score using the following formula: (module score) − (nonmodule score). The difference score measures how knowledge of dermatology after the module compares with background dermatology knowledge; a positive value indicates that knowledge of the module questions surpasses that of the nonmodule questions. We hypothesized that if there were a training effect, the interventional group would do better on the module questions relative to the nonmodule questions than would the conventional group, and thus its difference score would be greater. Within each group, we compared module with nonmodule scores; between groups, we compared difference scores. Since the data distributions did not fully meet normality assumptions, we report values as medians and interquartile ranges (IQRs). Wilcoxon signed rank tests were used for the paired within-group analyses and Mann-Whitney U tests for the between-group analyses. To estimate an effect size, we calculated a Hodges-Lehmann median difference estimate and 95% confidence interval (95% CI) for the comparison between groups. A two-tailed alpha of .05 was used to denote statistical significance. We performed statistical analyses using SPSS Version 20.
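For readers who wish to reproduce this type of analysis, the following is a minimal sketch in Python, assuming hypothetical per-student arrays of 0/1 responses and the open-source SciPy library (the published analysis itself was performed in SPSS). The question indices mirror those listed above, and the Hodges-Lehmann estimate is computed as the median of all pairwise between-group differences of the difference scores.

# Minimal sketch of the scoring and nonparametric analyses described above.
# Assumes hypothetical per-student answer arrays (1 = correct, 0 = incorrect);
# the published analysis used SPSS, so this code is illustrative only.
import numpy as np
from scipy import stats

# 0-based indices of module and nonmodule questions on the 29-item exam
MODULE_IDX = np.array([4, 5, 7, 8, 10, 11, 12, 15, 17, 18, 21, 22, 23, 24, 25, 29]) - 1
NONMODULE_IDX = np.array([1, 2, 3, 6, 9, 13, 14, 16, 19, 20, 26, 27, 28]) - 1

def scores(answers):
    """answers: (n_students, 29) array of 0/1 responses -> per-student % scores."""
    module = answers[:, MODULE_IDX].mean(axis=1) * 100
    nonmodule = answers[:, NONMODULE_IDX].mean(axis=1) * 100
    return module, nonmodule, module - nonmodule  # third value is the difference score

def analyze(interventional, conventional):
    """Within-group Wilcoxon signed rank tests and a between-group Mann-Whitney U test."""
    i_mod, i_non, i_diff = scores(interventional)
    c_mod, c_non, c_diff = scores(conventional)

    within_interventional = stats.wilcoxon(i_mod, i_non)   # module vs. nonmodule, paired
    within_conventional = stats.wilcoxon(c_mod, c_non)     # module vs. nonmodule, paired
    between = stats.mannwhitneyu(i_diff, c_diff, alternative="two-sided")

    # Hodges-Lehmann estimate of the training effect: median of all pairwise
    # differences between the two groups' difference scores
    hl_estimate = np.median(i_diff[:, None] - c_diff[None, :])
    return within_interventional, within_conventional, between, hl_estimate

The confidence interval for the Hodges-Lehmann estimate is omitted from this sketch; in practice it can be obtained from the Mann-Whitney procedure in statistical packages such as SPSS or R.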

We also compared the academic abilities of the two groups using their scores from the USMLE Step 1, which they took later in the same academic year. Deidentified scores were provided by the school’s administration after study completion.

Part of our motivation in this study was to create an e-learning platform that not only would serve students specifically with regard to dermatology teaching but also would act as a model that other educators could use to create their own e-learning tools for various medical (or even nonmedical) topics. Thus, our modules are predicated on three principles of adult learning: (1) adult learning is self-directed, (2) adult learning is problem-centered, and (3) the adult learner is interested in immediate applications.11

  1. Information is presented in both visual and verbal formats, taking different learner preferences into account. Students can learn through verbal explanations, through answering questions, and through the presentation of highly salient visual information. Further self-direction is promoted by the ability of students to review and interact with the cases as many times as they feel necessary to learn the information.
  2. The model is problem-centered. Each case is based on a common problem seen in the physician’s office, and the students are asked to reason through the given material to learn and make decisions about diagnosis and management.
  3. Students using the module will note its immediate application to USMLE Step 1. The question format mirrors medical licensing and board questions that they will eventually encounter. The module also has applications for clinical practice that, although not immediate, are meaningful to the students.

Our description of the development and implementation of our module might allow other educators to either use it as it stands or create their own learning platforms based upon the above-stated principles. Ideally, our tool (and similar ones) would be readily accessible to students as a supplement to their more structured learning environment.

Results

Of 186 students emailed, 40 volunteered to participate, with 21 students randomized to the interventional group and 19 to the conventional group. All enrolled students (100%) completed the study, including the exam and survey.

The interventional group scored significantly higher on the module-related questions than did the conventional group: 75% (IQR, 69-88) versus 50% (IQR, 38-63), p < .001. The interventional group also scored significantly higher on the nonmodule questions than did the conventional group: 85% (IQR, 69-92) versus 69% (IQR, 54-77), p = .02 (Table).

Table. Module and Nonmodule Question Scores by Group

  Score                 Interventional Group   Conventional Group   p^a
  Module: % (IQR)       75 (69-88)             50 (38-63)           <.001
  Nonmodule: % (IQR)    85 (69-92)             69 (54-77)           .02
  p^b                   .5                     .002

Abbreviation: IQR, interquartile range.
^a p value for the between-group comparison of each score type.
^b p value for the within-group comparison of module versus nonmodule scores.

We next compared the performance on module versus nonmodule questions within each group. There was no significant difference in performance on module questions versus nonmodule questions in the interventional group: 75% (IQR, 69-88) versus 85% (IQR, 69-92), respectively, p = .5. The conventional group had a significantly greater score on the nonmodule questions compared to the module questions: 69% (IQR, 54-77) on nonmodule versus 50% (IQR, 38-63) on module, p = .002.

To evaluate the potential training effect of the module, we compared the difference between module and nonmodule scores in each group by subtracting the nonmodule score from the module score. The interventional group had a significantly higher median difference score than did the conventional group: −8.2% (IQR, −17.6 to 15.1) versus −17.3% (IQR, −31.7 to −2.4), respectively, p = .04. Although median difference scores in both groups were negative (because the module scores were lower than the nonmodule scores), the Hodges-Lehmann median difference estimate was 13.0 (95% CI, 0.5-25.5), indicating a significant training effect of the module in the interventional group as compared to the conventional group.

To ensure that the students’ scholarly abilities were not significantly different between the two groups, we compared the USMLE Step 1 scores of the students in each group. There was no significant difference between the two groups with regard to their median USMLE Step 1 scores: The interventional group’s score was 243 (IQR, 231-251), while the conventional group’s was 239 (IQR, 225-252), p = .5.

A majority of the 40 students (92.8%) reported that they were not prepared for the dermatology material tested on USMLE Step 1 based on their required medical school curriculum. Students in the interventional group reported spending, on average, up to 2 hours using the module. Many students commented that they found the module user friendly. Others offered constructive feedback. For instance, one student thought additional multiple-choice questions should be added to each case. Another found the module to be useful but thought a summary table with the module’s learning points would help to reinforce learning. Of those in the interventional group, 87.5% reported that the module enhanced their understanding of the mechanisms of skin conditions, while 62.5% felt more prepared for dermatology on their board exam after using the module. Approximately 92.8% of all participating students, regardless of their group, indicated that a web-based teaching tool would be useful in their preparations for USMLE Step 1. Furthermore, when asked which tools they used most frequently to learn dermatology topics, the students consistently ranked the internet and e-module higher than they did textbooks.

Discussion

In the past decade, digital tools have frequently supplemented conventional teaching methods.12 Blending e-learning with traditional tools creates a more cost-effective and efficient system than using traditional methods alone.13 Previous experience at our institution suggests that computer-based educational programs are useful, feasible, and well accepted by students.3,4 Similarly, the majority of volunteers in our study found that the e-module positively influenced their learning.

E-learning not only is acceptable to students but also enhances their learning. The American Academy of Dermatology created an online Medical Student Core Curriculum in response to insufficient medical school dermatology coverage.14 In 2013, the group piloted this proposed curriculum on fourth-year students and found that every participating student demonstrated a significant improvement from pre- to postexaminations.14 Another group found that its school’s average USMLE scores increased after implementing tablet use in the curriculum.5

Our data also suggest that the innovative electronic dermatology module increased learning of the pertinent material. The interventional group did significantly better on both the module and nonmodule questions than did the conventional group. There are several possible explanations for these results. The teaching tool may have allowed the interventional group members to answer both types of questions more effectively than they would have had they not been exposed to the module. In addition, students’ prior exposure to the material, sleep hygiene, and confidence, for instance, may influence their perception of a question’s difficulty.15 It is also possible that despite randomization, the interventional group had higher-performing students than did the conventional group. While the overall USMLE Step 1 scores were not significantly different between the two groups, it is still possible that the interventional group was more adept at dermatology questions in general than was the conventional group. Interestingly, the median score on the module questions in the conventional group was 50%, which is comparable to that found in a prior study of students’ performance on an exam of common dermatologic diagnoses.7 This finding suggests that despite different curricula across medical schools, the average medical student’s dermatology competency remains at only about 50%.

For both groups, the nonmodule scores were greater than the module scores, although this difference was not significant in the interventional group. This suggests that the nonmodule questions were easier, more familiar, or more susceptible to correct guessing than were the module questions. A comparison of the difference scores (module minus nonmodule) showed a significantly higher difference score in the interventional group than in the conventional group. In addition, the module effect-size estimate indicated a significant training effect of the module in the interventional group as compared to the conventional group. In the future, a larger study may help to clarify this finding.

This study has several limitations. It was conducted on a small group of students from one medical school, and the participation rate was less than 25%; therefore, our results may not be generalizable to other medical school populations. We did not assess the students’ readiness for online learning prior to administering the module. Despite the presumption that millennial students are receptive to e-learning platforms, a presurvey might have allowed us to design the module in a way that better served its users. In addition, the included students were volunteers. It is possible that the study population included higher-performing or more motivated students compared to those who did not volunteer. However, this should not have differentially biased the results, given that the intervention was randomly assigned. Furthermore, our comparison of the two randomly assigned groups showed similar aptitude on a standardized exam. Another limitation is that student learning was assessed with a single exam, which had been validated on only a small sample of students who were familiar with the dermatology material prior to the study. Ideally, we would use both a pre- and posttest to assess students’ knowledge before and after exposure to the e-module. However, we did not use a pre/post design. Instead, we used nonmodule questions to compare performance between the groups and mitigate this limitation. In addition, we did not compare the utility of the e-module to other potential teaching tools, such as the distribution of paper handouts, prior to the test. However, based on student preference, as assessed by the final survey, electronic teaching tools are sought by undergraduate medical students. Finally, our results suggest that the module and nonmodule questions may not have been of comparable difficulty; the nonmodule questions appear to have been easier. This would have been a major weakness if we had lacked a comparison group and had compared only module with nonmodule questions. Since we did have a control group, the comparison of difference scores allowed us to overcome this limitation.

This study also has several strengths. It introduces an electronic learning tool that is easily accessible to students via the internet, in their choice of study location. The different pieces of our module were integrated into the students’ online learning platform. The module could similarly be adapted to different websites at other medical schools. A unique feature of the module is its inclusion of animations that illustrate the pathophysiology of each disease. Although it is intended for students in the preclinical years, it also serves to expose students to rare diseases that they may not encounter during their clinical years of training. Because our module is based on important principles of adult learning and allows students to approach the material flexibly, it is also versatile across medical school courses. Course directors at our medical school (including those in pharmacology, infectious diseases, immunology, and rheumatology) have begun incorporating the cases into their lectures and course materials. Therefore, students will use this module to aid in their studies across multiple disciplines. This project is still in progress and continues to garner support from the medical students and faculty. Development of e-learning modules and animations covering dermatologic aspects of other pathophysiologic processes is also ongoing.

Given the emphasis on and growing use of e-learning, new curricular developments ought to incorporate innovative electronic tools in a rigorously presented yet flexible manner to facilitate an efficient and effective gain of medical knowledge. Our proposed dermatology e-module was a useful, visual learning tool for undergraduate medical students at one institution. To our knowledge, this is the first study aimed at the development and implementation of a visual tool to enhance dermatology competency in medical students who are preparing for USMLE Step 1. We believe that further study may reveal that this module can be used to supplement traditional medical teaching for future classes at our institution and other schools. If this proves to be true, not only will the approach enhance students’ knowledge of dermatology, but similarly crafted modules may also be useful for other medical disciplines.


Author Information

  • Moira Scaperotti, MD: Recent Graduate, Albert Einstein College of Medicine
  • Nelson Gil: Medical Student, Medical Scientist Training Program, Albert Einstein College of Medicine
  • Ian Downs: Senior Medical Student, Albert Einstein College of Medicine
  • Arthie Jeyakumar, MD: Recent Graduate, Albert Einstein College of Medicine
  • Andy Liu, MD: Recent Graduate, Albert Einstein College of Medicine
  • Jimmy Chan, MD: Recent Graduate, Albert Einstein College of Medicine
  • Joseph Bonner: Independent Communications Consultant in Higher Education
  • Mary S. Kelly, PhD: Associate Professor, Department of Psychiatry and Behavioral Sciences, Albert Einstein College of Medicine
  • Joshua D. Nosanchuk, MD: Professor, Departments of Internal Medicine and Microbiology & Immunology, Albert Einstein College of Medicine; Assistant Dean for Students, Albert Einstein College of Medicine
  • Hillel W. Cohen, DrPH: Professor, Department of Epidemiology & Population Health, Albert Einstein College of Medicine
  • Elina Jerschow, MD: Associate Professor, Department of Medicine (Allergy & Immunology), Albert Einstein College of Medicine

Acknowledgments
Moira Scaperotti and Nelson Gil contributed equally to this work.

The authors wish to thank Drs. Michael Fisher, Howard Maibach, Elena Peeva, and Howard Steinman for their feedback on the case presentations and cartoons.

Disclosures
Until 2016, Dr. Cohen was a co-executive editor of a nonprofit journal (American Journal of Hypertension) and received compensation for that editorial work, but the current publication is not at all related to the work of that journal.

Funding/Support
This study was funded by the Office of Medical Education’s Grants for Excellence in Medical Education at the Albert Einstein College of Medicine. This publication was also supported by Clinical and Translational Science Award grant number UL1 TR001073 from the National Center for Advancing Translational Sciences, a component of the National Institutes of Health.

Ethical Approval
This publication contains data obtained from human subjects and received ethical approval.

Disclaimer
The contents of this publication are solely the responsibility of the authors and do not necessarily represent the official views of the National Institutes of Health.


References

  1. Back DA, Behringer F, Harms T, Plener J, Sostmann K, Peters H. Survey of e-learning implementation and faculty support strategies in a cluster of mid-European medical schools. BMC Med Educ. 2015;15:145. https://doi.org/10.1186/s12909-015-0420-4
  2. Cook DA, Triola MM. What is the role of e-learning? Looking past the hype. Med Educ. 2014;48(9):930-937. https://doi.org/10.1111/medu.12484
  3. Banks E, Chudnoff S, Freda MC, Katz NT. An interactive computer program for teaching residents Pap smear classification, screening and management guidelines: a pilot study. J Reprod Med. 2007;52(11):995-1000.
  4. Solyar A, Cuellar H, Sadoughi B, Olson TR, Fried MP. Endoscopic Sinus Surgery Simulator as a teaching tool for anatomy education. Am J Surg. 2008;196(1):120-124. https://doi.org/10.1016/j.amjsurg.2007.06.026
  5. Robinson R. Spectrum of tablet computer use by medical students and residents at an academic medical center. PeerJ. 2015;3:e1133. https://doi.org/10.7717/peerj.1133
  6. Silva CS, Souza MB, Silva Filho RS, de Medeiros LM, Criado PR. E-learning program for medical students in dermatology. Clinics (Sao Paulo). 2011;66(4):619-622. https://doi.org/10.1590/S1807-59322011000400016
  7. Ulman CA, Binder SB, Borges NJ. Assessment of medical students’ proficiency in dermatology: are medical students adequately prepared to diagnose and treat common dermatologic conditions in the United States? J Educ Eval Health Prof. 2015;12:18. https://doi.org/10.3352/jeehp.2015.12.18
  8. Rana J, Mostaghimi A. Introduction to skin cancer: a video module. MedEdPORTAL Publications. 2016;12:10431. https://doi.org/10.15766/mep_2374-8265.10431
  9. Duran-Nelson A, Raymond J, Reihsen T. Dermatology procedural course for internal medicine residents-a didactic and practical simulation exercise. MedEdPORTAL Publications. 2012;8:9214. https://doi.org/10.15766/mep_2374-8265.9214
  10. Mankbadi M, Goyack L, Thiel B, Weinstein D, Simms-Cendan J, Hernandez C. Dermatologic simulation of neglected tropical diseases for medical professionals. MedEdPORTAL Publications. 2016;12:10525. https://doi.org/10.15766/mep_2374-8265.10525
  11. Merriam SB. Andragogy and self-directed learning: pillars of adult learning theory. New Dir Adult Contin Educ. 2001;(89):3-14. https://doi.org/10.1002/ace.3
  12. Ozuah PO. Undergraduate medical education: thoughts on future challenges. BMC Med Educ. 2002;2:8. https://doi.org/10.1186/1472-6920-2-8
  13. Ruiz JG, Mintzer MJ, Issenberg SB. Learning objects in medical education. Med Teach. 2006;28(7):599-605. https://doi.org/10.1080/01421590601039893
  14. Cipriano SD, Dybbro E, Boscardin CK, Shinkai K, Berger TG. Online learning in a dermatology clerkship: piloting the new American Academy of Dermatology Medical Student Core Curriculum. J Am Acad Dermatol. 2013;69(2):267-272. https://doi.org/10.1016/j.jaad.2013.04.025
  15. Durning SJ, Dong T, Artino AR, van der Vleuten C, Holmboe E, Schuwirth L. Dual processing theory and experts’ reasoning: exploring thinking on national multiple-choice questions. Perspect Med Educ. 2015;4(4):168-175. https://doi.org/10.1007/s40037-015-0196-6


Citation

Scaperotti M, Gil N, Downs I, et al. Development and evaluation of a web-based dermatology teaching tool for preclinical medical students. MedEdPORTAL. 2017;13:10619. https://doi.org/10.15766/mep_2374-8265.10619

Received: March 23, 2017

Accepted: July 1, 2017