Introduction: While evidence-based medicine (EBM) is an Accreditation Council for Graduate Medical Education core competency, EBM teaching in pediatric subspecialties is rarely reported. We therefore designed, implemented, and evaluated a focused EBM curriculum for trainees in neonatal-perinatal medicine. Methods: The curriculum consists of seven weekly 1-hour sessions. Specific EBM skills taught in the sessions include formulating a structured clinical question, conducting an efficient literature search, critically appraising published intervention and diagnostic studies, and incorporating evidence into clinical decision-making. The course was evaluated with a neonatology-adapted Fresno test (NAFT) and neonatology case vignettes, administered to learners before and after the curriculum. This publication includes the needs assessment survey, PowerPoint slides for the seven sessions, the NAFT, and the test's scoring rubric. Results: The NAFT was internally reliable, with a Cronbach's alpha of .74, and the intraclass correlation coefficient across the three raters' assessments of learners was excellent at .98. Mean test scores of 14 learners increased significantly after the curriculum (by 54 points, p < .001), indicating a gain in EBM-related knowledge and skills. Discussion: This focused EBM curriculum enhances trainees' knowledge and skills and fosters evidence-based practice. It can be easily adapted for learners in pediatrics and family medicine.
- Formulate a PICOT (patient/population, intervention, comparison, outcome, and type of study/type of question) clinical question.
- Perform an efficient literature search.
- Critically appraise literature.
- Incorporate evidence into clinical decision-making.
Evidence-based medicine (EBM) is defined as the integration of the best research evidence, clinical expertise, and patient values.1 Teaching EBM to medical students, residents, and faculty has been shown to be effective in improving EBM-based knowledge and skills.2-6 EBM skill in evaluating evidence has been recommended as a core competency by the Accreditation Council for Graduate Medical Education (ACGME) in both pediatric and nonpediatric specialties.7 In the United States, evaluating evidence is described as a necessary competency for medical students by the Liaison Committee on Medical Education,8 but most medical students fare poorly on EBM skills.9 Hence, revisiting EBM teaching after medical school, during residency or fellowship, may be particularly important. Compared to internal medicine and other disciplines, pediatric residents have fewer resources or opportunities to hone their skills in EBM.10 Very few studies have reported on teaching EBM to pediatric residents, and fewer still have focused on trainees in pediatric subspecialties.11 Sensing a real and urgent need for teaching and incorporating EBM in neonatal-perinatal medicine, we designed and implemented this project. We adapted the Fresno test for neonatal practice in our evaluation of the curriculum.
The overall goal is to make this EBM curriculum and the assessment instrument available for other educators in neonatal-perinatal medicine, other pediatric specialties, and family medicine.
Development of the EBM Curriculum
We developed this curriculum using the six-step approach to curriculum development described by Kern, Thomas, and Hughes.12 The six steps are problem identification and general needs assessment, targeted needs assessment, goals and objectives, educational strategies, implementation, and evaluation and feedback. We identified an unmet need for EBM teaching through interactions with trainees (general needs assessment) and then performed a targeted needs assessment using a questionnaire reported by Hadley, Wall, and Khan (Appendix A).13 They collected validity evidence on this questionnaire in a study of 317 junior doctors working in hospitals in the United Kingdom who were at a level of clinical training similar to that of the trainees in our fellowship program. The questionnaire assessed EBM-related knowledge and beliefs with six and 10 questions, respectively, each scored on a Likert scale of 1 to 6. We administered the questionnaire to 18 fellows and received 13 responses; the results are presented in Figures 1 and 2. Fellows rated their own EBM skills and knowledge poorly, with median scores below 3, indicating a lack of confidence. Fellows believed that EBM was essential and that systematic reviews were key, and they perceived a need for more EBM training.
Figure 1. Fellows' self-assessed EBM knowledge and skills, rated by learners using a 6-point Likert scale (1 = no confidence at all, 6 = feeling very confident). Scores are presented in box plots with median and interquartile error bars.
The neonatal fellowship program at Texas Children's Hospital and Baylor College of Medicine had six fellows in each year of its 3-year program. We developed an EBM curriculum relevant to fellow trainees in neonatal-perinatal medicine, designed to teach core EBM skills and grounded in adult learning theory.14 We incorporated the adult learning theory concepts of self-direction, self-motivation, and individualized learning strategies into our course. We discussed our study protocol with the institutional review board at Baylor College of Medicine and were granted an exemption from full review.
Figure 2. Fellows' beliefs about EBM, rated by learners using a 6-point Likert scale (1 = disagree with the statement, 6 = strongly agree). Scores are presented in box plots with median and interquartile error bars.
This EBM curriculum was implemented in seven 1-hour sessions. Home assignments were given to emphasize and practice the concepts taught in the sessions. We incorporated small-group interactive discussions and encouraged learners to be educators.
All the sessions were taught in small groups, with a group of four at each table. Participation at each session ranged from 16 to 18 learners. PowerPoint slides were used as a backdrop, and groups were allowed to discuss each concept and to interact with the educator.
Learners were introduced to the concepts of EBM and focused on formulating the PICOT (patient/population, intervention or exposure, comparison, outcomes, and type of study/type of question) clinical question (Appendix B). Homework assignments involved clinical scenarios and formulating clinical questions in the PICOT format.
This first session consisted of didactics and a group activity. The session started with introductions of learners and educators and an overview of the format of the curriculum (10 minutes). This was followed by an introduction to the concepts and purpose of EBM (10 minutes) and by delineation of sources of evidence, levels of evidence, and grading of evidence using the GRADE methodology (15 minutes). Further discussion covered the 5A steps of EBM (ask, acquire, appraise, apply, and assess) and formulation of the PICOT question (10 minutes). After the didactics, learners worked in groups with the scenarios on PowerPoint slides 17 and 18 of Appendix B. Participants discussed among themselves and formulated answers to the questions, and each group's representative updated the class on the group's discussions and answers (15 minutes).
Second and Third Sessions
The second and third sessions (Appendix C) introduced trainees to sources of medical literature and how to develop efficient search strategies with help from the librarian. Resources that were discussed included PubMed, MEDLINE, the Cumulative Index to Nursing and Allied Health Literature, and evidence-based summary sources such as the Cochrane Database of Systematic Reviews. Homework assignments included searching for literature for formulated clinical questions from the previous homework.
In the second session, the instructor reiterated the sources of evidence and introduced the class to locally available databases and other literature sources (10 minutes). Then, the local librarian demonstrated to the groups how to access and use the locally available databases (15 minutes). Next, the librarian demonstrated building search strategies in PubMed and how to adapt the search strategy to other databases (15 minutes). The class then worked in groups using the scenario on slide 10 of Appendix C and searched in two or three databases (15 minutes). The educator then introduced the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow diagram and showed how to fill in the boxes of the PRISMA diagram (5 minutes).
The third session started by revisiting the main highlights of the second session (5 minutes). Groups worked on the PRISMA flow diagram using the number of articles retrieved for the scenario on slide 10 in the previous session, and their work was discussed with the whole classroom (15 minutes). Groups discussed the problems encountered during the searches and interacted with the educators (10 minutes). Then, groups did literature searches for scenarios on slide 17 of Appendix C using two to three databases. The number of articles retrieved was entered in the PRISMA flow diagram and presented to the whole classroom (15 minutes). Groups repeated literature searches for scenarios on slide 19 of Appendix C using two to three databases, and the number of articles retrieved was entered in the PRISMA flow diagram and presented to the whole classroom (15 minutes).
The fourth session (Appendix D) focused on critical appraisal of intervention studies, including internal validity, external validity, significance, and relevance. Learners focused on calculation of relative risks, odds ratios, confidence intervals, absolute risk reduction, and number needed to treat. Homework assignments included critical appraisal of clinical studies and calculation of the appropriate indices and estimates.
The session started with an introduction to critical appraisal of literature (5 minutes), followed by a discussion on the concepts of internal validity and external validity (10 minutes) and a discussion covering estimations of relative risk, 95% confidence intervals, odds ratios, risk difference, and number needed to treat (10 minutes). Groups then worked on critically appraising a randomized controlled trial and an observational study (25 minutes). Each group presented its appraisal to the whole classroom, followed by discussions (10 minutes).
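The effect estimates taught in this session can all be derived from a single 2×2 table. As a minimal sketch (the counts below are hypothetical teaching numbers, not data from any study discussed in the course):

```python
import math

# Hypothetical 2x2 table for an intervention study (illustrative only):
#                outcome   no outcome
# treatment       a = 10      b = 90
# control         c = 20      d = 80
a, b, c, d = 10, 90, 20, 80

risk_t = a / (a + b)             # risk in the treatment arm
risk_c = c / (c + d)             # risk in the control arm

rr = risk_t / risk_c             # relative risk
arr = risk_c - risk_t            # absolute risk reduction
nnt = 1 / arr                    # number needed to treat
odds_ratio = (a * d) / (b * c)   # odds ratio

# 95% confidence interval for the relative risk (log method)
se_log_rr = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
ci_low = math.exp(math.log(rr) - 1.96 * se_log_rr)
ci_high = math.exp(math.log(rr) + 1.96 * se_log_rr)
```

With these counts, RR = 0.50, ARR = 0.10, and NNT = 10, but the 95% CI for the RR spans roughly 0.25 to 1.01, crossing 1.0, so the apparent benefit would not reach statistical significance; this is exactly the kind of interpretation the group exercises practiced.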
Fifth and Sixth Sessions
The fifth and sixth sessions (Appendix E) focused on critical appraisal of studies of diagnostic tests, including calculations of sensitivity, specificity, predictive values, and likelihood ratios. Homework assignments included critical appraisal of diagnostic studies and calculation of appropriate indices and estimates.
The fifth session started with an introduction to the characteristics of diagnostic tests, namely, sensitivity, specificity, predictive values, and likelihood ratios (20 minutes). Educators then demonstrated the use of pretest probability and how to use the Fagan nomogram to derive posttest probability (20 minutes). Groups next discussed the use of the Fagan nomogram, utilizing a clinical example from slide 15 of Appendix E (20 minutes).
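The diagnostic-test characteristics from this session, and the pretest-to-posttest conversion that the Fagan nomogram performs graphically, reduce to simple arithmetic. A sketch with hypothetical counts (not data from the course):

```python
# Hypothetical diagnostic-test 2x2 table (illustrative counts only):
#                 disease present   disease absent
# test positive        tp = 45          fp = 15
# test negative        fn = 5           tn = 135
tp, fp, fn, tn = 45, 15, 5, 135

sens = tp / (tp + fn)        # sensitivity
spec = tn / (tn + fp)        # specificity
ppv = tp / (tp + fp)         # positive predictive value
npv = tn / (tn + fn)         # negative predictive value
lr_pos = sens / (1 - spec)   # positive likelihood ratio
lr_neg = (1 - sens) / spec   # negative likelihood ratio

def posttest_probability(pretest, lr):
    """The arithmetic behind the Fagan nomogram: convert pretest
    probability to odds, multiply by the likelihood ratio, and
    convert back to a probability."""
    pre_odds = pretest / (1 - pretest)
    post_odds = pre_odds * lr
    return post_odds / (1 + post_odds)

# A positive test (LR+ = 9) moves a 20% pretest probability to about 69%.
posttest = posttest_probability(0.20, lr_pos)
```

The odds-based calculation makes explicit why a strong likelihood ratio shifts probability much more at intermediate pretest probabilities than at the extremes.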
The sixth session started with revisiting the highlights from the fifth session (5 minutes). Educators then demonstrated and discussed the use of receiver operating characteristic curves for outcomes of diagnostic tests that are continuous, such as beta natriuretic peptide for pulmonary hypertension in neonates with bronchopulmonary dysplasia (15 minutes). QUADAS-2, the revised Quality Assessment of Diagnostic Accuracy Studies tool, was introduced, and its components were discussed with the class (15 minutes). The class then worked in groups assessing an article using the QUADAS-2 instrument and presented results to the whole classroom, followed by discussions (25 minutes).
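For a continuous test result such as a peptide level, the area under the receiver operating characteristic curve summarizes discrimination across all thresholds. A minimal sketch using hypothetical values (not study data), based on the Mann-Whitney interpretation of the AUC:

```python
# Hypothetical continuous biomarker values (e.g., a peptide level) in
# neonates with and without the outcome of interest (illustrative only).
positives = [18.0, 22.5, 30.1, 15.2, 27.8]        # outcome present
negatives = [8.1, 12.4, 16.0, 14.7, 11.2, 7.5]    # outcome absent

def auc(pos, neg):
    """Area under the ROC curve via its Mann-Whitney interpretation:
    the probability that a randomly chosen affected case has a higher
    value than a randomly chosen unaffected one (ties count half)."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

area = auc(positives, negatives)   # 29/30, about 0.97
```

An AUC near 1.0 indicates near-perfect discrimination, while 0.5 is no better than chance; full ROC curves with per-threshold sensitivity and specificity are typically drawn with a statistics package.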
The seventh session (Appendix F) was a quiz session that reinforced EBM knowledge (60 minutes). The quiz was conducted under strict rules, with groups of four serving as teams: a team earned 10 points for correctly answering a question directed to it, or 5 points for answering a question another team had passed on. The teaching faculty had reviewed the quiz questions for clarity and lack of ambiguity before the session. Concepts taught in the previous sessions and related to the overall objectives of the curriculum were reinforced, and clarifications were provided. The quiz and discussion also reinforced group learning.
Evaluation of the EBM Curriculum
Because self-reporting of EBM skills and knowledge is unreliable, we evaluated EBM knowledge and skills with the validated Fresno test,15 which has been adapted to other medical disciplines.16-20 An inherent advantage of the Fresno test is that both novices and experts can be assessed, since neither a ceiling effect nor a floor effect has been reported.15 The Fresno test is a 12-question tool requiring respondents to formulate targeted research questions, design a search strategy, appraise study quality, and calculate basic statistical summary measures from clinical scenarios using short-answer and brief-essay formats. The tool was initially developed and tested with residents and faculty in family medicine, but validation evidence has since been collected among trainees in pediatrics, psychiatry, and internal medicine.16-20
Since the Fresno test assesses EBM knowledge and skills with clinical vignettes and questions, we adapted the test using clinical vignettes and scenarios relevant to neonates (Appendix G). We also adapted and simplified the Fresno test’s scoring rubric to reflect the neonatal clinical vignettes (Appendix H). An answer sheet is included in this publication (Appendix I). We tested the internal consistency and interrater variability of the neonatology-adapted Fresno test (NAFT) by using the Cronbach’s alpha and intraclass correlation coefficient metrics, respectively. Learners were administered the NAFT before and after the EBM curriculum, and responses were graded independently by the three investigators.
The internal consistency of the NAFT by the Cronbach's alpha metric was .739, exceeding the value of .600 considered the threshold for acceptable reliability.21 Interrater reliability of the three raters' total scores was assessed using the intraclass correlation coefficient,22,23 computed with R statistical software, version 3.3.0. We obtained an intraclass correlation coefficient of .983 (95% CI, .969-.992), which is considered excellent according to Fleiss, Levin, and Paik.23
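For readers who wish to reproduce this kind of reliability check, Cronbach's alpha can be computed from a learners-by-items score matrix. The sketch below is in Python with hypothetical scores (our analysis used R, and these are not the study data):

```python
from statistics import variance  # sample variance (n - 1 denominator)

def cronbach_alpha(scores):
    """Cronbach's alpha for a learners-by-items score matrix.

    scores: one inner list of item scores per learner.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    """
    items = list(zip(*scores))                      # per-item columns
    k = len(items)
    item_var = sum(variance(col) for col in items)  # sum of item variances
    total_var = variance([sum(row) for row in scores])
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical scores of four learners on a three-item test:
scores = [[2, 3, 3], [4, 4, 5], [1, 2, 1], [3, 3, 4]]
alpha = cronbach_alpha(scores)
```

Higher alpha indicates that items covary, i.e., learners who score well on one item tend to score well on the others; values above roughly .6-.7 are conventionally treated as acceptable, as in our analysis.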
To assess the learners’ improvement in knowledge and skills, we compared the NAFT scores of the 14 learners before and after the course by a paired t test. There were five trainees in the first year of fellowship, four in the second year, and five in the third year. Learners who did not complete both pre- and postcourse evaluations were excluded (n = 4). The mean of the three raters’ total test scores increased by an average of 54 points (score M ± SD: precourse 98.5 ± 27.6, postcourse 152.5 ± 26.4), which was a statistically significant improvement (paired t test: p < .001; Figure 3).
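The paired comparison used above reduces to a t statistic on the per-learner score differences. A sketch with hypothetical pre/post scores (illustrative only, not the study data):

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical pre/post scores for six learners (illustrative only).
pre = [95, 110, 80, 120, 100, 90]
post = [150, 160, 140, 170, 148, 142]

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)

# Paired t statistic: mean difference over its standard error,
# referred to a t distribution with n - 1 degrees of freedom.
t = mean(diffs) / (stdev(diffs) / sqrt(n))
```

With real data, one would obtain the p value from the t distribution with n - 1 degrees of freedom (e.g., scipy.stats.ttest_rel performs both steps).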
Figure 3. The EBM curriculum increased learners' scores on the neonatology-adapted Fresno test (NAFT). The graph represents box-and-whisker plots of precourse and postcourse NAFT scores with the median and the 25th and 75th centiles. The error bars indicate the minimum and maximum scores, and the + indicates the mean score.
We also assessed the effect of year of fellowship on the scores (Figure 4). The average scores of the 14 learners rated by the three independent assessors were compared before and after the implementation of the EBM curriculum by year of fellowship. Year of fellowship exhibited a significant inverse relationship (Spearman’s ρ = −.587, p = .027) with change in test scores, but year of fellowship was not significantly associated with precurriculum scores (Spearman’s ρ = .461, p = .097) or postcurriculum scores (Spearman’s ρ = −.335, p = .241).
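Spearman's rho, used above, is the correlation of ranks rather than raw values. The sketch below uses the simple no-ties formula with hypothetical numbers; our actual data had ties (several fellows per training year), which require average ranks, as implemented in, e.g., scipy.stats.spearmanr:

```python
def spearman_rho(x, y):
    """Spearman rank correlation using the no-ties formula
    rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1))."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    n = len(x)
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical: months of fellowship training vs. change in test score.
months = [3, 10, 15, 22, 30]
change = [70, 55, 48, 40, 25]
rho = spearman_rho(months, change)   # perfectly inverse ranking gives -1.0
```

A negative rho, as we observed, means that learners further along in training tended to show smaller score gains from the course.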
Figure 4. Scores on the neonatology-adapted Fresno test (NAFT) varied by year of fellowship. The graph represents box-and-whisker plots of precourse and postcourse NAFT scores with the median and the 25th and 75th centiles. The error bars indicate the minimum and maximum scores, and the + indicates the mean score.
We assessed question-by-question changes in scores to identify which questions demonstrated the greatest knowledge gaps and which showed the most improvement after the course. For studies about therapy, the lowest precourse scores occurred on questions about assessing validity, relevance, and statistical significance of effects (questions 5-7), as well as on a question about estimating relative risk and the significance of confidence intervals (question 8). For studies about diagnosis and prognosis, the lowest scores were on questions about ideal study design (questions 11-12). After the course, considerable improvement was noted in formulating PICOT questions, searching information sources, devising search strategies, and selecting study types, as well as in assessing validity, relevance, and significance of effects (questions 5-7). No such improvement was noted on questions about estimating relative risk and the significance of confidence intervals (question 8) or about ideal study designs for diagnostic and prognostic studies (questions 11-12).
Our curriculum improved EBM knowledge and skills in trainees and has the potential to be used widely in other neonatal-perinatal medicine programs in the country. This condensed and focused curriculum can be easily adapted to other medical or surgical specialties. The adaptation of educational curricula to the subspecialty context is essential to improve application of the EBM concepts at the bedside. We are not aware of another specialty-adapted Fresno test for the assessment of learners.
The ACGME lists EBM as a core competency, which prompted us to review EBM teaching in our neonatal-perinatal medicine fellowship program. Prior to this course, the only formal but limited exposure to EBM concepts in our section was a monthly journal club in which trainees appraised an article of current relevance and importance. However, further focused education in EBM was necessary, as indicated by our needs assessment. Our short curriculum significantly improved many aspects of the learners' skills and knowledge over seven sessions and specifically addressed two ACGME competencies: medical knowledge and practice-based learning and improvement.
We reflected on our work using the reflective critique criterion from Glassick's standards for educational scholarship.24 We adapted the Fresno test for learners in neonatal-perinatal medicine using specialty-specific clinical scenarios and found the NAFT to be internally consistent, with good agreement among the three raters of the test responses. Validation evidence has been collected for the original Fresno test among learners in pediatrics and other specialties.16-20 Because the structure and format of the NAFT are the same as those of the original Fresno test, with only the content changed, we believe that the construct validity evidence for the original test still applies to the NAFT. Some validation evidence comes from the fact that our third-year fellows, who had more clinical experience and had previously presented EBM topics (e.g., in journal clubs), scored higher than first-year fellows. This was the first use of the test; in future studies, we will collect validation evidence to assess the appropriateness and interpretation of NAFT scores based on established frameworks.25
The limitations of our study are that it was a single-center study of fellows from one specialty, with knowledge evaluated only at the end of the course. Therefore, how our findings generalize to other subspecialties, to long-term retention of EBM knowledge, and to application in practice remains to be determined. Fourteen learners participated in this course, but improvement was more pronounced in first-year fellows than in third-year fellows. In the coming years, we will consider enrolling only first-year fellows in this course and offering a brief refresher course for third-year fellows. To be more effective, we will also focus on the NAFT questions with the lowest pretest scores (i.e., those the learners know least about). Our course was time-intensive (7 hours spread over 7 weeks), especially for trainees who also had clinical and research responsibilities, and it required a librarian to discuss search strategies and database access. We were unable to evaluate the effects of this course on learner behavior toward evidence-based clinical practice or on long-term retention of knowledge, which we intend to address in future studies. We also intend to extend the EBM curriculum to other learners in our section of neonatology, including nurse practitioners, respiratory therapists, and nurse educators, who are important stakeholders in evidence-based practice.
The curriculum and NAFT presented here can be easily adapted to other pediatric specialties and family medicine by replacing the neonatal clinical vignettes with specialty-specific vignettes. We discussed each of the clinical vignettes in our group, revising them until they were unambiguous and aligned with our learning objectives. Similar discussions, or piloting with a sample group of learners, will enhance the reliability of the curriculum and the adapted Fresno test for other pediatric specialties.
The goal of courses such as ours should be long-term retention of EBM skills, which results from daily practice and integration into clinical activities.26-28 Journal clubs have been reported to improve participants' evidence-based knowledge and skills,29 and we have since restructured our journal club to better integrate the skills learned in the EBM curriculum. In the revised journal club, fellows pose a PICOT question, perform a literature search to answer it, and evaluate one of the articles from the search. Fellows continue to analyze study design, methodology, and statistical analysis with a statistician, but the new structure provides an opportunity to integrate their new skills and engage in peer discussions. Peer teaching and discussion have been shown to enhance confidence in and use of EBM resources.30
In conclusion, we have reported here the impact of a short course on EBM for trainees in neonatal-perinatal medicine that improved trainees’ knowledge and skills in EBM. We found that the NAFT has good internal consistency and excellent interrater reliability. We will collect evidence for validity of the NAFT in future studies.
None to report.
None to report.
Reported as not applicable.
- Straus SE, Glasziou P, Richardson WS, Haynes RB. Evidence-Based Medicine: How to Practice and Teach It. 4th ed. Philadelphia, PA: Churchill Livingstone/Elsevier; 2011.
- Smith CA, Ganschow PS, Reilly BM, et al. Teaching residents evidence-based medicine skills: a controlled trial of effectiveness and assessment of durability. J Gen Intern Med. 2000;15(10):710-715. https://doi.org/10.1046/j.1525-1497.2000.91026.x
- Ross R, Verdieck A. Introducing an evidence-based medicine curriculum into a family practice residency—is it effective? Acad Med. 2003;78(4):412-417. https://doi.org/10.1097/00001888-200304000-00019
- Dorsch JL, Aiyer MK, Meyer LE. Impact of an evidence-based medicine curriculum on medical students’ attitudes and skills. J Med Libr Assoc. 2004;92(4):397-406.
- Nicholson LJ, Warde CM, Boker JR. Faculty training in evidence-based medicine: improving evidence acquisition and critical appraisal. J Contin Educ Health Prof. 2007;27(1):28-33.
- Kim S, Willett LR, Murphy DJ, O’Rourke K, Sharma R, Shea JA. Impact of an evidence-based medicine curriculum on resident use of electronic resources: a randomized controlled study. J Gen Intern Med. 2008;23(11):1804-1808. https://doi.org/10.1007/s11606-008-0766-y
- Accreditation Council for Graduate Medical Education. ACGME program requirements for graduate medical education in pediatrics. https://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/320_pediatrics_2017-07-01.pdf. Accessed March 5, 2015.
- LCME. Functions and structure of a medical school: standards for accreditation of medical education programs leading to the MD degree. http://lcme.org/wp-content/uploads/filebase/standards/2018-19_Functions-and-Structure_2017-08-02.docx. Published March 2017. Accessed August 3, 2017.
- Schwartz A, Hupert J. Medical students’ application of published evidence: randomised trial. BMJ. 2003;326(7388):536-538. https://doi.org/10.1136/bmj.326.7388.536
- Guyatt G. Foreword. In: Moyer VA, Elliott EJ, eds. Evidence-Based Pediatrics and Child Health. 2nd ed. London, England: BMJ Books; 2004:xvii-xviii.
- Dinkevich E, Markinson A, Ahsan S, Lawrence B. Effect of a brief intervention on evidence-based medicine skills of pediatric residents. BMC Med Educ. 2006;6:1. https://doi.org/10.1186/1472-6920-6-1
- Kern DE, Thomas PA, Hughes MT, eds. Curriculum Development for Medical Education: A Six-Step Approach. 2nd ed. Baltimore, MD: Johns Hopkins University Press; 2009.
- Hadley JA, Wall D, Khan KS. Learning needs analysis to guide teaching evidence-based medicine: knowledge and beliefs amongst trainees from various specialities. BMC Med Educ. 2007;7:11. https://doi.org/10.1186/1472-6920-7-11
- Green ML, Ellis PJ. Impact of an evidence-based medicine curriculum based on adult learning theory. J Gen Intern Med. 1997;12(12):742-750. https://doi.org/10.1046/j.1525-1497.1997.07159.x
- Ramos KD, Schafer S, Tracz SM. Validation of the Fresno test of competence in evidence based medicine. BMJ. 2003;326(7384):319-321. https://doi.org/10.1136/bmj.326.7384.319
- McCluskey A, Bishop B. The Adapted Fresno Test of competence in evidence-based practice. J Contin Educ Health Prof. 2009;29(2):119-126. https://doi.org/10.1002/chp.20021
- Thomas RE, Kreptul D. Systematic review of evidence-based medicine tests for family physician residents. Fam Med. 2015;47(2):101-117.
- Grad R, Macaulay AC, Warner M. Teaching evidence-based medical care: description and evaluation. Fam Med. 2001;33(8):602-606.
- Stern DT, Linzer M, O’Sullivan PS, Weld L. Evaluating medical residents’ literature-appraisal skills. Acad Med. 1995;70(2):152-154. https://doi.org/10.1097/00001888-199502000-00021
- Shaneyfelt T, Baum KD, Bell D, et al. Instruments for evaluating education in evidence-based practice: a systematic review. JAMA. 2006;296(9):1116-1127. https://doi.org/10.1001/jama.296.9.1116
- Nunnally JC, Bernstein IH. Psychometric Theory. 3rd ed. New York, NY: McGraw-Hill; 1994.
- Shrout PE, Fleiss JL. Intraclass correlations: uses in assessing rater reliability. Psychol Bull. 1979;86(2):420-428. https://doi.org/10.1037/0033-2909.86.2.420
- Fleiss JL, Levin B, Paik MC. Statistical Methods for Rates and Proportions. Hoboken, NJ: John Wiley & Sons; 2003:598-626.
- Glassick CE. Boyer’s expanded definitions of scholarship, the standards for assessing scholarship, and the elusiveness of the scholarship of teaching. Acad Med. 2000;75(9):877-880. https://doi.org/10.1097/00001888-200009000-00007
- Cook DA, Hatala R. Validation of educational assessments: a primer for simulation and beyond. Adv Sim (Lond). 2016;1:31. https://doi.org/10.1186/s41077-016-0033-y
- Dorsch JL, Aiyer MK, Gumidyala K, Meyer LE. Retention of EBM competencies. Med Ref Serv Q. 2006;25(3):45-57. https://doi.org/10.1300/J115v25n03_04
- Coomarasamy A, Khan KS. What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ. 2004;329(7473):1017. https://doi.org/10.1136/bmj.329.7473.1017
- Lai NM, Teng CL. Competence in evidence-based medicine of senior medical students following a clinically integrated training programme. Hong Kong Med J. 2009;15(5):332-338.
- Mohr NM, Stoltze AJ, Harland KK, Van Heukelom JN, Hogrefe CP, Ahmed A. An evidence-based medicine curriculum implemented in journal club improves resident performance on the Fresno test. J Emerg Med. 2015;48(2):222-229.e1. https://doi.org/10.1016/j.jemermed.2014.09.011
- Rees E, Sinha Y, Chitnis A, Archer J, Fotheringham V, Renwick S. Peer-teaching of evidence-based medicine. Clin Teach. 2014;11(4):259-263. https://doi.org/10.1111/tct.12144
This is an open-access article distributed under the terms of the Creative Commons Attribution-NonCommercial-No Derivatives license.
Received: August 9, 2017
Accepted: November 26, 2017