Original Publication
Open Access

Critical Appraisal Worksheets for Integration Into an Existing Small-Group Problem-Based Learning Curriculum

Published: February 14, 2018 | 10.15766/mep_2374-8265.10682

Appendices

  • CAW Wk 1 Systematic Review.docx
  • CAW Wk 2 Basic Science.doc
  • CAW Wk 3 Case Control.doc
  • CAW Wk 4 Case Report.docx
  • CAW Wk 5 Cohort.docx
  • CAW Wk 6 Case Series.docx
  • CAW Wk 7 Randomized Controlled Trial.docx
  • CAW Wk 8 Pharmaceutical Trial.docx
  • Answer Key Wk 1 Systematic Review.docx
  • Answer Key Wk 2 Basic Science.doc
  • Answer Key Wk 3 Case Control.doc
  • Answer Key Wk 4 Case Report.docx
  • Answer Key Wk 5 Cohort.docx
  • Answer Key Wk 6 Case Series.docx
  • Answer Key Wk 7 Randomized Controlled Trial.docx
  • Answer Key Wk 8 Pharmaceutical Trial.docx
  • Student Feedback Survey.docx
  • Faculty Feedback Survey.docx

All appendices are peer reviewed as integral parts of the Original Publication.



Abstract

Introduction: Because medical students matriculate from diverse backgrounds, their ability to critically appraise health science literature varies. To address this, we developed a critical appraisal exercise and integrated it into the first-year problem-based learning (PBL) curriculum at Case Western Reserve University School of Medicine. Methods: For 8 weeks, first-year medical students read a weekly preselected health science literature article relating to the content of their PBL curriculum and completed a critical appraisal worksheet consisting of questions about study design and interpretation of results. Students discussed the article and worksheet within their PBL small groups. Faculty facilitators received the critical appraisal worksheet answer key, to which students gained access after the discussion. To measure changes in critical appraisal skills, a voluntary questionnaire based on the Berlin questionnaire, a validated tool for measuring knowledge and skills in evidence-based medicine, was administered before and after the 8-week intervention. Results: Using paired Student t tests, we found that the students who completed both questionnaires (N = 60) showed an average improvement of 4% (p = .03). Students who scored at or below the 50th percentile on the preintervention questionnaire showed an average improvement of 12% (p = .002). Discussion: These critical appraisal worksheets are easily adaptable to an existing PBL curriculum and are an effective tool for teaching and improving critical appraisal skills in the students who benefit most.


Educational Objectives

By the end of this activity, learners will be able to:

  1. Use critical appraisal worksheets (CAWs) to critically appraise health science literature to establish whether a health science research study addresses a clearly focused question.
  2. Use CAWs to critically appraise health science literature to establish whether a health science research study uses valid methods to address this question.
  3. Use CAWs to critically appraise health science literature to establish whether a health science research study’s valid results are important.
  4. Use CAWs to critically appraise health science literature to establish whether a health science research study’s valid and important results are applicable to their patient population.


Introduction

The Association of American Medical Colleges identifies “basic skills in critiquing the quality of evidence and assessing the applicability to their patients and the clinical context” as a core entrustable professional activity for entering residency.1 However, studies show that medical residents frequently lack basic critical appraisal skills necessary for interpreting biomedical literature.2 These skills are critical for the advancement of research findings into clinical application and for the provision of optimal patient care. Critical appraisal is a foundational skill of evidence-based medicine (EBM) that can be effectively taught within a preclinical curriculum.

EBM is defined as “the conscientious, explicit, judicious and reasonable use of modern, best evidence in making decisions about the care of individual patients.”3 Undergraduate medical students and teachers recognize the importance of learning EBM and regard integrated EBM courses as a non-resource-intensive way of teaching the discipline.4 A recent multi-institutional case study identified four learning challenges that may impede students’ ability to learn EBM skills: suboptimal role models, students’ reluctance to admit uncertainty, lack of clinical context, and difficulty mastering EBM skills.5 To meet these challenges, the study recommended several interventions, including integrating EBM skills into other content, incorporating clinical content into EBM teaching, and providing training to faculty.5 Previous studies have examined EBM teaching methods, including lecture-based, small-group, and one-on-one formats, as well as computer- and web-based modules; in most cases, these interventions yielded only a small improvement in students’ EBM skill performance.6 This has led some authors to conclude that a multifaceted approach may be most effective.7

The EBM process comprises five distinct steps: (1) asking a clear question, (2) acquiring the evidence, (3) appraising the quality of the evidence, (4) applying the evidence to care, and (5) assessing the effectiveness of the intervention.8 This module focuses on step 3 of the EBM approach. Broadly defined, critical appraisal involves the systematic evaluation of clinical research papers to establish whether a study addresses a clearly focused question, whether it uses valid methods to address that question, whether the study’s valid results are important, and whether those valid and important results are applicable to relevant patient populations.9

At Case Western Reserve University School of Medicine (CWRU SOM), we developed a series of critical appraisal worksheets (CAWs) based largely on resources available from the Center for Evidence-Based Medicine (CEBM).9 These eight worksheets are intended to serve as an adjunct to a preestablished, problem-based learning (PBL), small-group curriculum centering on learning objectives and student discussion of weekly patient cases. The worksheets provide opportunities for medical students to learn and practice how to critically appraise a variety of health science research studies. CWRU SOM introduces many EBM concepts during the first unit of the preclinical curriculum through 10 hours of epidemiology and biostatistics lectures. Before the use of CAWs, EBM skills were further developed through weekly student-led journal article presentations in PBL small groups. In preparation for these presentations, a student was expected to identify a question related to a weekly PBL patient case, find an article that addressed this question, critically appraise the article, and apply his or her findings to the patient case. The student would then present his or her findings to the PBL small group and facilitate a brief (5- to 7-minute) discussion. It was challenging to engage students in this loosely structured activity, and they were frequently unsure of how to identify a valid article and properly appraise it. In contrast, the CAWs have been designed to give students a structured activity with definitive examples of accurate critical appraisal for a variety of study designs.

While the CAWs have been specifically designed to complement the CWRU SOM curricular unit covering molecular biology, endocrinology, development, genetics, reproductive biology, and cancer biology, they could be easily adapted to other topics within a preclinical curriculum.

Methods

Each week for 8 weeks, one worksheet (Appendices A-H) was shared with students in advance of a PBL small-group session. The worksheet contained a citation for a preselected health science research study. Students were assigned to read the study and complete the worksheet before attending the PBL session. Each PBL session consisted of eight to ten students and one faculty facilitator discussing patient cases. A brief portion (5-7 minutes) of group time was reserved for students to use the worksheet as a template to discuss the study and reconcile their worksheet answers. Because students set the PBL session schedule each week, more time could be allotted for this discussion when needed. Faculty facilitators were assigned to read the study as well as a CAW answer key (Appendices I-P) so that they could support a productive group discussion. The answer key was made available to students after the PBL session concluded. Faculty facilitators were also asked to confirm that each student in the group had completed the worksheet before attending the PBL session.

The health science research articles used with the CAWs were selected based on several criteria. Worksheet designers worked with the leaders of the existing PBL curriculum to identify studies that related to the learning objectives of the relevant weekly PBL session. Certain studies were also chosen to cover a range of study types (Table 1). Because of constraints on the students’ existing workload, an effort was also made to select studies with manuscripts under five pages in length.

Table 1. Worksheets by PBL Curriculum Topic and Study Type

Week | PBL Curriculum Topic | Study Type
1 | Insulin and glucagon physiology | Systematic review
2 | Male reproductive physiology | Basic science/animal model
3 | Thyroid hormone physiology | Case control study
4 | Hormonal changes in pregnancy | Case report
5 | Fetal development | Cohort study
6 | Inheritance and genetic diseases | Case series
7 | Oncogenes | Randomized controlled trial
8 | Tumor suppressor genes | Pharmaceutical trial

Abbreviation: PBL, problem-based learning.

The worksheets consisted of six to eight questions asking the student to assess the purpose, validity, importance, and applicability of a particular study. Each question was followed by a passage offering guidance on what a study should ideally contain to meet a high standard of validity.

The worksheets were adapted from CAWs designed by CEBM.9 CEBM offers worksheets to guide the critical appraisal of systematic reviews, diagnostic studies, prognostic studies, and randomized controlled trials. The CEBM worksheets are general templates intended to be applicable to any study of the relevant design. We altered these worksheets to tailor the questions to specific studies. Additionally, for study types not covered by existing CEBM worksheets (e.g., case report), we worked with CWRU SOM faculty or relied on other critical appraisal resources to develop worksheets that followed the same outline as the CEBM worksheets. For example, the case report and case series worksheets (weeks 4 and 6, respectively) were adapted from a resource available from the Journal of Medical Case Reports.10

To measure changes in critical appraisal skills, a voluntary questionnaire based on the Berlin questionnaire,11-16 a validated tool for measuring knowledge and skills in EBM that has been used in a number of similar studies, was administered to the students before and after the 8-week intervention. The questionnaire consisted of two sets (Forms A and B) of 15 multiple-choice questions with similar content and was designed to measure deep learning, or one’s ability to apply EBM concepts to new scenarios.13 Form B served as the pretest, and Form A as the posttest. Paired Student t tests were used to compare the number of total correct answers each student scored on the pretest and posttest. Students and faculty facilitators also completed voluntary feedback surveys (Appendices Q & R) eliciting their opinion on the effectiveness of the intervention. Faculty facilitators were additionally asked to compare the intervention to the PBL small-group EBM activity utilized in previous years. The Berlin questionnaire is omitted from this resource at the request of its developers but can be requested from the University of Basel Hospital, Department of Clinical Research. The student and faculty feedback surveys are the only assessments provided in the current resource.
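For readers who wish to run the same kind of analysis on their own pre/post questionnaire scores, the sketch below shows a paired Student t test in Python. It is illustrative only: the score lists are hypothetical placeholders rather than study data, and expressing the average improvement as a percentage of the 15-question maximum is our assumption about how such a figure could be computed.

```python
# Minimal sketch of a paired Student t test on pre/post Berlin questionnaire scores.
# The score lists are hypothetical placeholders; each element is one student's number
# of correct answers out of 15 on the pretest (Form B) and posttest (Form A).
from scipy import stats

pretest = [8, 10, 7, 12, 9, 11, 6, 10]    # hypothetical pretest scores
posttest = [9, 10, 8, 12, 11, 11, 7, 11]  # hypothetical posttest scores, same students

# Paired t test compares each student's pretest score with his or her posttest score.
t_stat, p_value = stats.ttest_rel(posttest, pretest)

# Average improvement expressed as a percentage of the 15-question maximum
# (an assumed convention, not stated in the manuscript).
mean_gain = sum(post - pre for pre, post in zip(pretest, posttest)) / len(pretest)
pct_improvement = 100 * mean_gain / 15

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, improvement = {pct_improvement:.1f}%")
```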

The CWRU Institutional Review Board approved this study.

Results

All first-year medical students in the class of 2020 participated in the CAW activity. The class consisted of 184 students (91 male, 93 female) averaging 23.4 years of age, with an average MCAT score of 35.5. Six students were admitted through a preprofessional scholar program and were not required to take the MCAT exam.

Sixty students completed both the pre- and postintervention voluntary questionnaires. All student pretest and posttest Berlin questionnaire results are shown in Figure 1. Based on paired Student t tests, students who completed both questionnaires showed an average improvement of 4% (p < .03) in number of correct answers. As shown in Figure 2, students who scored at or below the 50th percentile on the pretest Berlin questionnaire showed an average improvement of 12% (p < .001) in number of correct answers. Data underlying these results are not shown.
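For a sense of absolute effect size: assuming the percent improvement is expressed relative to the 15-question maximum (an assumption on our part; the denominator is not stated), a 4% average gain corresponds to roughly 15 × 0.04 ≈ 0.6 additional correct answers, and a 12% gain to roughly 15 × 0.12 ≈ 1.8 additional correct answers.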

Figure 1. Number of correct answers on Berlin questionnaire pretest and posttest among all students (N = 60).
Figure 2. Number of correct answers on Berlin questionnaire pretest and posttest among students who scored at or below the 50th percentile on the pretest and completed both tests (N = 30).

Student and faculty facilitator responses to the voluntary feedback survey are shown in Table 2 and Table 3, respectively. Sixty students completed the voluntary feedback survey eliciting their opinion on the effectiveness of the intervention on their learning experience. Sixty-five percent (n = 39) agreed or strongly agreed that the CAW encouraged them to use a structured approach to the critical appraisal of health science literature. Fifty-five percent (n = 33) agreed or strongly agreed that the CAWs provided them with the opportunity to actively apply basic statistical knowledge. Fifty-five percent (n = 33) agreed or strongly agreed that the CAWs encouraged them to apply findings in a health science literature article to the patient in the PBL case. However, 40% (n = 24) agreed or strongly agreed that the critical appraisal exercise was burdensome on the PBL small group and limited the time needed to discuss the other topics covered in the PBL session.

Table 2. Student Responses to Voluntary Feedback Survey (N = 60)

Values are No. (%).

Question | Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree
CAWs provided me the opportunity to actively apply basic statistical knowledge. | 5 (8.3) | 5 (8.3) | 17 (28.3) | 21 (35.0) | 12 (20.0)
CAWs encouraged me to use a structured approach to the critical appraisal of health science literature. | 1 (1.7) | 6 (10.0) | 14 (23.3) | 19 (31.7) | 20 (33.3)
CAWs encouraged me to apply findings in a health science literature article to the patient in the PBL case. | 1 (1.7) | 11 (18.3) | 15 (25.0) | 19 (31.7) | 14 (23.3)
CAW discussion was burdensome on the PBL small group and limited the time needed for the other topics covered in the PBL session. | 9 (15.0) | 12 (20.0) | 15 (25.0) | 13 (21.7) | 11 (18.3)

Abbreviations: CAW, critical appraisal worksheet; PBL, problem-based learning.
Table 3. Faculty Facilitator Responses to Voluntary Feedback Survey (N = 15)

Values are No. (%).

Question | Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree
CAWs provided students the opportunity to actively apply basic statistical knowledge. | 0 (0.0) | 0 (0.0) | 0 (0.0) | 10 (66.7) | 5 (33.3)
CAWs encouraged students to use a structured approach to the critical appraisal of health science literature. | 0 (0.0) | 0 (0.0) | 1 (6.7) | 2 (13.3) | 12 (80.0)
CAWs encouraged students to apply findings in a health science literature article to the patient in the PBL case. | 0 (0.0) | 1 (6.7) | 1 (6.7) | 7 (46.7) | 6 (40.0)
CAW discussion was burdensome on the PBL small group and limited the time needed for the other topics covered in the PBL session. | 4 (26.7) | 7 (46.7) | 3 (20.0) | 0 (0.0) | 1 (6.7)
Compared to the previous PBL EBM activity, do you believe the CAW is:
  More effective at providing students the opportunity to actively apply basic statistical knowledge. | 0 (0.0) | 0 (0.0) | 3 (20.0) | 6 (40.0) | 6 (40.0)
  More effective at encouraging students to use a structured approach to critical appraisal of health science literature. | 0 (0.0) | 0 (0.0) | 1 (6.7) | 8 (53.3) | 6 (40.0)
  More effective at encouraging students to apply findings presented in a health science literature article to the patient in the PBL case. | 0 (0.0) | 0 (0.0) | 3 (20.0) | 8 (53.3) | 4 (26.7)

Abbreviations: CAW, critical appraisal worksheet; EBM, evidence-based medicine; PBL, problem-based learning.

The voluntary feedback survey also included two open-ended questions. The first question was, “What did you like about the critical appraisal worksheet exercise?” The students found the CAW helpful in learning to appraise literature correctly for a variety of research designs, as indicated by the following statements:

  • “The [worksheet] was a beneficial guide and helped us ask more meaningful questions about the research, which allowed us to learn more as a result.”
  • “Many of us are new to appraising literature critically and it helped keep us all on the same page.”
  • “We were provided the opportunity to analyze different types of research.”
  • “I appreciated the weekly guidance tailored to specific study types.”

Finally, students commented that the worksheets were “easy to do and relatively stress free.”

The second question was, “How do you think the critical appraisal exercise can be improved?” Students noted that 5-7 minutes was not always sufficient to discuss the worksheet and health science research study. Students also wanted to acquire their own research article instead of having one assigned. Additional feedback included the following comments:

  • “It’s tough to discuss [the worksheet] in the short amount of time allotted.”
  • “Shorter papers are more practical for the time limitations.”
  • “Just searching for worksheet answers rather than critically appraising.”
  • “I think that it is more useful if students can pick their own research article.”
  • “Provide more generalized templates for each [study] type in a consolidated place to [students] to reference in the future.”

Fifteen faculty facilitators completed the voluntary feedback survey eliciting their opinions on the effectiveness of the intervention on the students’ learning experience. Ninety-three percent (n = 14) agreed or strongly agreed that the CAWs encouraged students to use a structured approach to the critical appraisal of health science literature and that the CAWs were more effective in doing this than the previously used PBL EBM activity. All of the respondents agreed or strongly agreed that the CAWs provided students with the opportunity to actively apply basic statistical knowledge, and 80% (n = 12) agreed or strongly agreed that the CAWs were more effective in doing this than the previously used PBL EBM activity. Eighty-seven percent of faculty facilitators agreed or strongly agreed that the CAWs encouraged students to apply findings in a health science literature article to the patient in the PBL case, and 80% agreed or strongly agreed that the CAWs were more effective in doing this than the previously used PBL EBM activity. Only 7% (n = 1) of faculty facilitators agreed or strongly agreed that the critical appraisal exercise was burdensome on the PBL small group and limited the time needed to discuss the other topics covered in the PBL session.

Discussion

The CAWs provided students with an easy-to-follow rubric for assessing the purpose, validity, importance, and applicability of a particular study. This was an effective method for giving students the opportunity to practice basic skills in critiquing evidence and assessing applicability to a patient, a core entrustable professional activity that graduating medical students are expected to possess on day one of residency.1 Through this 8-week CAW and discussion series, students improved their ability to critically appraise literature. Notably, those students most deficient in this skill before the intervention demonstrated the greatest improvements.

There are several limitations to this study. First, because the pretests, posttests, and feedback surveys were voluntary, our data did not include all students who participated in the intervention. While our sample size was limited by this fact, we believe the data collected support the value of the educational intervention. Second, our study lacked a control group due to logistical constraints and a commitment to providing the same educational resources to all students. Third, since the goal of the curriculum was to promote a foundational skill of EBM, we felt that the Berlin questionnaire, a validated and reliable measure of EBM knowledge, was an appropriate assessment tool. We considered using the Fresno test,17 another validated assessment tool of EBM skills consisting of short-answer questions based on clinical vignettes. However, we felt that students would be more likely to voluntarily complete the shorter, multiple-choice Berlin questionnaire. In addition, we chose the Berlin questionnaire because its 15 scenario-based multiple-choice questions require more extensive arithmetic skills than the Fresno test does. Still, the study might have benefited from the additional use of an assessment tool more specific to critical appraisal. This is something we plan to institute in future iterations of our work.

Based on faculty and student surveys, the majority of respondents believed the worksheets provided a structured approach to critical appraisal, encouraged students to apply findings in a health science literature article to the patient in the PBL case, and offered an opportunity to actively apply basic statistical knowledge. Toward these aims, faculty facilitators also largely considered CAWs an improvement over the previous EBM activity within the PBL curriculum.

It is important to note that a minority of faculty and students found the worksheets and discussions burdensome to the PBL session. To address this, we plan to develop formalized faculty facilitator training for the CAW discussion in future classes so that faculty can encourage a time-efficient discussion. We hope to lengthen the amount of time allotted to discuss the CAWs during each PBL session to address student concern that 5-7 minutes was not always a sufficient time period. However, this needs to be balanced with the goals and learning objectives of other components of the PBL curriculum. Another option we have considered is creating an online discussion board where students can post any remaining CAW concerns from a given week. There, content experts and other students could address CAW topics that were not covered in the PBL time allotted.

While the EBM concepts covered by each CAW are unlikely to change significantly from year to year, the content of the preselected health science literature articles may become outdated. Leaders of the curricular unit have agreed to review the preselected articles annually to ensure that they remain relevant and aligned with the goals and learning objectives of the broader PBL curriculum. If an article is deemed to be outdated, it will be replaced with a more appropriate article, and the CAW and corresponding answer key will be revised. The revised CAW will still address the same essential EBM concepts.

In their feedback, students expressed a desire to select health science literature articles themselves rather than relying on preselected ones. We believe that initially using a uniform curriculum with preselected articles and tailored CAWs and answer keys better ensures that all students gain the same foundational skills in critical appraisal. However, developing a patient-centered question and identifying relevant evidence are both crucial EBM skills. Therefore, we are in the process of extending the use of the CAWs into later units in the CWRU SOM curriculum. These later worksheets will provide general templates for critical appraisal of broad study types rather than being tailored to preselected articles. PBL groups will be asked to identify a clinical question related to the content of the established PBL curriculum. One weekly student leader will then select a health science literature article that addresses the group’s clinical question. The student leader will share this article with the remainder of the PBL group and identify the appropriate CAW for the group to use based on study type. Finally, the PBL group will use the CAW as a rubric to appraise the article the group has chosen. This more advanced use of the worksheets will require not only critical appraisal skills but also additional steps in the EBM process, such as asking a clear question and acquiring the evidence.

Although these worksheets were developed for the CWRU SOM PBL curriculum, they can be easily integrated at other institutions that rely on PBL small groups or other learner-centered team activities such as team-based learning (TBL) or case-based discussion. For example, an article and CAW could be assigned in advance of a TBL session, and the CAW answers could then be debated within and among student groups in a TBL-style discussion. CAWs require relatively little dedicated classroom time compared to other methods for teaching EBM, such as lectures and journal club presentations, and the worksheets were designed to be user friendly so that medical students could work through them with relatively little faculty instruction. Similarly, the CAW answer sheets were designed to be straightforward but sufficiently detailed so that faculty would not need extensive training to effectively facilitate CAW discussion. We believe the worksheets are most valuable in small-group settings where thoughtful student discussion can take place. However, even in exclusively lecture-based curricula, a CAW and preselected article could be assigned to students in advance of a lecture, with a portion of lecture time spent reviewing each CAW answer. For closed-ended questions, learning technology such as audience response (clicker) questions could be used to support student engagement.


Author Information

  • Jessica O'Neil: Medical Student, Case Western Reserve University School of Medicine
  • Colleen Croniger, PhD: Associate Professor, Department of Nutrition, Case Western Reserve University School of Medicine; Assistant Dean of Medical Student Research and Basic Science Education, Case Western Reserve University School of Medicine

Acknowledgments
We thank the Case Western Reserve University School of Medicine Scholars Collaboration in Teaching and Learning and Amy Wilson-Delfosse, PhD, for providing professional development and financial support to this project; Klara Papp, PhD, for expert consultation on study design; Catherine Demko, PhD, for expert consultation on critical appraisal worksheet development based on her work at the Case Western Reserve University School of Dental Medicine; and Professor Regina Kunz from the University of Basel Hospital, Department of Clinical Research, EbIM Evidence-Based Insurance Medicine, for sharing the Berlin questionnaire.

Disclosures
None to report.

Funding/Support
None to report.

Prior Presentations
O’Neil J, Croniger C. Building a foundation for evidence based practice: implementation of critical appraisal exercises into a problem based learning curriculum. Poster presented at: International Association of Medical Science Educators Annual Meeting; June 10-13, 2017; Burlington, VT.

O’Neil J, Croniger C. Building a foundation for evidence based practice: implementation of critical appraisal exercises into a problem based learning curriculum. Poster presented at: Medical Education Retreat; 2017; Cleveland, OH.

Ethical Approval
The Case Western Reserve University Institutional Review Board approved this study.


References

  1. The core entrustable professional activities (EPAs) for entering residency. Association of American Medical Colleges website. https://www.aamc.org/initiatives/coreepas/. Accessed August 26, 2017.
  2. Windish DM, Huot SJ, Green ML. Medicine residents’ understanding of the biostatistics and results in the medical literature. JAMA. 2007;298(9):1010-1022. https://doi.org/10.1001/jama.298.9.1010
  3. Masic I, Miokovic M, Muhamedagic B. Evidence based medicine—new approaches and challenges. Acta Inform Med. 2008;16(4):219-225. https://doi.org/10.5455/aim.2008.16.219-225
  4. Atwa H, Abdelaziz A. Evidence-based medicine (EBM) for undergraduate medical students: a six-step, integrative approach. Med Teach. 2017;39(suppl 1):S27-S32. https://doi.org/10.1080/0142159X.2016.1254750
  5. Maggio LA. Educating physicians in evidence based medicine: current practices and curricular strategies. Perspect Med Educ. 2016;5(6):358-361. https://doi.org/10.1007/s40037-016-0301-5
  6. Swanberg SM, Dennison CC, Farrell A, et al. Instructional methods used by health sciences librarians to teach evidence-based practice (EBP): a systematic review. J Med Libr Assoc. 2016;104(3):197-208. https://doi.org/10.3163/1536-5050.104.3.004
  7. Kyriakoulis K, Patelarou A, Laliotis A, et al. Educational strategies for teaching evidence-based practice to undergraduate health students: systematic review. J Educ Eval Health Prof. 2016;13:34. https://doi.org/10.3352/jeehp.2016.13.34
  8. Straus SE, Glasziou P, Richardson WS, Haynes RB. Evidence-Based Medicine: How to Practice and Teach It. 4th ed. New York, NY: Churchill Livingstone Elsevier; 2011.
  9. Critical appraisal tools. Center for Evidence-Based Medicine website. http://www.cebm.net/blog/2014/06/10/critical-appraisal/. Accessed August 26, 2017.
  10. Garg R, Lakhan SE, Dhanasekaran AK. How to review a case report. J Med Case Rep. 2016;10:88. https://doi.org/10.1186/s13256-016-0853-3
  11. Weberschock TB, Ginn TC, Reinhold J, et al. Change in knowledge and skills of Year 3 undergraduates in evidence-based medicine seminars. Med Educ. 2005;39(7):665-671. https://doi.org/10.1111/j.1365-2929.2005.02191.x
  12. Sabouni A, Bdaiwi Y, Janoudi SL, et al. Multiple strategy peer-taught evidence-based medicine courses in a poor resource setting. BMC Med Educ. 2017;17:82. https://doi.org/10.1186/s12909-017-0924-1
  13. Fritsche L, Greenhalgh T, Falck-Ytter Y, Neumayer H-H, Kunz R. Do short courses in evidence based medicine improve knowledge and skills? Validation of Berlin questionnaire and before and after study of courses in evidence based medicine. BMJ. 2002;325(7376):1338-1341. https://doi.org/10.1136/bmj.325.7376.1338
  14. Dawes M, Summerskill W, Glasziou P, et al. Sicily statement on evidence-based practice. BMC Med Educ. 2005;5:1. https://doi.org/10.1186/1472-6920-5-1
  15. Straus SE, Green ML, Bell DS, et al; for Society of General Internal Medicine Evidence-Based Medicine Task Force. Evaluating the teaching of evidence based medicine: conceptual framework. BMJ. 2004;329(7473):1029-1032. https://doi.org/10.1136/bmj.329.7473.1029
  16. West CP, Jaeger TM, McDonald FS. Extended evaluation of a longitudinal medical school evidence-based medicine curriculum. J Gen Intern Med. 2011;26(6):611-615. https://doi.org/10.1007/s11606-011-1642-8
  17. Ramos KD, Schafer S, Tracz SM. Validation of the Fresno test of competence in evidence based medicine. BMJ. 2003;326(7384):319-321. https://doi.org/10.1136/bmj.326.7384.319


Citation

O’Neil J, Croniger C. Critical appraisal worksheets for integration into an existing small-group problem-based learning curriculum. MedEdPORTAL Publications. 2018;14:10682. https://doi.org/10.15766/mep_2374-8265.10682

Received: September 3, 2017

Accepted: January 12, 2018