Original Publication
Open Access

Clinical Reasoning Workshop: Lumbosacral Spine and Hip Disorders

Published: September 20, 2017 | 10.15766/mep_2374-8265.10632

Appendices

  • Lumbosacral Spine and Hip Illness Script Sorting Table.xlsx
  • Introduction Presentation and Schedule.pptx
  • Pre- and Posttest.docx
  • Exemplar Cases.pptx
  • Session Evaluation Questions.docx
  • Lumbosacral Spine and Hip Physical Examination-Refresher and Resources.docx

All appendices are peer reviewed as integral parts of the Original Publication.

To view all publication components, extract (i.e., unzip) them from the downloaded .zip file.


Abstract

Introduction: Helping physicians-in-training develop effective clinical reasoning skills may facilitate progression to expertise, reduce diagnostic errors, and improve patient safety. Building on our previous experience, we developed a workshop that reviews common musculoskeletal lumbar spine and hip conditions. The workshop engages both deductive and inductive modes of clinical reasoning and provides opportunities for learners to practice toggling from one to the other. Methods: Using exemplar musculoskeletal case vignettes, the workshop allows residents to practice engaging in and toggling between both modes of information processing. It also includes pre- and posttests, small-group learning, and a small-group competition. Results: The workshop was implemented with a group of 29 physical medicine and rehabilitation residents. Although residents did well on the pretest, the workshop further improved their test performance. Residents liked the workshop and felt it improved their diagnostic ability. Discussion: A workshop that combined team- and case-based learning, key features assessment, script theory, and gamification was effective in engaging residents and resulted in high resident satisfaction and a perceived increase in the ability to tackle clinical problems. Applying lessons from the previous workshop significantly reduced the faculty time required and increased the number of residents who were able to complete both the pre- and posttests.


Educational Objectives

By the end of this session, learners will be able to:

  1. Identify the correct diagnosis at least 80% of the time when presented with cases of common painful lumbosacral spine and hip conditions (deductive reasoning).
  2. Identify at least one history key feature and at least one physical exam key feature for common lumbosacral spine and hip diagnoses (inductive reasoning).

Introduction

Helping physicians-in-training develop effective clinical reasoning skills may facilitate progression to expertise, reduce diagnostic errors, and improve patient safety.1-3 Physicians need the intellectual flexibility to integrate several forms of knowledge with clinical reasoning to provide the best possible patient care.4 Several theories explain how information is processed in physicians’ minds to influence their reasoning during medical encounters. According to script theory, medical knowledge is bundled into networks called “illness scripts,” which allow physicians to integrate new incoming information with existing knowledge, identify patterns and deviations in symptom complexes, recognize similarities and differences between disease states, and make predictions about how diseases are likely to unfold.5

Although others have developed a clinical reasoning curriculum6 on MedEdPORTAL, it does not address a specific content domain and focuses more on the theoretical than the applied. Likewise, there is no conclusive evidence to date that simply learning about clinical reasoning makes one better at clinical reasoning.7 A set of neck and shoulder OSCE cases was created for use by internal medicine residents on MedEdPORTAL;8 however, the instructional methodology was not grounded in current clinical reasoning theory. We have previously published a workshop designed to provide resident physicians with practice in engaging and toggling between two modes of information processing using exemplar case vignettes in the domain of musculoskeletal cervical spine and shoulder disorders.9 That workshop engaged the residents, was well received, and was thought to improve residents’ ability to recognize common presentations of cervical spine and shoulder musculoskeletal disorders. Our subsequent search of MedEdPORTAL using “lumbar spine” and “hip” as search terms (last accessed June 17, 2017) did not identify curricula focusing on common nonurgent musculoskeletal disorders of the lumbar spine and hip.

In designing a workshop focusing on musculoskeletal lumbar spine and hip disorders, we applied conceptual models of logical thinking and script theory.4,5 We also made several changes to the workshop structure based on our earlier experience and program evaluation, in line with the principles of design-based research.10 Firstly, we combined the multiple-choice tests for the two body areas: in the previous workshop we had tested cervical spine and shoulder knowledge separately, which proved unnecessary, so the present workshop tested lumbosacral spine and hip knowledge together. Secondly, we increased the time available for both the pre- and the posttest, since in the prior workshop a number of residents were unable to finish all of the questions in time. Thirdly, in order to activate residents’ prior knowledge in this domain, we asked them to review a lumbosacral spine and hip script-sorting table (Appendix A) prior to participating in the current workshop.11 Finally, we created a musculoskeletal physical examination refresher guide (Appendix F) to broaden the applicability of this workshop to residents less familiar with some of the maneuvers.

The unique contribution of this workshop is that it utilizes both deductive and inductive clinical reasoning, offers learners practice in toggling from one to the other (a skill necessary for real-world diagnostic reasoning), and at the same time applies this broad skill to a different, concrete domain of medical knowledge. Note that the first learning objective is linked to the deductive form of clinical reasoning, while the second is linked to the inductive form.

Methods

Residents enter our advanced physical medicine and rehabilitation residency program after completing a postgraduate year focusing on basic clinical skills. There are 12 residents in each of the 3 years. The residency curriculum consists, broadly, of workplace-based learning12 during clinical rotations and a weekly core curriculum delivered by faculty when all residents are physically in a single location. The core curriculum includes traditional classroom lectures and various interactive learning formats such as case-based learning, hands-on workshops, and panel discussions. At the beginning of the year, residents join one of three smaller learning groups that are used for all small-group learning activities, such as case-based learning.

This workshop was given once midyear; the assessment data are from that single iteration. The first- through third-year physical medicine and rehabilitation residents who participated had completed approximately 6, 18, and 30 months of the didactic core curriculum and clinical rotations, respectively. Musculoskeletal and sports medicine content accounts for approximately a third of the curriculum and rotations.

This workshop was conducted during the core curriculum time. Four classrooms were used: a large classroom for the entire class and three small-group rooms. A single faculty member conducted all segments of the workshop.

Introduction and Learning Objectives
Prior to the start of the workshop, residents were provided with and asked to review a lumbosacral spine and hip script-sorting table (Appendix A) and a refresher guide (Appendix F). The workshop began with a faculty PowerPoint presentation (Appendix B). After a brief review of the process of clinical reasoning and dual reasoning theory, faculty reviewed the role of semantic qualifiers13 and the workshop structure of toggling between recognizing the diagnosis from an exemplar case vignette and identifying key features14 given a common diagnosis. The presentation also covered the learning objectives as well as the workshop schedule.

Pretest
During the next segment, the entire class took an online pretest (Appendix C) utilizing our institutional learning management system. Pretest questions were of the multiple-choice select-response type15 and were written by the author, drawing on 17 years of clinical and teaching experience in the musculoskeletal content domain. The question stems can be thought of as examples of illness scripts: concise descriptions of the key characteristics of specific medical conditions. The diagnoses were selected from the faculty member’s listing of typical and common outpatient problems seen by physical medicine and rehabilitation residents, compiled over the course of a teaching career. We hypothesized that as the residents read each pretest case vignette, they reflected the details of each case against the illness scripts for all of the conditions listed, searching for the best match (deductive reasoning). Twenty minutes were allocated for the test, based on lessons learned from the previous workshop. Appendix C lists the test questions and answer options in Microsoft Word format; they can be imported into a local learning management system or administered on paper.

During the next two segments (small-group preparation and game time), residents focused on identifying key clinical features from the case vignettes that helped them discriminate between two or more competing diagnoses (inductive reasoning).

Small-Group Preparation
After the pretest, residents joined their small learning groups in separate, smaller classrooms. Residents were provided with the list of 12 diagnoses (Appendix D), and senior (second- and third-year) residents in each group were instructed to train the first-year residents and prepare them for the team competition in the next segment. Residents were encouraged to use any resources and references they needed to accomplish that task.

Game Time/Team Competition
All residents returned to the larger classroom for the next segment of the workshop. First-year residents from each learning group sat in the front row and participated in the competition. Second- and third-year residents were encouraged to cheer for their group but were not allowed to help with answers or hints.

The faculty facilitated the competition. The small learning groups took turns playing and received one point per correct answer. There was no penalty for wrong answers; however, each group was allowed only one answer per question. There were 12 diagnoses, and each of the three groups had four turns.

After specifying which group was playing, the faculty showed one of the exemplar diagnoses (Appendix D) on the screen. The competitors were asked to provide at least two key features from the history and at least two key features from the physical examination. Competitors in the group could discuss their answer, and there was no time restriction. After the group provided its answer, the faculty showed the case vignette with key features highlighted in red (Appendix D) and engaged the class in a brief interactive discussion to both ensure understanding and share clinical examples. This process was repeated for each diagnosis, with the three competing groups taking turns. One of the chief residents kept score, added up the points, and announced the winning group and the runner-up at the end of the segment. We hypothesized that residents used the details gathered during the history and physical examination and worked to identify the key clinical features with the most discriminating power to distinguish between two or more diagnoses under consideration. This can be thought of as working from observations toward an explanatory theory (inductive reasoning).

Posttest
In the next segment, the entire class took an online posttest (Appendix C) utilizing the institutional learning management system. Posttest questions and answer options were identical to those on the pretest. As with the pretest, 20 minutes were allocated; the test may be imported into a local learning management system or administered in a paper-and-pencil format.

Session Evaluation
In this segment, the entire class completed an online session evaluation (Appendix E) consisting of three construct-response questions15 addressing Kirkpatrick’s levels 1 and 2.16 Fifteen minutes were allocated for the session evaluation.

Data Analysis
Data analysis was performed using the Brightspace learning management system by D2L. Only data from residents who completed the entire test were examined.

We did not think that overall test reliability would be meaningful in a test with a small number of items and a relatively small number of participants. Instead, we used each item’s discrimination index and point biserial coefficient to evaluate the quality of the individual test questions. The discrimination index indicates how well a question differentiates between high and low performers. It can range from -100% to 100%, with high values indicating a “good” question and low values indicating a “bad” question.15 Like the discrimination index, the point biserial correlation coefficient relates individuals’ quiz scores to whether or not they answered a question correctly. It ranges from -1.00 to 1.00, with high values indicating a “good” question and low values indicating a “bad” question.15 Another factor in selecting individual item analysis was that our learning management system calculated the point biserial and discrimination index as part of its test output. Additionally, we calculated pre- and posttest means and standard deviations for the class as a whole.
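For readers without access to these statistics in their own learning management system, both can be computed directly from a matrix of dichotomous (0/1) item scores; the point biserial is simply the Pearson correlation between the 0/1 item score and the total test score. The Python sketch below is illustrative only, not the Brightspace implementation: the function name, the upper/lower 27% grouping convention for the discrimination index, and the use of the full (uncorrected) total score are our assumptions.

```python
import numpy as np

def item_statistics(scores: np.ndarray, item: int) -> tuple[float, float]:
    """Discrimination index (as a percentage) and point biserial for one item.

    scores: examinees-by-items matrix of dichotomous (0/1) item scores.
    item:   column index of the item being analyzed.
    """
    totals = scores.sum(axis=1)   # each examinee's total score
    correct = scores[:, item]     # 0/1 responses on this item

    # Discrimination index: proportion correct in the top-scoring group
    # minus proportion correct in the bottom-scoring group, here using
    # the common upper/lower 27% convention.
    order = np.argsort(totals)
    n = max(1, round(0.27 * len(totals)))
    lower, upper = order[:n], order[-n:]
    discrimination = (correct[upper].mean() - correct[lower].mean()) * 100

    # Point biserial: Pearson correlation between the dichotomous item
    # score and the total score; undefined (nan) if everyone answered alike.
    point_biserial = np.corrcoef(correct, totals)[0, 1]

    return discrimination, point_biserial
```

Some texts compute a corrected point biserial that excludes the analyzed item from the total score; with only 12 items, that correction can noticeably lower the coefficient.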

Results

Twenty-nine residents participated in the workshop. Within the allocated time of 20 minutes, all 29 residents were able to complete the pretest, and 28 were able to complete the posttest.

Pretest questions were of reasonable quality for a 12-item test, with a discrimination index average of 12.5% and a point biserial average of 0.6. The pretest class average was 96.6%, with a standard deviation of 6.1. The posttest class average was 99.7%, with a standard deviation of 1.6.

Twenty-nine residents completed the online session evaluation. A majority of residents (85.7%) indicated that they either “liked” the workshop or “liked it a lot.” Most (82.9%) felt they were better or much better at recognizing common lumbosacral spine and hip problems as a result of the workshop. Participants provided a number of narrative comments and recommendations:

  • “Include treatment as well as history and physical exam findings.”
  • “Incorporate management.”
  • “Make it more level specific. Second-year residents recognize SSX, third-year residents recognize diagnosis/imaging/classic tests, and fourth-year residents teach management.”
  • “Perhaps we can demonstrate special tests (ones that we did not demonstrate) including proper hand placement and techniques as well as sensitivity and specificity.”
  • “Forcing third and fourth-years to participate in answering questions publicly might make everyone participate more than just the second-years.”
  • “More stimulation for fourth-year residents.”
  • “Probably review of management and treatment protocols.”
  • “It was great. I would just add a little bit regarding management of discussed diagnoses.”
  • “Questions could be harder, add management to quiz.”
  • “I love the format, but there needs to be third and fourth-year level questions too.”
  • “It’s good to go over the basics but it’s also good to get deeper to involve the upper classmen.”

Discussion

As clinician-teachers, we cannot simply transfer our customized collections of illness scripts into the minds of residents. Thus, this workshop was designed to help them build and fine-tune their own sets of scripts. It is also probably safe to assume that learning to utilize both deductive and inductive reasoning is a helpful addition to the residents’ clinical reasoning toolbox. This brief workshop gave resident physicians practice in engaging and toggling between both modes of information processing using exemplar case vignettes.

In creating the workshop, we decided to incorporate most of the features of our previously described program.9 However, we made four important changes. The first was to combine the multiple-choice tests into one pretest and one posttest. Next, we increased the time available for both tests. Then, we provided residents with a lumbosacral spine and hip script-sorting table prior to the workshop. Finally, we included a musculoskeletal examination refresher resource. Arguably, an additional “silent” change was the 4 months that passed between the cervical spine/shoulder and lumbosacral spine/hip workshops, during which residents continued to care for patients with musculoskeletal conditions while reading and practicing multiple-choice questions.

The fact that the pretest scores were so high can be interpreted in several ways. It may be that the residents’ knowledge of common lumbosacral spine and hip conditions is exceptionally high, higher than it was in the cervical spine and shoulder domain.9 Alternatively, the test questions may have been easier than in the previous workshop, although the point biserial of 0.6 does not support this hypothesis. Yet another possibility is that residents’ applied knowledge in the musculoskeletal domain had grown in the several months between the two workshops. Finally, having the residents review the lumbosacral spine and hip script-sorting table in advance of the workshop may have increased their ability to answer the pretest questions correctly. This, of course, was our intent; however, we did not track residents’ actual compliance with the preworkshop homework.

The design of this educational intervention, as well as its assessment, is still imperfect. It is “philosophically crowded” with elements of a number of educational models, such as team-based learning,17 case-based learning,18 key features assessment,14 script theory,5 and gamification.19 The educational intervention itself is unstructured and depends on each group finding its way through the task successfully. The reliability of the assessment is limited by the small number of items in each test. Additionally, the pre- and posttest questions are identical and were administered within 2 hours of each other, raising the possibility of participants remembering the answers. This is a weakness that future users may want to address by modifying the posttest questions. Another weakness is that both the case vignettes and the questions were written by a single faculty member and may have incorporated biases intrinsic to that individual’s experience. Finally, development and implementation were not resource-free. Design and development required approximately 3 hours of faculty time, most of it spent writing cases and questions and uploading them into the learning management system. Implementation required 3 hours of a single faculty member’s time and four classrooms to accommodate the group of 29 residents.

Educators should be clear regarding the expectations for the two overarching groups of learners who might use these materials. Trainees in the musculoskeletal specialties (e.g., orthopedic surgery, rheumatology, physical medicine and rehabilitation, and sports medicine) will be able to use the workshop with the single focus of enhancing their clinical reasoning skills. Trainees in general adult medicine, on the other hand, may need to concurrently review and practice the musculoskeletal physical examination maneuvers outlined in the included refresher guide (Appendix F).

At the same time, the workshop has merits that warrant consideration. It is relevant to residents in several specialties, including physical medicine and rehabilitation, rheumatology, orthopedic surgery, family medicine, and internal medicine, as well as to senior medical students, physician assistants, and nurse practitioners. It is brief and requires only a single faculty member to implement. It was again liked by the residents, produced an improvement in test scores, and kept most of the residents engaged in learning throughout. It was designed on principles of evidence-based medical education and may enhance the development of a community of learners within a residency program.20

Several lessons were learned in the process of design and implementation. Firstly, the time limit of 20 minutes was more appropriate than the previous limit of 10 minutes, as all residents but one were able to finish the entire test in time. Secondly, significant faculty time was needed to implement the first workshop in the institutional learning management system; specifically, entering the individual pre- and posttest questions was laborious and tedious. For the current workshop, this time was reduced by 5 hours, likely due to the faculty’s increased facility with the learning management system and efficiency in creating questions. Finally, residents continued to like this interactive and structured workshop, although this time they felt the content should be explored in greater depth.


Author Information

  • Alex Moroz, MD: Associate Professor, Department of Rehabilitation Medicine, New York University School of Medicine

Disclosures
None to report.

Funding/Support
None to report.

Ethical Approval
Reported as not applicable.


References

  1. Mamede S, van Gog T, Moura AS, et al. Reflection as a strategy to foster medical students’ acquisition of diagnostic competence. Med Educ. 2012;46(5):464-472. https://doi.org/10.1111/j.1365-2923.2012.04217.x
  2. Schmidt HG, Mamede S, van den Berge K, van Gog T, van Saase JL, Rikers RM. Exposure to media information about a disease can cause doctors to misdiagnose similar-looking clinical cases. Acad Med. 2014;89(2):285-291. https://doi.org/10.1097/ACM.0000000000000107
  3. Sibbald M, de Bruin ABH. Feasibility of self-reflection as a tool to balance clinical reasoning strategies. Adv Health Sci Educ Theory Pract. 2012;17(3):419-429. https://doi.org/10.1007/s10459-011-9320-5
  4. Kyriacou DN. Evidence-based medical decision making: deductive versus inductive logical thinking. Acad Emerg Med. 2004;11(6):670-671. https://doi.org/10.1197/j.aem.2004.02.512
  5. Lubarsky S, Dory V, Audétat MC, Custers E, Charlin B. Using script theory to cultivate illness script formation and clinical reasoning in health professions education. Can Med Educ J. 2015;6(2):e61-e70.
  6. Weinstein A, Pinto-Powell R. Introductory clinical reasoning curriculum. MedEdPORTAL Publications. 2016;12:10370. https://doi.org/10.15766/mep_2374-8265.10370
  7. Norman GR, Monteiro SD, Sherbino J, Ilgen JS, Schmidt HG, Mamede S. The causes of errors in clinical reasoning: cognitive biases, knowledge deficits, and dual process thinking. Acad Med. 2017;92(1):23-30. https://doi.org/10.1097/ACM.0000000000001421
  8. Soares S, Wang H, Siddharthan T, Holt S. OSCE-based teaching of the musculoskeletal exam to internal medicine residents and medical students: neck and spine. MedEdPORTAL Publications. 2015;11:10120. https://doi.org/10.15766/mep_2374-8265.10120
  9. Moroz A. Clinical reasoning workshop: cervical spine and shoulder disorders. MedEdPORTAL Publications. 2017;13:10560. https://doi.org/10.15766/mep_2374-8265.10560
  10. Dolmans DHJM, Tigelaar D. Building bridges between theory and practice in medical education using a design-based research approach: AMEE Guide No. 60. Med Teach. 2012;34(1):1-10. https://doi.org/10.3109/0142159X.2011.595437
  11. Levin M, Cennimo D, Chen S, Lamba S. Teaching clinical reasoning to medical students: a case-based illness script worksheet approach. MedEdPORTAL Publications. 2016;12:10445. https://doi.org/10.15766/mep_2374-8265.10445
  12. Yardley S, Teunissen PW, Dornan T. Experiential learning: AMEE Guide No. 63. Med Teach. 2012;34(2):e102-e115. https://doi.org/10.3109/0142159X.2012.650741
  13. Bordage G, Lemieux M. Semantic structures and diagnostic thinking of experts and novices. Acad Med. 1991;66(9)(suppl):S70-S72. https://doi.org/10.1097/00001888-199109000-00045
  14. Hrynchak P, Takahashi SG, Nayer M. Key-feature questions for assessment of clinical reasoning: a literature review. Med Educ. 2014;48(9):870-883. https://doi.org/10.1111/medu.12509
  15. Downing SM, Yudkowsky R, eds. Assessment in Health Professions Education. New York, NY: Routledge; 2009.
  16. Kirkpatrick DL, Kirkpatrick JD. Evaluating Training Programs: The Four Levels. 3rd ed. San Francisco, CA: Berrett-Koehler Publishers; 2006.
  17. Fatmi M, Hartling L, Hillier T, Campbell S, Oswald AE. The effectiveness of team-based learning on learning outcomes in health professions education: BEME Guide No. 30. Med Teach. 2013;35(12):e1608-e1624. https://doi.org/10.3109/0142159X.2013.849802
  18. Thistlethwaite JE, Davies D, Ekeocha S, et al. The effectiveness of case-based learning in health professional education. A BEME systematic review: BEME Guide No. 23. Med Teach. 2012;34(6):e421-e444. https://doi.org/10.3109/0142159X.2012.680939
  19. Hamari J, Koivisto J, Sarsa H. Does gamification work?—A literature review of empirical studies on gamification. Paper presented at the 47th Hawaii International Conference on System Sciences; January 6-9, 2014; Waikoloa, HI. https://doi.org/10.1109/HICSS.2014.377
  20. Rogoff B. Developing understanding of the idea of communities of learners. Mind Cult Act. 1994;1(4):209-229.


Citation

Moroz A. Clinical reasoning workshop: lumbosacral spine and hip disorders. MedEdPORTAL. 2017;13:10632. https://doi.org/10.15766/mep_2374-8265.10632

Received: June 5, 2017

Accepted: September 1, 2017