A developmental assessment of clinical reasoning in preclinical medical education

Alice A. Min Simpkins, Bryna Koch, Karen Spear-Ellinwood, Paul A St John

Research output: Contribution to journal › Article

Abstract

Background: Clinical reasoning is an essential skill to be learned during medical education. A developmental framework for the assessment and measurement of this skill has not yet been described in the literature.

Objective: The authors describe the creation and pilot implementation of a rubric designed to assess the development of clinical reasoning skills in pre-clinical medical education.

Design: The multi-disciplinary course team used Backwards Design to develop course goals, objectives, and assessments for a new Clinical Reasoning Course. The team focused on behaviors that students were expected to demonstrate, identifying each as a ‘desired result’ element and aligning these with three levels of performance: emerging, acquiring, and mastering.

Results: The first draft of the rubric was reviewed and piloted by faculty using sample student entries; this provided feedback on ease of use and appropriateness. After the first semester, the course team evaluated whether the rubric distinguished between different levels of student performance in each competency. Descriptive analysis of mid- and end-of-semester assessments revealed that over half the students received higher competency scores at the end of the semester than at mid-semester.

Conclusion: The assessment rubric allowed students in the early stages of clinical reasoning development to understand their trajectory and provided faculty a framework from which to give meaningful feedback. The multi-disciplinary background of the course team supported a systematic and robust course and assessment design process. The authors strongly encourage other colleges to support the use of collaborative and multi-disciplinary course teams.

Original language: English (US)
Article number: 1591257
Journal: Medical Education Online
Volume: 24
Issue number: 1
DOI: 10.1080/10872981.2019.1591257
ISSN: 1087-2981
PubMed ID: 30935299
State: Published - Jan 1 2019


Keywords

  • assessment
  • Clinical reasoning
  • medical education

ASJC Scopus subject areas

  • Education

Cite this

@article{064a1a46b28644e79df64151af7436f9,
  title     = "A developmental assessment of clinical reasoning in preclinical medical education",
  author    = "{Min Simpkins}, {Alice A.} and Bryna Koch and Karen Spear-Ellinwood and {St John}, {Paul A}",
  journal   = "Medical Education Online",
  year      = "2019",
  month     = "1",
  day       = "1",
  volume    = "24",
  number    = "1",
  doi       = "10.1080/10872981.2019.1591257",
  issn      = "1087-2981",
  publisher = "Co-Action Publishing",
  language  = "English (US)",
  keywords  = "assessment, Clinical reasoning, medical education",
}
