Background: Educators need efficient and effective means to track students' clinical experiences to monitor their progress toward competency goals.
Aim: To validate an electronic scoring system that rates medical students' clinical notes for relevance to priority topics of the medical school curriculum.
Method: The Vanderbilt School of Medicine Core Clinical Curriculum enumerates 25 core clinical problems (CCPs) that graduating medical students must understand. Medical students upload clinical notes pertinent to each CCP to a web-based dashboard, but criteria for determining the relevance of a note and consistent uploading practices by students are lacking. The Vanderbilt Learning Portfolio (VLP) system automates both tasks by rating each note's relevance to each CCP and uploading the note to the student's electronic dashboard. We validated this electronic scoring system by comparing the relevance of 265 clinical notes written by third-year medical students to each of the 25 CCPs as scored by VLP versus an expert panel of raters.
Results: We established the threshold score that yielded a 75% positive prediction of relevance, compared with expert opinion, for 16 of the 25 core clinical problems.
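As an illustration of how such a threshold might be chosen, the Python sketch below selects, for a single CCP, the lowest automated score at which relevance calls reach a target positive predictive value against expert ratings. The data, variable names, and selection rule are hypothetical; this is not the authors' implementation.

```python
# Illustrative sketch: pick the lowest score threshold whose positive
# predictive value (PPV) against expert ratings meets a target, for one CCP.
# All names and example data are hypothetical.

def threshold_for_target_ppv(scores, expert_relevant, target_ppv=0.75):
    """Return the lowest threshold whose PPV vs. expert opinion meets target_ppv,
    or None if no threshold qualifies."""
    for threshold in sorted(set(scores)):
        predicted = [s >= threshold for s in scores]
        true_pos = sum(p and e for p, e in zip(predicted, expert_relevant))
        all_pos = sum(predicted)
        if all_pos and true_pos / all_pos >= target_ppv:
            return threshold
    return None

# Hypothetical example: automated scores for five notes and expert judgments.
scores = [0.12, 0.45, 0.67, 0.81, 0.93]
expert_relevant = [False, False, True, True, True]
print(threshold_for_target_ppv(scores, expert_relevant))  # -> 0.45
```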
Discussion: Automated scoring of students' clinical notes provides a novel, efficient, and standardized means of tracking students' progress toward institutional competency goals.