
Predoctoral Dental Education

Dental Students' Clinical Experience Across Three Successive Curricula at One U.S. Dental School

Journal of Dental Education, 81(4), April 2017. ISSN 0022-0337. DOI: 10.21815/jde.016.010. eScholarship permalink: https://escholarship.org/uc/item/31k8n4x4.

Joel M. White, DDS, MS; Larry E. Jenson, DDS, MA; Stuart A. Gansky, MS, DrPH; Cameron J. Walsh, BS; Brent T. Accurso, DDS, MPH; Ram M. Vaderhobli, BDS, MS; Elsbeth Kalenderian, DDS, MPH, PhD; Muhammad F. Walji, PhD; Jing Cheng, MD, MS, PhD

Abstract: As dental schools continue to seek the most effective ways to provide clinical education for students, it is important to track the effects that innovations have on students' clinical experience to allow for quantitative comparisons of various curricula. The aim of this study was to compare the impact of three successive clinical curricula on students' experience at one U.S. dental school. The three were a discipline-based curriculum (DBC), a comprehensive care curriculum (CCC), and a procedural requirement curriculum plus externships (PRCE). Students' clinic experience data from 1992 to 2013 were analyzed for total experience and in five discipline areas. Clinic experience metrics analyzed were patient visits (PVs), relative value units (RVUs), and equivalent amounts (EQAs). A minimum experience threshold (MET) and a high experience threshold (HET) were set at one standard deviation below and above the mean for the DBC years, respectively. Students below the MET were designated as low achievers; students above the HET were designated as high achievers.
The results showed significant differences among the three curricula in almost all areas of comparison: total PVs, total EQAs, total RVUs, RVUs by discipline, and number of high and low achievers in total clinical experience and by discipline. The comprehensive care approach to clinical education did not negatively impact students' clinical experience and in many cases enhanced it. The addition of externships also enhanced students' total clinical experience, although more study is needed to determine their effectiveness. The insights provided by this study suggest that the methodology used, including the metrics of PVs, EQAs, and RVUs, may be helpful for other dental schools in assessing students' clinical experience.

Dr. White is Professor, Department of Preventive and Restorative Dentistry, University of California, San Francisco School of Dentistry; Dr. Jenson is former Health Sciences Clinical Professor, University of California, San Francisco School of Dentistry; Dr. Gansky is Professor and Lee Hysan Chair of Oral Epidemiology, Division of Oral Epidemiology and Dental Public Health, and Vice-Chair for Research, Department of Preventive and Restorative Dentistry, University of California, San Francisco School of Dentistry; Mr. Walsh is a fourth-year dental student, University of California, San Francisco School of Dentistry; Dr. Accurso is with Oral Pathology Consultants, Grosse Pointe Woods, MI; Dr. Vaderhobli is Associate Professor, Department of Preventive and Restorative Dentistry, University of California, San Francisco School of Dentistry; Dr. Kalenderian is Professor and Gladys and Leland Barber Chair of Department of Preventive and Restorative Dentistry, University of California, San Francisco School of Dentistry; Dr. Walji is Professor and Associate Dean for Technology Services and Informatics, University of Texas School of Dentistry at Houston; and Dr.
Cheng is Professor, Department of Preventive and Restorative Dentistry, University of California, San Francisco School of Dentistry. Direct correspondence to Dr. Joel White, University of California, San Francisco School of Dentistry, 707 Parnassus Avenue, Box 0758, San Francisco, CA 94143-0758; 415-476-0918; whitej@dentistry.ucsf.edu. Keywords: dental education, clinical curriculum, clinical education, clinical skills, clinical competence, instructional models, teaching methods, comprehensive care, externships. Submitted for publication 3/30/16; accepted 9/8/16; doi: 10.21815/JDE.016.010

Like many U.S. dental schools over the last 25 years, the University of California, San Francisco (UCSF) School of Dentistry has employed various clinical curricula to ensure students obtain adequate clinical experience and attain required competencies for graduation. Dental schools have introduced such innovative curriculum and assessment formats as comprehensive care,1-6 case completion,7,8 competency examinations,9-12 portfolios,13,14 and community-based, off-campus externships.15-17 UCSF has had three major clinical curricular changes since 1992, moving from a discipline-based curriculum to a comprehensive care curriculum to a procedural requirement curriculum plus externships. The aim of this study was to compare the impact of those three curricula on students' clinical experience from 1992 to 2013. We wanted to know each graduating class's overall clinical experience and the variations resulting from the different curricula. In this study, clinical curricula included all third- and fourth-year patient care courses, course requirements, and the manner and environment in which requirements were achieved.

Clinical Curricula at UCSF

For much of their histories, most U.S.
dental schools utilized a similar clinical curriculum that set course requirements by numbers of individual procedures completed and required those procedures to be taught within specific clinical disciplines (general restorative dentistry, periodontics, removable prosthodontics, fixed prosthodontics, oral surgery, and endodontics), often with designated courses, clinics, and faculty for each discipline. Overall patient care was predominantly the responsibility of the school, although students were often expected to manage this care within the clinical structure. Academic philosophies were (and still are in some cases) based on the assumption that the more repetitions completed by a student in a given procedure, the more likely the student was to be competent in that procedure upon graduation. This assumption has no published evidence to support it and, in fact, has published evidence that challenges it.18-21 Schools also assumed, again without published evidence, that discipline specialists were the best faculty members to oversee the procedures in their discipline. We refer to this type of clinical curriculum and educational philosophy as a procedural requirements curriculum (PRC). The relative value unit (RVU) concept, developed in 1984, was a precursor of UCSF's curriculum change. Using RVU calculations, as defined by UCSF, course directors could set requirements on students' total discipline-specific experience; e.g., instead of stipulating ten one-surface amalgams and 12 two-surface composites to pass the course, requirements could be based on the total number of restorative procedures regardless of the number of surfaces or restorative material. In this discipline-based curriculum (DBC), clinical specialty faculty members generally oversaw specific procedures, and separate specialty-based clinical courses constituted the curriculum. The PRC's negative effects on learning and care motivated UCSF's move to a DBC.
Evaluating within-discipline student clinical experience instead of individual procedures mitigated patient "horse-trading" among students in their desperation to find the "right" patient in order to graduate. This behavior, though understandable, often led to poor overall patient care and insufficient student competence in overall patient care management skills. We had four years of DBC data (1992-95) as a baseline for comparing later clinical curriculum changes. Although the DBC appeared to avoid the problems seen with the PRC, administrators and course directors remained concerned with negative effects of this curriculum. In the DBC, students still focused on procedures, and overall patient care was often neglected. Educationally, teaching students to attend to overall and continuing patient care was problematic. Ethically, patient management was less justifiable. Under the DBC, students deemphasized crucial patient care procedures such as periodic oral exams, dental hygiene visits, and prevention visits. Also, administrators noticed that clinic attendance would often drop after students had met the course requirements, thus limiting their clinical experiences. These concerns coalesced into a clinical curriculum focused on caring for patients and educating students more comprehensively. UCSF was in the vanguard of schools in the comprehensive care movement in dental education of the 1990s. UCSF's second major clinical curricular change, introduced in 1996, was thus a move to a comprehensive care curriculum (CCC). The CCC entirely removed discipline-specific procedural requirements, giving students course credit for caring for patients first, with the goal that each student would gain adequate experiential depth and breadth in every discipline by graduation. The change to a CCC was led by the dean and restorative department chair and was implemented by the comprehensive care course directors.
Generalist faculty members could teach students procedures that only clinical specialists previously could, and students were encouraged and rewarded for providing overall patient care. This curriculum, which remained in effect at the school through 2004, sought to balance patient care and educational objectives. Student clinical experience was monitored carefully every year. As there was much initial resistance from the faculty to the CCC change, a quantitative analysis of any differences in students' clinical experience was clearly needed, so research into this question began. In 2005, three major changes occurred in UCSF's clinical curriculum, prompted by a schoolwide curriculum reform process that included a transition from comprehensive care to what is called "patient-centered care" as part of a "stream" in curriculum reform.22 First, CCC was de-emphasized, and clinical course directors reinstated individual procedure counting as in the PRC. Second, specialty courses enabled specialty faculty members to oversee procedures and set individual course requirements as in the PRC. Third, off-site clinical opportunities through externships were added to clinical rotations. These external rotations took place in three-week blocks, one in the third year and two in the fourth year, for a total of nine weeks. This curriculum, called the procedural requirement curriculum plus externships (PRCE), remains at UCSF today. Our data (1992-2013) span the transitions from the DBC to the CCC to the PRCE. With each major curriculum change, UCSF faculty members have been concerned with the impact on the numbers of specific clinical procedures students complete by graduation. Since the traditional notion that procedure repetition yields clinical competence still persists among many faculty and institutions, we felt it was important to investigate the impact of these curricular changes on students' clinical experience.
In particular, the faculty was generally skeptical of the comprehensive care approach to clinical training, and this study was initiated, in large part, to address these concerns. Though comparing such significant changes in educational philosophy is challenging, we believe that the three curricula presented here are distinct enough from each other to allow important inferences to be drawn about historical clinical pedagogical decisions. Focusing on students' experience during their two-year clinical training permits drawing supportable conclusions about each curriculum's impact using the metrics of patient visits, relative value units, and equivalent amounts.

Methods

Institutional Review Board approval was not required for this research as all data were deidentified according to UCSF guidelines. Data for this study were acquired from the school's electronic health record for the combined third- and fourth-year clinical experience for each UCSF graduating class from 1992 through 2013. A total of 303 students in the DBC, 690 students in the CCC, and 711 students in the PRCE were included in the analyses. Three measures were selected to compare students' clinical experience in the three clinical curricula: patient visits (PVs), equivalent amounts (EQAs), and relative value units (RVUs). PVs reflect the number of patient encounters each student had in his or her two-year clinical experience. EQAs are the dollar equivalent of each clinic procedure completed. The cash fee for each procedure listed in the school's fee schedule was used to standardize the dollar value irrespective of reimbursement rate due to a patient's payer source, whether self-pay, insurance, government program, or other reimbursement mechanism. EQAs represent the dollar amount billed for the completed procedure as unadjusted production, applied consistently to all patient treatment.
In addition, these amounts have been standardized to account for fee schedule changes at the school's clinics and for the different fee schedules at off-site clinics over the years. They do not represent actual fees charged or collected by the university. Academically, student activities are captured as EQA dollars to standardize experience in dollars as a metric and eliminate differences in production due to reimbursement mechanisms (patient account type).

RVUs can be determined by any number of factors, including but not limited to complexity, material resources, knowledge, skill, effort, and time. In an educational setting, they are commonly used as a "point system" to determine a student's progress towards fulfillment of clinical course requirements. For example, at UCSF, a cast restoration has an RVU of 10 whereas a two-surface amalgam has an RVU of 3; these values may be different at other schools. When the RVU system was developed at UCSF, lead faculty members calculated an RVU for each item on the fee schedule using their best judgment as to what sort of knowledge, skill, and effort each would demand of a novice practitioner in comparison to all other items on the fee schedule. As in all educational institutions, the impact of other factors had to be considered, such as the number of required steps in the procedure, the amount of paperwork involved in a procedure, and the likely amount of time waiting for an instructor check. To make the system fair, students were to be rewarded for their clinical efforts and not penalized for their luck in finding the right instructor or for the amount of paperwork required for a particular procedure. Each of the factors considered in determining an RVU has an impact on the time required for completion of the procedure relative to every other procedure on the fee schedule.
One would expect that an experienced clinician with a high level of skill and fewer institutional roadblocks would require less time to complete a procedure than would a student clinician. The values are relative but on a continuous ratio scale on which 0 has meaning (requiring no knowledge, skill, or effort) and every other value has some measure of each. For example, a procedure with an RVU of 4 on average would take twice the combined level of skill, knowledge, and effort of a procedure with an RVU of 2. This use of RVUs at UCSF is consistent with research on student experience reported by other dental schools.23-26 The RVUs for particular procedures at UCSF remained constant for the years involved in the study and have remained the same whether a procedure was completed at the school's clinic or at an extramural community-based location. As new clinical procedures have been added over the years, a corresponding RVU value has been assigned to each, based on the anticipated time for completion of the new procedure. Students receive a fee schedule of all procedures at the clinic that includes both the RVUs for each procedure and the amount of time each procedure should take them under ideal circumstances. This has been a valuable way for students to self-monitor their efficiency in the clinic, although efficiency (RVUs per hour) was not a course requirement in any of the years studied. Each of the three clinical curricula assessed in the study marked a major shift in educational and patient care philosophy, as evidenced in clinical course directors' selection of minimum course requirements. PVs have not been used in any of the three curricula as a requirement but offer a commonly understood clinical metric. RVUs by discipline were used in the DBC to set minimum course requirements. Total EQAs and RVUs for each student were used during the CCC years to determine minimum course requirements.
EQAs were used to set minimum course requirements in general dentistry courses but not in the clinical specialty courses for the PRCE. Neither PV nor RVU totals were used to set minimum course requirements during the PRCE years, although these data continued to be collected for all students. The three clinical curricula were compared using three metrics: mean total PVs, mean total EQAs, and mean total RVUs for each graduating class. All data extraction used 2013 specifications, allowing straightforward comparisons without having to adjust for differences in fee schedules, definitions of RVUs, or definitions of PVs. Without data on students' characteristics, we compared the three curricula marginally. For both EQAs and PVs, there were no individual-level student data in the DBC, so two-sample t-tests were used to compare mean EQAs and mean PVs pairwise among the three curricula. The means were calculated as the class two-year experience total divided by the number of students in the class, reported by year of graduation. The standard deviation (SD) for students in the DBC was approximated with two methods, compared as a sensitivity analysis. The first method assumed equal variance, using the CCC SD for the DBC as a conservative bound, since the DBC had smaller EQAs and PVs than the CCC. The second method assumed that the DBC had a similar ratio of mean divided by SD as the CCC or the PRCE and used that ratio to compute the SD for the DBC. Similar results from both methods supported the consistency of the analysis. For RVU comparisons, we analyzed total RVUs, total on-campus RVUs, and total discipline-based RVUs (general restorative, removable prosthodontics, fixed prosthodontics, periodontics, and endodontics) for each student in each curriculum. For students in the DBC and CCC, total on-campus RVUs for each student were the same as the total RVUs because students only had on-campus clinical experience.
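The two SD-approximation methods just described reduce to simple arithmetic. The sketch below is illustrative only (not the authors' code), and all class-level numbers in it are hypothetical:

```python
# Two ways to approximate a missing standard deviation (SD) for the DBC
# from CCC class-level summaries. Hypothetical numbers, not study data.

def sd_equal_variance(sd_ref: float) -> float:
    """Method 1: assume equal variance, i.e., reuse the CCC SD for the DBC."""
    return sd_ref

def sd_same_ratio(mean_dbc: float, mean_ref: float, sd_ref: float) -> float:
    """Method 2: assume the DBC has the same mean/SD ratio as the reference
    curriculum (CCC or PRCE), so SD_DBC = mean_DBC * (SD_ref / mean_ref)."""
    return mean_dbc * sd_ref / mean_ref

# Hypothetical per-class EQA summaries:
mean_ccc, sd_ccc = 1200.0, 180.0
mean_dbc = 1000.0

print(sd_equal_variance(sd_ccc))                  # 180.0
print(sd_same_ratio(mean_dbc, mean_ccc, sd_ccc))  # 1000 * (180/1200) = 150.0
```

Because the DBC had smaller mean EQAs and PVs than the CCC, Method 1 gives the larger (conservative) SD; agreement between the two methods served as the sensitivity check.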
For students in the PRCE, with nine weeks of externship, total on-campus RVUs were computed as total RVUs minus total externship RVUs, multiplied by an on-campus clinic weeks adjustment. The adjustment factor was computed as 1+(9/93), or 1.0968, reflecting the nine weeks off campus out of the 93 weeks on or off campus during the third and fourth years of dental school. For students in the PRCE, we multiplied the total discipline-based RVUs by the adjustment factor to account for externship. We then compared the mean total RVUs, mean total on-campus RVUs, and mean total discipline-based RVUs among the three curricula (DBC, CCC, and PRCE) with analysis of variance (ANOVA). If there was a significant overall difference across curricula, then pairwise comparisons were conducted. In addition, we wanted to know the impact that these clinical curricular changes had on individual student experience by discipline. Using the RVU mean in the DBC years, we developed a minimum experience threshold (MET) and a high experience threshold (HET) set one standard deviation below and above the RVU means, respectively, for the four years of the DBC (1992-95). We took the DBC years as the baseline for our comparisons and computed the HET and MET as the DBC average plus and minus the DBC SD, respectively, and then we evaluated whether each student achieved an amount ≥HET (designating them "high achievers") or ≤MET (designating them "low achievers") for total RVUs, total on-campus RVUs, and total RVUs by discipline, respectively. We applied the MET and HET comparisons to specific clinical discipline areas (general restorative, removable prosthodontics, fixed prosthodontics, periodontics, and endodontics) as well as total clinic experience as expressed in RVUs, EQAs, and PVs. We excluded oral surgery as metrics were not available for that discipline. Also in our analysis, clinical experience obtained at off-site externships was compared to clinical experience at the school's clinics.
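The externship adjustment and the MET/HET classification above amount to a few lines of arithmetic. The sketch below is a minimal illustration (not the authors' code); it assumes the (total minus externship) times adjustment reading of the on-campus formula, and all student-level numbers are hypothetical:

```python
# On-campus RVU adjustment and MET/HET classification, as described in the
# Methods. Illustrative sketch; student values are hypothetical.

ON_CAMPUS_WEEKS_ADJ = 1 + 9 / 93  # nine externship weeks of 93 total, ~1.0968

def adjusted_on_campus_rvus(total_rvus: float, externship_rvus: float) -> float:
    """PRCE students: scale the on-campus portion by the clinic-weeks adjustment."""
    return (total_rvus - externship_rvus) * ON_CAMPUS_WEEKS_ADJ

def classify(student_rvus: float, dbc_mean: float, dbc_sd: float) -> str:
    """MET = DBC mean - SD, HET = DBC mean + SD; label each student."""
    met, het = dbc_mean - dbc_sd, dbc_mean + dbc_sd
    if student_rvus >= het:
        return "high achiever"
    if student_rvus <= met:
        return "low achiever"
    return "typical"

print(round(adjusted_on_campus_rvus(1000.0, 200.0), 2))  # 877.42
print(classify(650.0, dbc_mean=500.0, dbc_sd=100.0))     # high achiever
print(classify(350.0, dbc_mean=500.0, dbc_sd=100.0))     # low achiever
```

Students at or between the two thresholds fall into neither achiever group; the percentages of high and low achievers per class were then compared across curricula with chi-squared tests.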
This analysis could help us understand the impact externships have had on intramural (on-campus) clinic experience following these rotations. The percentages of high achievers and low achievers were computed and compared with chi-squared tests across the three curricula. We analyzed the data for the three curricula at the UCSF dental clinic and externship sites from 1992 to 2013: the DBC from 1992 to 1995, the CCC from 1996 to 2004, and the PRCE from 2005 to 2013. All the analyses were conducted in SAS 9.4 (SAS Institute Inc., Cary, NC, USA). We adopted a Bonferroni-adjusted significance level of 0.0025 or less to account for multiple comparisons.

Results

The results of our analysis showed significant differences among the three curricula studied for all variables with the exception of fixed prosthodontics (Table 1). With externship data included, total mean student clinical experience as measured by PVs, RVUs, and EQAs was much higher in the PRCE. With externship data excluded and on-campus totals adjusted for time spent away at externships, the PRCE produced significantly less clinical experience, as measured by PVs, RVUs, and EQAs, than both the DBC and CCC. In terms of PVs, the PRCE had significantly fewer than the CCC but was not significantly different from the DBC. In terms of RVUs by discipline, the PRCE had significantly higher totals in restorative dentistry than the DBC but lower totals than the CCC.

Table 1. Comparative results of three curricula: procedural requirements curriculum plus externship (PRCE), comprehensive care curriculum (CCC), and discipline-based curriculum (DBC)

Variable                                     Including Externship    Excluding Externship (On-Campus)
Patient visits                               PRCE > CCC > DBC        CCC > PRCE ≈ DBC
Equivalent amounts                           PRCE > CCC > DBC        CCC > DBC > PRCE
Relative value units (RVUs)                  PRCE > CCC > DBC        DBC ≈ CCC > PRCE
RVUs: restorative                            NA                      CCC > PRCE > DBC
RVUs: removable prosthodontics               NA                      DBC > CCC > PRCE
RVUs: fixed prosthodontics                   NA                      DBC ≈ CCC ≈ PRCE
RVUs: periodontics                           NA                      CCC > DBC > PRCE
RVUs: endodontics                            NA                      CCC > DBC > PRCE
High achievers: total                        PRCE > CCC > DBC        CCC > DBC ≈ PRCE
High achievers: restorative                  NA                      CCC > PRCE > DBC
High achievers: removable prosthodontics     NA                      DBC > CCC > PRCE
High achievers: fixed prosthodontics         NA                      CCC ≈ DBC ≈ PRCE
High achievers: periodontics                 NA                      CCC > DBC > PRCE
High achievers: endodontics                  NA                      CCC > DBC > PRCE
Low achievers: total                         PRCE < CCC ≈ DBC        CCC ≈ DBC > PRCE
Low achievers: restorative                   NA                      CCC < PRCE ≈ DBC
Low achievers: removable prosthodontics      NA                      DBC < CCC < PRCE
Low achievers: fixed prosthodontics          NA                      NP
Low achievers: periodontics                  NA                      DBC ≈ CCC < PRCE
Low achievers: endodontics                   NA                      DBC < CCC < PRCE

Notes: Symbols > (greater than) and < (less than) indicate statistically significant differences (p≤0.05). Symbol ≈ indicates a difference that is not statistically significant (p>0.05). NA=data not available for RVUs or for comparisons of high and low achievers by discipline with externship. NP=because the DBC fixed prosthodontics data had such high variability, the mean minus one SD was less than zero, so only students with zero fixed prosthodontics RVUs were considered low achievers; thus, statistical comparisons were not possible because there was no variability for this low achiever group (they were all zero RVUs).
In all other disciplines, the PRCE had significantly lower totals than both the DBC and CCC. In terms of the number of high achievers in total clinical experience, the PRCE was significantly higher than both the DBC and CCC when externship data were included. Without externship data, the CCC was significantly higher than both the DBC and PRCE. By discipline, the high achievers’ totals showed that the CCC was significantly higher than both the DBC and PRCE in all areas except removable prosthodontics. In that discipline, the DBC was significantly higher than the CCC, and the CCC was significantly higher than the PRCE. In terms of the number of low achievers in total clinical experience, the PRCE was significantly lower than both the DBC and CCC when externship data were included. Without externship data, both the DBC and CCC were lower than the PRCE. By discipline, low achievers’ totals showed that the PRCE was significantly higher than both the DBC and CCC except in restorative dentistry where it was not significantly different from the DBC but higher than the CCC. Because the DBC fixed prosthodontics data had such high variability, the mean minus one SD was less than zero, so only students with zero fixed prosthodontics RVUs were considered low achievers; thus, statistical comparisons were not possible because there was no variability for this low achiever group (they were all zero RVUs). Figure 1 shows the mean patient visits (PVs) per student, per graduating class, by curriculum from all sources including and excluding externships. There were significant differences among all three curricula when externships were included, and there were significant differences between two curricula (CCC and PRCE) when externships were excluded. All differences are summarized in Table 1. 
Regarding the mean EQAs per student, per graduating class, by curriculum from all sources, there were significant differences among all three curricula when externships were both included and excluded (Figure 2). Regarding mean RVUs per student, per graduating class, by curriculum from all sources, there were significant differences among all three curricula when externships were included and significant differences between two curricula when externships were excluded (Figure 3). Regarding mean RVUs by discipline by curriculum, there were significant differences among all three curricula in all disciplines except fixed prosthodontics, where the sample size and variance were too small for comparison (Figure 4).

Figure 1. Mean patient visits per student per year, by curriculum (DBC=discipline-based curriculum, CCC=comprehensive care curriculum, PRC=procedural requirements curriculum, E=externship)
Figure 2. Mean equivalent amount per student per year, by curriculum
Figure 3. Mean relative value units per student per year, by curriculum
Figure 4. Mean relative value units by discipline, by curriculum (PRCE=procedural requirements curriculum plus externship)

Figure 5 shows both the mean number of high achievers by discipline by curriculum and the number of high achievers for total experience when externship data were excluded. There were significant differences on this measure among all three curricula. Figure 6 shows both the mean number of low achievers by discipline by curriculum and the number of low achievers for total experience when externship data were excluded.
There were significant differences on this measure among all three curricula.

Figure 5. Mean number of high achievers above the high experience threshold (HET) by discipline, by curriculum (DBC=discipline-based curriculum, CCC=comprehensive care curriculum, PRCE=procedural requirements curriculum plus externship)
Figure 6. Mean number of low achievers below the minimum experience threshold (MET) by discipline, by curriculum

Discussion

The results of our study provide solid quantitative insight into the effects that different clinical curricula have had on the clinical experience of UCSF dental school graduates between 1992 and 2013. Significant differences among curricula existed for almost all factors measured.

Effect of Externships

The addition of off-campus externships in 2005 greatly increased the mean clinical experience of students. This finding is consistent with research on externships at other dental schools.15,16,19 Our externships also had the effect of significantly reducing the amount of on-campus clinical experience. While one would expect a decrease due to the amount of time spent off campus, the decrease exceeded the proportional time spent away. From a quantitative standpoint, it is unfortunate that we did not have student totals by discipline while on externships. Looking at the total mean RVUs including externships, we can certainly see that students were busy with patient care, although the nature of those activities remains unknown at this point. It is unlikely, however, that students completed many, if any, removable or fixed prosthodontic procedures, due to the multi-appointment nature of those procedures. It should be noted that several distinctive and important qualities of the externship experience do not show up in our data. These include the following: 1) student exposure to rural and underserved populations; 2) student exposure to a busy practice setting; 3) student exposure to working with allied dental personnel and/or performing four-handed dentistry; and 4) student exposure to faculty members, philosophies of care, methods, and materials that are different from those they encounter on campus.

Comparisons of Three Curricula

It is clear that overall student production, measured in EQAs, increased in the CCC years as compared to the DBC and PRCE years. One possible explanation is that students took more responsibility for their education once the focus on individual procedures was removed. Our data suggest that the move to comprehensive care in 1996, eliminating individual procedure or discipline-based clinical course requirements, generally had no negative effect on students' clinical experience. On the contrary, allowing students to focus on patient care instead of specific procedures seems to have significantly increased their clinical experience in all but a few areas. We note that students' experience in removable prosthodontics did decrease during the CCC years, and this trend continued throughout the PRCE years as well. Several factors could account for this decline. Students may have avoided more complex procedures after the DBC years because they could still graduate by focusing on less complicated tasks; however, RVUs were adjusted for complexity, and we think it unlikely that students would have found it easier to pursue a large number of simple procedures in order to make up for the difference in RVU values.
Another possible explanation is that patient populations changed substantially over the years, with less demand for removable prosthodontics due to either oral condition or economic factors. We should also note that, in the case of fixed prosthodontic experience, RVUs could be earned for either single-unit crowns or multiple-unit bridges. As these two were not distinguished in our data, we cannot tell how many students performed crowns and how many performed bridges. The same ambiguity exists for removable prosthodontic experience: as students could earn removable prosthodontic RVUs for either full or partial dentures, we do not know how many students were able to graduate with no full denture experience or no partial denture experience.

Turning to the high and low achiever results, we note that the percentage of high achievers in each discipline area generally increased in the CCC and the percentage of low achievers decreased in the CCC as compared to the other two curricula. Consistent with the finding that the mean removable prosthodontic experience decreased from the DBC to the CCC and then even more so with the PRCE, the number of low achievers in removable prosthodontics increased, and the number of high achievers decreased. With the exception of restorative experience, the number of high achievers decreased and the number of low achievers increased in all other disciplines in the PRCE.

Implications and Directions for Future Research

Currently, dental schools have only a few ways to compare the clinical experience of their graduates to those in other programs. The American Dental Association (ADA) compares dental schools on a recurring basis on metrics that include hours of instruction and revenue produced.27,28 The problems with utilizing these data in any meaningful way are that each school often defines the metrics itself and that no consistent methodology appears to be applied across all schools.
Even within a particular school, reporting methodologies can vary from year to year depending on administrative preferences. For instance, the UCSF student dental clinic revenue per enrollment over a four-year period varied by as much as 30%, with rankings between 14 and 33 among dental schools. Revenue does not equate to clinical experience, as it is tied to payment and can vary by patient reimbursement method (cash, insurance, general assistance). Therefore, for purposes of evaluating clinical experience, the ADA surveys are of limited value, although they provide insight in school-to-school comparisons.

A limitation of our study is that our data were for only one U.S. dental school, so our results are not generalizable to other schools. However, we feel that the metrics used in this study (PVs, EQAs, and RVUs) could be of great value for comparisons among schools nationwide if these data were collected at every school. Comparisons between our institution and other dental institutions could easily be made with the capabilities of modern dental informatics using the same scripts in the electronic health record. This type of interinstitutional sharing of information is already happening through the electronic health records at a limited number of schools.29,30

Other limitations of this study include not measuring such things as patient satisfaction, patient care outcomes, or student and faculty perceptions of the different curricula, which are all important qualitative aspects of clinical education that are worthy of future evaluation. In addition, as the use of community-based dental education (externships) continues and increases, further exploration of both the quantity and quality of that type of clinical education using the metrics described here might provide salient information to guide curriculum development and refinement.
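The MET/HET approach used in this study, with thresholds set one standard deviation below and above the mean of a baseline (DBC-era) cohort, is simple enough for any school to reproduce from per-student totals. As a minimal sketch, in Python with entirely hypothetical RVU totals and a hypothetical `classify_students` helper:

```python
from statistics import mean, stdev

def classify_students(baseline_rvus, student_rvus):
    """Classify students against experience thresholds.

    MET (minimum experience threshold) and HET (high experience
    threshold) are one standard deviation below and above the mean
    of a baseline cohort, per the approach described in the study.
    All data used here are hypothetical.
    """
    m = mean(baseline_rvus)
    sd = stdev(baseline_rvus)
    met, het = m - sd, m + sd
    low = [s for s, rvu in student_rvus.items() if rvu < met]   # low achievers
    high = [s for s, rvu in student_rvus.items() if rvu > het]  # high achievers
    return met, het, low, high

# Hypothetical RVU totals: a baseline (DBC) cohort and a later cohort
baseline = [90, 100, 110, 95, 105]
cohort = {"A": 80, "B": 104, "C": 125}
met, het, low, high = classify_students(baseline, cohort)
# Student A falls below the MET; student C exceeds the HET.
```

The same classification could be run per discipline (restorative, removable prosthodontics, and so on) by supplying discipline-specific RVU totals.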
Meaningful discussion of what is and is not working for a school with regard to clinical experience could help guide curriculum and clinical planning and no doubt help to improve educational objectives.

Conclusion

The results of our study suggest that the metrics of PVs, RVUs, and EQAs were reasonably useful for measuring students' experience in one dental school's clinical curriculum and therefore present the opportunity to create uniform metrics for measuring students' clinical experience across dental schools. We found that the MET and HET analyses were also reasonably useful for evaluating clinical experience, providing further insight into the clinical experience of individual students. In general, dental schools ought to continue their laudable history of innovation in clinical curricula; however, they should also track the effects these innovations have on clinical experience.

This study was primarily undertaken to address concerns that our faculty had about the concept of comprehensive care. We conclude from our analysis that comprehensive care had little or no negative effect on clinical experience overall as compared to the other two curricula, nor did it negatively impact the number of high and low achievers in each graduating class. On the contrary, by many measures, the comprehensive care curriculum significantly enhanced students' procedural experience. Finally, our study showed that the addition of externships to the curriculum can greatly enhance overall clinical experience, although more research is needed to determine the nature and effectiveness of this type of clinical setting.

We believe that, beyond ensuring a minimal number of procedures for each student, there are other, more important considerations in designing a clinical curriculum.
These include creation of a better and more supportive learning environment that emphasizes competence; optimization of faculty talents and facility resources; and incorporation of evidence-based diagnosis, risk assessment and treatment planning, and service to underserved populations. In all cases, we should focus on treating patients and students in the most ethical and humane manner possible.

Acknowledgments

We gratefully acknowledge the UCSF School of Dentistry Network Information Systems team for painstakingly extracting and translating the student experience data from the electronic health record over the many years of this project. We also thank Exan Corporation for development and implementation of the relative value unit metrics and for assistance in identification of appropriate data fields from the data dictionary of the axiUm electronic health record.

REFERENCES
1. Dehghan M, Harrison J, Langham S, et al. Comparing comprehensive care and departmental clinical education models: students' perceptions at the University of Tennessee College of Dentistry. J Dent Educ 2015;79(2):133-9.
2. Dodge WW, Dale RA, Hendricson WD. A preliminary study of the effect of eliminating requirements on clinical performance. J Dent Educ 1993;57(9):667-72.
3. Evangelidis-Sakellson V. Student productivity under requirement and comprehensive care systems. J Dent Educ 1999;63(6):407-13.
4. Hicks JL, Dale RA, Hendricson WD, Lauer WR. Effects of reducing senior clinical requirements. J Dent Educ 1985;49(3):169-75.
5. Holmes DC, Boston DW, Budenz AW, Licari FW. Predoctoral clinical curriculum models at U.S. and Canadian dental schools. J Dent Educ 2003;67(12):1302-11.
6. Holmes DC, Trombly RM, Garcia LT, et al. Student productivity in a comprehensive care program without numeric requirements. J Dent Educ 2000;64(11):745-54.
7. Park SE, Susarla HK, Nalliah R, et al. Does a case completion curriculum influence dental students' clinical productivity? J Dent Educ 2012;76(5):602-8.
8. Park SE, Timothé P, Nalliah R, et al. A case completion curriculum for clinical dental education: replacing numerical requirements with patient-based comprehensive care. J Dent Educ 2011;75(11):1411-6.
9. Badner V, Ahluwalia KP, Murrman MK, et al. A competency-based framework for training in advanced dental education: experience in a community-based dental partnership program. J Dent Educ 2010;74(2):130-9.
10. Licari FW, Chambers DW. Some paradoxes in competency-based dental education. J Dent Educ 2008;72(1):8-18.
11. Lipp MJ. A process for developing assessments and instruction in competency-based dental education. J Dent Educ 2010;74(5):499-509.
12. Taleghani M, Solomon ES, Wathen WF. Non-graded clinical evaluation of dental students in a competency-based education program. J Dent Educ 2004;68(6):644-55.
13. Gadbury-Amyot CC, McCracken MS, Woldt JL, Brennan R. Implementation of portfolio assessment of student competence in two dental school populations. J Dent Educ 2012;76(12):1559-71.
14. Gadbury-Amyot CC, McCracken MS, Woldt JL, Brennan RL. Validity and reliability of portfolio assessment of student competence in two dental school populations: a four-year study. J Dent Educ 2014;78(5):657-67.
15. Arevalo O, Saman DM, Rohall V. Measuring clinical productivity in community-based dental education programs. J Dent Educ 2011;75(9):1200-7.
16. Knight GW. Community-based dental education at the University of Illinois at Chicago. J Dent Educ 2011;75(10 Suppl):S14-20.
17. Mashabi S, Mascarenhas AK. Impact of community externships on the clinical performance of senior dental students. J Dent Educ 2011;75(10 Suppl):S36-41.
18. Chambers DW. Preliminary evidence for a general competency hypothesis. J Dent Educ 2001;65(11):1243-52.
19. Chambers D. Learning curves: what do dental students learn from repeated practice of clinical procedures? J Dent Educ 2012;76(3):291-302.
20. Stacey MA, Morgan MV, Wright C. The effect of clinical targets on productivity and perceptions of clinical competency. J Dent Educ 1998;62(6):409-14.
21. Spector M, Holmes DC, Doering JV. Correlation of quantity of dental students' clinical experiences with faculty evaluation of overall clinical competence: a twenty-two-year retrospective investigation. J Dent Educ 2008;72(12):1465-71.
22. Ryder MI, Sargent P, Perry D. Evolution and revolution: the curriculum reform process at UCSF. J Dent Educ 2008;72(12):1516-30.
23. Perez FA, Allareddy V, Howell H, Karimbux N. Comparison of clinical productivity of senior dental students in a dental school teaching clinic versus community externship rotations. J Dent Educ 2010;74(10):1125-32.
24. Bean CY, Rowland ML, Soller H, et al. Comparing fourth-year dental student productivity and experiences in a dental school with community-based clinical education. J Dent Educ 2007;71(8):1020-6.
25. Arevalo O, Saman DM, Rohall V. Measuring clinical productivity in community-based dental education programs. J Dent Educ 2011;75(9):1200-7.
26. Teich ST, Roperto R, Alonso AA, Lang LA. Design and outcomes of a comprehensive care experience level system to evaluate and monitor dental students' clinical progress. J Dent Educ 2016;80(6):662-9.
27. American Dental Association, Health Policy Institute. Survey of dental education series, report 4: curriculum, 2015-16. At: www.ada.org/~/media/ADA/Science%20and%20Research/HPI/Files/2015-16_SDE4_final.xlsx?la=en. Accessed 9 Feb. 2017.
28. American Dental Association, Health Policy Institute. Survey of dental education series, report 3: finances, 2015-16. At: www.ada.org/~/media/ADA/Science%20and%20Research/HPI/Files/2015-16_SDE3_final.xlsx?la=en. Accessed 9 Feb. 2017.
29. Tokede O, White J, Stark PC, et al. Assessing use of a standardized dental diagnostic terminology in an electronic health record. J Dent Educ 2013;77(1):24-36.
30. Schleyer TK, Thyvalikakath TP, Spallek H, et al. From information technology to informatics: the information revolution in dental education. J Dent Educ 2012;76(1):142-53.