Clinical experience and examination performance: Is there a correlation?

Gary L. Beck, Mihaela T. Matache, Carrie Riha, Katherine Kerber, Fredrick A. McCurdy

Research output: Contribution to journal › Article

7 Citations (Scopus)

Abstract

Context: The Liaison Committee on Medical Education (LCME) requires there to be: '...comparable educational experiences and equivalent methods of evaluation across all alternative instructional sites within a given discipline'. It is an LCME accreditation requirement that students encounter similar numbers of patients with similar diagnoses. However, previous empirical studies have not shown a correlation between the numbers of patients seen by students and performance on multiple-choice examinations. Objective: This study examined whether student exposure to patients with specific diagnoses predicts performance on multiple-choice examination questions pertaining to those diagnoses. Methods: The Department of Pediatrics at the University of Nebraska Medical Center has collected patient logbooks from clerks since 1994. These contain information on patient demographics and students' roles in patient care. During week 7 of an 8-week course, students took an examination intended to help them prepare for their final examination. Logbooks and pre-examination questions were coded using standard ICD-9 codes. Data were analysed using Minitab statistical software to determine dependence between patient encounters and test scores. Subjects comprised a convenience sample of students who completed the clerkship during 1997-2000. Results: Our analysis indicates that performance on a multiple-choice examination is independent of the number of patients seen. Conclusions: Our data suggest knowledge-based examination performance cannot be predicted by the volume of patients seen. Therefore, too much emphasis on examination performance in clinical courses should be carefully weighed against clinical performance to determine the successful completion of clerkships.

Original language: English (US)
Pages (from-to): 550-555
Number of pages: 6
Journal: Medical Education (ISSN 0308-0110), Wiley-Blackwell
Volume: 41
Issue number: 6
DOI: 10.1111/j.1365-2923.2007.02764.x
PMID: 17518834
State: Published - Jun 1 2007

Keywords

  • Ambulatory care
  • Clinical clerkship/*standards
  • Community medicine/education
  • Educational measurement/*methods
  • Nebraska
  • Paediatrics/*education
  • Professional-patient relations
  • Teaching materials

ASJC Scopus subject areas

  • Education

Cite this

Beck, G. L., Matache, M. T., Riha, C., Kerber, K., & McCurdy, F. A. (2007). Clinical experience and examination performance: Is there a correlation? Medical Education, 41(6), 550-555. https://doi.org/10.1111/j.1365-2923.2007.02764.x