Determining differences in user performance between expert and novice primary care doctors when using an electronic health record (EHR)

Martina A. Clarke, Jeffery L. Belden, Min Soon Kim

Research output: Contribution to journal › Article

15 Citations (Scopus)

Abstract

Rationale, aims and objectives: The goal of this study was to determine usability gaps between expert and novice primary care doctors when using an electronic health record (EHR).

Methods: Usability tests based on video analysis, with a triangulation approach, were conducted to analyse usability gaps between 10 novice and seven expert doctors. Doctors completed 19 tasks, using a think-aloud strategy, based on an artificial but typical patient visit note. Each usability session lasted approximately 20 minutes, with only the participant and the facilitator present in the testing room. A mixed-methods approach was used, comprising four sets of performance measures, the System Usability Scale (SUS) and a debriefing session with each participant.

Results: While most expert doctors completed tasks more efficiently and rated the system higher on the SUS than novice doctors did (novice 68, expert 70, where 100 is a perfect score), percent task success rates across all 19 tasks were comparable (74% for the expert group, 78% for the novice group, P = 0.98).

Conclusion: The study found that expertise did not track experience: although the expert doctors had been using the system longer, their proficiency did not increase with EHR experience. These results may inform improvements to the EHR training programme, which may increase doctors' performance when using an EHR, and may assist EHR vendors in improving the user interface, helping to reduce errors caused by poor usability of the system.
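
For context on the two quantitative measures reported above, the sketch below shows how a SUS score and a percent task success rate are typically computed. The SUS arithmetic is the standard Brooke (1996) formula; the per-doctor success counts and the choice of a Mann-Whitney U test for the group comparison are hypothetical stand-ins, since this record does not state which test produced P = 0.98.

```python
"""Minimal sketch of the study's two quantitative measures.

SUS scoring follows Brooke's standard formula; the per-doctor data and
the Mann-Whitney U comparison are illustrative assumptions only.
"""
from scipy.stats import mannwhitneyu


def sus_score(responses):
    """Score one SUS questionnaire: ten 1-5 Likert responses.

    Odd-numbered items contribute (response - 1), even-numbered items
    contribute (5 - response); the sum times 2.5 gives a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    return 2.5 * sum((r - 1) if i % 2 == 0 else (5 - r)
                     for i, r in enumerate(responses))  # i == 0 is item 1


def percent_task_success(successes, total_tasks=19):
    """Percentage of the 19 study tasks a doctor completed successfully."""
    return 100.0 * successes / total_tasks


# Sanity check: a maximally positive questionnaire scores 100.
assert sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]) == 100.0

# Hypothetical success counts out of 19 tasks, chosen only so the group
# means land near the reported 78% (novice) and 74% (expert); the real
# outcomes came from the study's video analysis.
novice_successes = [15, 14, 16, 15, 13, 16, 14, 15, 15, 15]  # 10 novices
expert_successes = [14, 15, 13, 14, 14, 15, 13]              # 7 experts

novice_rates = [percent_task_success(s) for s in novice_successes]
expert_rates = [percent_task_success(s) for s in expert_successes]

# The published P = 0.98 came from the study's own (unspecified) test;
# this comparison will not reproduce it on made-up data.
stat, p = mannwhitneyu(expert_rates, novice_rates, alternative="two-sided")
print(f"novice {sum(novice_rates) / 10:.0f}%, "
      f"expert {sum(expert_rates) / 7:.0f}%, illustrative P = {p:.2f}")
```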

Original language: English (US)
Pages (from-to): 1153-1161
Number of pages: 9
Journal: Journal of Evaluation in Clinical Practice
Volume: 20
Issue number: 6
ISSN: 1356-1294
Publisher: Wiley-Blackwell
DOI: 10.1111/jep.12277
PMID: 25470668
Scopus ID: 84923058817
State: Published - Dec 1 2014

Fingerprint

  • Electronic Health Records
  • Primary Health Care
  • Education

Keywords

  • evaluation
  • medical informatics

ASJC Scopus subject areas

  • Health Policy
  • Public Health, Environmental and Occupational Health

Cite this

Determining differences in user performance between expert and novice primary care doctors when using an electronic health record (EHR). / Clarke, Martina A.; Belden, Jeffery L.; Kim, Min Soon.

In: Journal of Evaluation in Clinical Practice, Vol. 20, No. 6, 01.12.2014, p. 1153-1161.
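
A machine-readable citation for this article can be regenerated at any time through DOI content negotiation, the standard service behind doi.org for Crossref-registered DOIs. The Python sketch below requests BibTeX for this record's DOI; the requests library is the only assumption beyond the DOI itself.

```python
import requests

# Ask the DOI resolver for a BibTeX rendering of this article's metadata.
# Content negotiation with "application/x-bibtex" is standard Crossref
# behaviour; "application/x-research-info-systems" would return RIS instead.
resp = requests.get(
    "https://doi.org/10.1111/jep.12277",
    headers={"Accept": "application/x-bibtex"},
    timeout=30,
)
resp.raise_for_status()
print(resp.text)  # a @article{...} entry built from the Crossref record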
