Significant efficiency findings while controlling for the frequent confounders of CAI research in the PlanAlyzer project's computer-based, self-paced, case-based programs in anemia and chest pain diagnosis

Harold C. Lyon, James C. Healy, James R. Bell, Joseph F. O'Donnell, Edward K. Shultz, Robert Swift Wigton, Frank Hirai, J. Robert Beck

Research output: Contribution to journal › Article

20 Citations (Scopus)

Abstract

Richard E. Clark, in his widely published comprehensive studies and meta-analyses of the literature on computer-assisted instruction (CAI), has decried the lack of carefully controlled research, challenging almost every study that shows a computer-based intervention producing significant post-test proficiency gains over a non-computer-based intervention. We report on a randomized study in a medical school setting in which the usual confounders Clark found to plague most research were carefully controlled. PlanAlyzer is a microcomputer-based, self-paced, case-based, event-driven system for medical education, developed and used in carefully controlled trials in a second-year medical school curriculum to test the hypothesis that students with access to the interactive programs could integrate their didactic knowledge more effectively and/or efficiently than students with access only to traditional textual "nonintelligent" materials. PlanAlyzer presents cases and elicits and critiques a student's approach to the diagnosis of two common medical disorders: anemias and chest pain. It uses text, hypertext, images, and critiquing theory. Students were randomized: one half became the experimental group, receiving the interactive PlanAlyzer cases in anemia; the other half became the controls, receiving exactly the same content in text format. Later in each year there was a crossover, the controls becoming the experimentals for a similar intervention with the cardiology PlanAlyzer cases. Preliminary results at the end of the first two full trials show that the programs achieved most of the proposed instructional objectives, along with significant efficiency and economy gains: 96 faculty hours of classroom time were saved by substituting PlanAlyzer, while maintaining high student achievement.
In terms of student proficiency and efficiency, the 328 students in the trials over two years accomplished the project's instructional objectives, and the experimentals did so in 43% less time than the controls while achieving the same level of mastery. However, despite these significant efficiency findings, there were no significant proficiency differences (as measured by current factual and higher-order multiple-choice post-tests) between the experimental and control groups. Very careful controls were used to avoid what Clark has found to be the most common confounders of CAI research. Accordingly, this research supported Clark's rival hypothesis: the computer, in itself, does not appear to contribute to proficiency gains, at least as measured by our limited post-testing. Clark's position is that the computer is primarily a vehicle, as a pill or a hypodermic needle is for delivering a drug. The hypodermic needle can deliver the drug more efficiently than the pill (as the computer can deliver subject-matter content more efficiently, as our research indicates), but the same content is delivered. At the same time, we confirmed our own hypothesis as far as efficiency gains from the computer are concerned. Going beyond Clark's research, however, we may be teaching processes (experience in problem-solving, or clinical reasoning and pattern recognition) both more effectively and more efficiently with the computer in ways our current post-tests do not adequately measure.
Our ongoing research suggests additional inquiry in several areas: better evaluation instruments to measure the clinical reasoning skills PlanAlyzer was designed to teach; the addition of more advanced cases, to determine whether these might transform the computer group's efficiency gains into proficiency gains; and the addition of enhanced graphic decision-support tools and other pedagogical enhancements, including cognitive feedback, to strengthen PlanAlyzer's power to teach complex concepts of medical decision-making.
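The randomized crossover design described in the abstract can be sketched as follows. This is a minimal illustration, not code from the study; the function name, seed, and student identifiers are hypothetical:

```python
import random

def assign_crossover(students, seed=0):
    """Split students at random into two arms, as in the trial described
    above: for the anemia module, arm A works the interactive PlanAlyzer
    cases while arm B receives the same content as text; for the later
    cardiology module the arms swap roles (the crossover)."""
    rng = random.Random(seed)
    shuffled = list(students)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    arm_a, arm_b = shuffled[:half], shuffled[half:]
    return {
        "anemia":     {"computer": arm_a, "text": arm_b},
        "cardiology": {"computer": arm_b, "text": arm_a},  # roles swapped
    }
```

Because each student serves in both the experimental and control conditions across the two modules, between-student differences are controlled without reducing the sample available to either arm.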

Original language: English (US)
Pages (from-to): 117-132
Number of pages: 16
Journal: Journal of Medical Systems
Volume: 15
Issue number: 2
DOIs: 10.1007/BF00992704
State: Published - Apr 1 1991

Fingerprint

  • Computer-Assisted Instruction
  • Chest Pain
  • Anemia
  • Students
  • Research
  • Efficiency
  • Medical Schools
  • Needles
  • Hypermedia
  • Clinical Competence
  • Microcomputers
  • Medical Education
  • Cardiology
  • Pharmaceutical Preparations
  • Curriculum
  • Meta-Analysis
  • Teaching

ASJC Scopus subject areas

  • Medicine (miscellaneous)
  • Information Systems
  • Health Informatics
  • Health Information Management

Cite this

Significant efficiency findings while controlling for the frequent confounders of CAI research in the PlanAlyzer project's computer-based, self-paced, case-based programs in anemia and chest pain diagnosis. / Lyon, Harold C.; Healy, James C.; Bell, James R.; O'Donnell, Joseph F.; Shultz, Edward K.; Wigton, Robert Swift; Hirai, Frank; Beck, J. Robert.

In: Journal of Medical Systems, Vol. 15, No. 2, 01.04.1991, p. 117-132.

@article{f8637e7ddd3f4ba5a3a2dd7ac7df7768,
title = "Significant efficiency findings while controlling for the frequent confounders of CAI research in the PlanAlyzer project's computer-based, self-paced, case-based programs in anemia and chest pain diagnosis",
author = "Lyon, {Harold C.} and Healy, {James C.} and Bell, {James R.} and O'Donnell, {Joseph F.} and Shultz, {Edward K.} and Wigton, {Robert Swift} and Frank Hirai and Beck, {J. Robert}",
year = "1991",
month = "4",
day = "1",
doi = "10.1007/BF00992704",
language = "English (US)",
volume = "15",
pages = "117--132",
journal = "Journal of Medical Systems",
issn = "0148-5598",
publisher = "Springer New York",
number = "2",

}

TY - JOUR

T1 - Significant efficiency findings while controlling for the frequent confounders of CAI research in the PlanAlyzer project's computer-based, self-paced, case-based programs in anemia and chest pain diagnosis

AU - Lyon, Harold C.

AU - Healy, James C.

AU - Bell, James R.

AU - O'Donnell, Joseph F.

AU - Shultz, Edward K.

AU - Wigton, Robert Swift

AU - Hirai, Frank

AU - Beck, J. Robert

PY - 1991/4/1

Y1 - 1991/4/1


UR - http://www.scopus.com/inward/record.url?scp=0025749702&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0025749702&partnerID=8YFLogxK

U2 - 10.1007/BF00992704

DO - 10.1007/BF00992704

M3 - Article

C2 - 1757751

AN - SCOPUS:0025749702

VL - 15

SP - 117

EP - 132

JO - Journal of Medical Systems

JF - Journal of Medical Systems

SN - 0148-5598

IS - 2

ER -