The effects of mismatches between survey question stems and response options on data quality and responses

Research output: Contribution to journal › Article

Abstract

Several questionnaire design texts emphasize a dual role of question wording: the wording needs to express what is being measured and tell respondents how to answer. Researchers tend to focus heavily on the first of these goals, but sometimes overlook the second, resulting in question wording that does not match the response options provided (i.e., mismatches). Common examples are yes/no questions with ordinal or nominal response options, open-ended questions with closed-ended response options, and check-all-that-apply questions with forced-choice response options. A slightly different type of mismatch utilizes a question stem that can be read as asking for two different types of answers with no indication of which type should be provided. In this paper, we report the results of twenty-two experimental comparisons of data quality indicators (i.e., item nonresponse and response time) and response distributions across matched and mismatched versions of questions from a postal mail survey and a telephone survey. We find that mismatched items generally have lower data quality than matched items and that substantive results differ significantly across matched and mismatched designs, especially in the telephone survey. The results suggest that researchers should be wary of mismatches and should strive for holistic design.
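
To make one of the data quality indicators named in the abstract concrete, the sketch below shows how item nonresponse rates might be compared between a matched and a mismatched version of a question using a simple two-proportion z statistic. This is a minimal illustration under assumed, hypothetical data; it is not the authors' analysis code, and all variable names and values are invented for the example.

# Illustrative sketch only: compare item nonresponse between a matched and a
# mismatched question version. Data and names are hypothetical.
from math import sqrt

def item_nonresponse_rate(responses):
    # Share of respondents who left the item blank (None = no answer).
    missing = sum(1 for r in responses if r is None)
    return missing / len(responses)

def two_proportion_z(p1, n1, p2, n2):
    # Two-proportion z statistic for the difference in nonresponse rates.
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical answer vectors for the two experimental versions.
matched = ["yes", "no", "yes", None, "yes", "no", "yes", "yes"]
mismatched = ["yes", None, None, "no", "yes", None, "no", "yes"]

p_m = item_nonresponse_rate(matched)
p_mm = item_nonresponse_rate(mismatched)
z = two_proportion_z(p_mm, len(mismatched), p_m, len(matched))
print(f"matched nonresponse: {p_m:.2f}, mismatched: {p_mm:.2f}, z = {z:.2f}")

Response time comparisons and tests of differences in response distributions would follow the same matched-versus-mismatched logic, with the appropriate statistic substituted.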

Original language: English (US)
Pages (from-to): 34-65
Number of pages: 32
Journal: Journal of Survey Statistics and Methodology
Volume: 7
Issue number: 1
DOIs: 10.1093/jssam/smy005
State: Published - Mar 1 2019

Keywords

  • Data quality
  • Mismatches
  • Question wording
  • Questionnaire design
  • Response time

ASJC Scopus subject areas

  • Statistics and Probability
  • Social Sciences (miscellaneous)
  • Statistics, Probability and Uncertainty
  • Applied Mathematics

Cite this

@article{df2758ed5fe648a8b6260baa75561290,
title = "The effects of mismatches between survey question stems and response options on data quality and responses",
abstract = "Several questionnaire design texts emphasize a dual role of question wording: the wording needs to express what is being measured and tell respondents how to answer. Researchers tend to focus heavily on the first of these goals, but sometimes overlook the second, resulting in question wording that does not match the response options provided (i.e., mismatches). Common examples are yes/no questions with ordinal or nominal response options, open-ended questions with closed-ended response options, and check-all-that-apply questions with forced-choice response options. A slightly different type of mismatch utilizes a question stem that can be read as asking for two different types of answers with no indication of which type should be provided. In this paper, we report the results of twenty-two experimental comparisons of data quality indicators (i.e., item nonresponse and response time) and response distributions across matched and mismatched versions of questions from a postal mail survey and a telephone survey. We find that mismatched items generally have lower data quality than matched items and that substantive results differ significantly across matched and mismatched designs, especially in the telephone survey. The results suggest that researchers should be wary of mismatches and should strive for holistic design.",
keywords = "Data quality, Mismatches, Question wording, Questionnaire design, Response time",
author = "Smyth, {Jolene D.} and Kristen Olson",
year = "2019",
month = "3",
day = "1",
doi = "10.1093/jssam/smy005",
language = "English (US)",
volume = "7",
pages = "34--65",
journal = "Journal of Survey Statistics and Methodology",
issn = "2325-0984",
publisher = "Oxford University Press",
number = "1",
}
