You detect while I search: examining visual search efficiency in a joint search task

Gerald P. McDonnell, Mark Mills, Jordan E. Marshall, Joshua E. Zosky, Michael D. Dodd

Research output: Contribution to journal › Article

Abstract

Numerous factors impact attentional allocation, with behaviour strongly influenced by the interaction between individual intent and the visual environment. Traditionally, visual search efficiency has been studied under solo search conditions. Here, we propose a novel joint search paradigm in which one individual controls the visual input available to another individual via a gaze-contingent window (e.g., Participant 1 controls the window with their eye movements and Participant 2, in an adjoining room, sees only stimuli that Participant 1 is fixating and responds to the target accordingly). Pairs of participants completed three blocks of a detection task that required them to (1) search and detect the target individually, (2) search the display while their partner performed the detection task, or (3) detect while their partner searched. Search was most accurate when the person detecting was doing so for the second time while the person controlling the visual input was doing so for the first time, even when compared to participants with advanced solo or joint task experience (Experiments 2 and 3). We posit that, by surrendering control of one's search strategy, the detector benefits from a reduced working memory load, resulting in more accurate search. This paradigm creates a counterintuitive speed/accuracy trade-off that combines the heightened ability that comes from task experience (the discrimination task) with the slower performance times associated with a novel task (the initial search) to create a potentially more efficient method of visual search.

Original language: English (US)
Pages (from-to): 71-88
Number of pages: 18
Journal: Visual Cognition
Volume: 26
Issue number: 2
DOI: 10.1080/13506285.2017.1386748
State: Published - Feb 7 2018

Keywords

  • Visual attention
  • visual search
  • working memory

ASJC Scopus subject areas

  • Experimental and Cognitive Psychology
  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience
