Human gaze commands classification

A shape based approach to interfacing with robots

Trevor Lynn Craig, Carl A Nelson, Songpo Li, Xiaoli Zhang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

The sense of sight is one of the main channels through which we interact with the world around us. Using eye tracking, this sensory input channel can also serve as an output channel, providing commands for robots to follow. Such gaze-commanded robots could assist severely mobility-limited individuals in the home and similar environments. This paper explores the use of visually drawn shapes as the input for robot commands. The commands were recorded with low-cost gaze tracking hardware (Gazepoint GP3 Eye Tracker), and the data were then processed by a custom algorithm in MATLAB to detect commands to be passed to a small humanoid robot (NAO). Using the techniques and procedures given in this paper, people with limited mobility can input shape commands to direct robots like NAO to act as personal assistants. The approach also extends to gaze-based human-machine interfaces in general for a variety of applications.
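The record above does not reproduce the paper's MATLAB classifier, so the sketch below is only an illustrative Python approximation of the pipeline the abstract describes: a trace of gaze fixation points is labeled as a drawn shape, and the shape is mapped to a robot command. The function names, thresholds, and the shape-to-command table are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only: the paper's actual MATLAB shape classifier is not
# given in this record. This toy example shows one plausible way a trace of
# gaze fixation points (x, y in screen pixels) could be labeled as a drawn
# shape and mapped to a robot command. Thresholds and the command table are
# assumptions.

import numpy as np

def classify_gaze_shape(points, line_ratio=0.05, circle_cv=0.15):
    """Label a 2-D gaze trace as 'line', 'circle', or 'unknown'."""
    pts = np.asarray(points, dtype=float)
    pts = pts - pts.mean(axis=0)               # center the trace

    # Singular values of the centered trace: a near-1-D spread suggests a line.
    _, s, _ = np.linalg.svd(pts, full_matrices=False)
    if s[1] / s[0] < line_ratio:
        return "line"

    # Low relative variation of distance from the centroid suggests a circle.
    radii = np.linalg.norm(pts, axis=1)
    if radii.std() / radii.mean() < circle_cv:
        return "circle"

    return "unknown"

# Hypothetical mapping from recognized shapes to robot commands.
COMMANDS = {"circle": "wave_hand", "line": "walk_forward"}

def gaze_to_command(points):
    shape = classify_gaze_shape(points)
    return COMMANDS.get(shape, "no_op")

if __name__ == "__main__":
    # Synthetic circular gaze trace for a quick check.
    t = np.linspace(0, 2 * np.pi, 100)
    circle_trace = np.c_[200 + 50 * np.cos(t), 200 + 50 * np.sin(t)]
    print(gaze_to_command(circle_trace))        # -> "wave_hand"
```

In a full system the recognized command would be forwarded to the NAO robot through its own SDK; the print statement here simply stands in for that step.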

Original language: English (US)
Title of host publication: MESA 2016 - 12th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications - Conference Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781509061907
DOI: 10.1109/MESA.2016.7587154
State: Published - Oct 7 2016
Event: 12th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications, MESA 2016 - Auckland, New Zealand
Duration: Aug 29 2016 - Aug 31 2016

Other

Other: 12th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications, MESA 2016
Country: New Zealand
City: Auckland
Period: 8/29/16 - 8/31/16

Keywords

  • Assistive Robot
  • Gaze
  • Gaze Tracking
  • Gaze-based Control
  • Human-Robot Interaction
  • Shape Recognition

ASJC Scopus subject areas

  • Biomedical Engineering
  • Industrial and Manufacturing Engineering
  • Mechanical Engineering
  • Control and Optimization

Cite this

Craig, T. L., Nelson, C. A., Li, S., & Zhang, X. (2016). Human gaze commands classification: A shape based approach to interfacing with robots. In MESA 2016 - 12th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications - Conference Proceedings [7587154]. Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/MESA.2016.7587154
