GAWSCHI: Gaze-Augmented, Wearable-Supplemented Computer-Human Interaction (Conference Paper)

abstract

  • Recent developments in eye tracking technology are paving the way for gaze-driven interaction as the primary interaction modality. Despite successful efforts, existing solutions to the "Midas Touch" problem have two inherent issues that are yet to be addressed: 1) lower accuracy and 2) visual fatigue. In this work we present GAWSCHI: a Gaze-Augmented, Wearable-Supplemented Computer-Human Interaction framework that enables accurate and quick gaze-driven interactions while remaining completely immersive and hands-free. GAWSCHI uses an eye tracker and a wearable device (quasi-mouse) that is operated with the user's foot, specifically the big toe. The system was evaluated in a comparative user study involving 30 participants, with each participant performing eleven predefined interaction tasks (on MS Windows 10) using both mouse and gaze-driven interactions. We found that gaze-driven interaction using GAWSCHI is as good as mouse-based interaction in both time and precision as long as the dimensions of the interface element are above a threshold (0.60" x 0.51"). In addition, an analysis of the NASA Task Load Index post-study survey showed that participants experienced low mental, physical, and temporal demand while achieving high performance. We foresee GAWSCHI as the primary interaction modality for the physically challenged and an enriched interaction modality for able-bodied users.
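  The abstract describes the core idea of decoupling pointing (gaze) from selection (a foot-operated quasi-mouse) so that looking alone never triggers an action. Below is a minimal, hypothetical sketch of that input-fusion logic; the class and field names (GazeSample, GazeClickFusion, foot_pressed) are illustrative assumptions and are not taken from the authors' implementation.

```python
# Hypothetical sketch: gaze supplies the pointer position, while a separate
# binary input (standing in for the foot-operated quasi-mouse) supplies the
# click, so fixation alone never triggers an action ("Midas Touch" problem).

from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class GazeSample:
    x: float            # screen x-coordinate of the gaze point (pixels)
    y: float            # screen y-coordinate of the gaze point (pixels)
    foot_pressed: bool  # state of the wearable foot switch at this instant


class GazeClickFusion:
    """Emit a click at the current gaze point only on a foot-switch press."""

    def __init__(self) -> None:
        self._prev_pressed = False

    def update(self, sample: GazeSample) -> Optional[Tuple[float, float]]:
        # Rising edge of the foot switch -> click at the gaze coordinates.
        click = None
        if sample.foot_pressed and not self._prev_pressed:
            click = (sample.x, sample.y)
        self._prev_pressed = sample.foot_pressed
        return click


if __name__ == "__main__":
    fusion = GazeClickFusion()
    stream = [
        GazeSample(120.0, 340.0, False),  # user is only looking: no action
        GazeSample(122.5, 338.0, True),   # toe press: click fires here
        GazeSample(123.0, 337.5, True),   # switch held: no repeated click
    ]
    for s in stream:
        event = fusion.update(s)
        if event is not None:
            print(f"click at {event}")
```

  This sketch only illustrates why an explicit, non-gaze trigger avoids unintended activations; the actual system additionally depends on target size, as reflected in the reported 0.60" x 0.51" threshold.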

name of conference

  • Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications

published proceedings

  • 2016 ACM Symposium on Eye Tracking Research & Applications (ETRA 2016)

author list (cited authors)

  • Rajanna, V., & Hammond, T.

citation count

  • 13

complete list of authors

  • Rajanna, Vijay||Hammond, Tracy

publication date

  • March 2016