A Gaze Gesture-Based Paradigm for Situational Impairments, Accessibility, and Rich Interactions
Conference Paper
Overview
Abstract
Copyright 2018 held by the owner/author(s).
Gaze gesture-based interactions on a computer are promising, but existing systems are limited by the number of supported gestures, recognition accuracy, the need to remember stroke order, lack of extensibility, and so on. We present a gaze gesture-based interaction framework in which a user can design gestures and associate them with commands such as minimize, maximize, and scroll. This allows the user to interact with a wide range of applications using a common set of gestures. Furthermore, our gesture recognition algorithm is independent of screen size and resolution, and the user can draw a gesture anywhere on the target application. Results from a user study involving seven participants showed that the system recognizes a set of nine gestures with an accuracy of 93% and an F-measure of 0.96. We envision that this framework can be leveraged to develop solutions for situational impairments and accessibility, and to implement a rich interaction paradigm.
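The abstract does not specify the recognition algorithm, but scale- and position-independence of the kind described is commonly achieved by normalizing the drawn gesture path before matching (as in $1-style stroke recognizers). The sketch below is a hypothetical illustration of that normalization step, not the authors' actual method: `normalize_gesture` and its behavior are assumptions for illustration only.

```python
def normalize_gesture(points):
    """Translate a gesture path to its centroid and scale it by its
    bounding box, so the result does not depend on where on the screen
    or how large the gesture was drawn (hypothetical sketch; not the
    paper's algorithm)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    # Scale by the larger bounding-box side (guard against degenerate strokes).
    s = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    # Translate the centroid to the origin, then scale into a unit box.
    return [((x - cx) / s, (y - cy) / s) for x, y in points]

# The same "L" shape drawn at two different sizes and positions maps to
# (numerically) the same normalized representation.
small = normalize_gesture([(0, 0), (0, 1), (1, 1)])
large = normalize_gesture([(100, 100), (100, 200), (200, 200)])
print(all(abs(a - c) < 1e-9 and abs(b - d) < 1e-9
          for (a, b), (c, d) in zip(small, large)))  # True
```

A recognizer built on such a representation can compare the normalized path against stored templates (one per user-designed gesture) and dispatch the command associated with the closest match.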
Name of Conference
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications