A gaze gesture-based paradigm for situational impairments, accessibility, and rich interactions (Conference Paper)

abstract

  • © 2018 Copyright held by the owner/author(s). Gaze gesture-based interactions on a computer are promising, but existing systems are limited by the number of supported gestures, recognition accuracy, the need to remember stroke order, lack of extensibility, and so on. We present a gaze gesture-based interaction framework in which a user can design gestures and associate them with appropriate commands such as minimize, maximize, and scroll. This allows the user to interact with a wide range of applications using a common set of gestures. Furthermore, our gesture recognition algorithm is independent of screen size and resolution, and the user can draw a gesture anywhere on the target application. Results from a user study involving seven participants showed that the system recognizes a set of nine gestures with an accuracy of 93% and an F-measure of 0.96. We envision that this framework can be leveraged in developing solutions for situational impairments and accessibility, and in implementing a rich interaction paradigm.

name of conference

  • ETRA '18: 2018 Symposium on Eye Tracking Research and Applications

published proceedings

  • Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications

author list (cited authors)

  • Rajanna, V., & Hammond, T.

citation count

  • 1

complete list of authors

  • Rajanna, Vijay||Hammond, Tracy

publication date

  • June 2018

publisher

  • ACM