VisualEYEze: A web-based solution for receiving feedback on artwork through eye tracking

Conference Paper


  • 2018. Copyright for the individual papers remains with the authors. Artists value the ability to determine which parts of their composition are most appreciated by viewers. This information normally comes directly from viewers in the form of oral and written feedback; however, because viewer participation is low and because much of our visual understanding of artwork is subconscious and difficult to express verbally, such feedback is of limited value. Eye tracking technology has been used before to analyze artwork; however, most of this work has been performed in controlled lab settings, so the technology remains largely inaccessible to individual artists who may seek feedback. To address this issue, we developed a web-based system where artists can upload their artwork to be viewed by viewers on their own computers while a web camera tracks their eye movements. The artist receives feedback in the form of visualized eye tracking data that depicts which areas of the image were looked at the most by viewers. We evaluated our system by having 5 artists upload a total of 17 images, which were subsequently viewed by 20 users. The artists expressed that seeing eye tracking data visualized on their artwork, indicating the areas of interest, is a unique and highly useful way of receiving feedback. They also felt that the platform makes artists more aware of their compositions, something that can especially help inexperienced artists. Furthermore, 90% of the viewers expressed that they were comfortable providing eye movement data as a form of feedback to the artists.
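The abstract mentions visualizing aggregated eye movement data over an image but does not describe the visualization pipeline. As an illustration only, the sketch below shows one common way such gaze data can be turned into a heatmap overlay: each gaze sample contributes a Gaussian "blob", and the accumulated grid is normalized for rendering. The function name, the `sigma` parameter, and the input format of `(x, y)` gaze points are all assumptions, not the authors' implementation.

```python
import numpy as np

def gaze_heatmap(points, width, height, sigma=25.0):
    """Aggregate (x, y) gaze samples into a normalized heat map.

    Illustrative sketch only: the paper does not specify how its
    visualization is computed. `sigma` (spread of each gaze sample,
    in pixels) is a hypothetical tuning parameter.
    """
    heat = np.zeros((height, width), dtype=float)
    ys, xs = np.mgrid[0:height, 0:width]
    for x, y in points:
        # Each gaze sample adds a Gaussian bump centered at (x, y).
        heat += np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2 * sigma ** 2))
    if heat.max() > 0:
        # Scale to [0, 1] so the map can be rendered as a colormap/alpha
        # layer over the artwork.
        heat /= heat.max()
    return heat

# Example: two viewers fixating near one region, one elsewhere.
hm = gaze_heatmap([(40, 30), (42, 28), (100, 80)], width=160, height=120)
```

In a real deployment the normalized grid would be colored (e.g. blue-to-red) and alpha-blended over the uploaded artwork so the artist can see which regions drew the most attention.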

published proceedings

  • CEUR Workshop Proceedings

author list (cited authors)

  • Bauman, B., Gunhouse, R., Jones, A., Da Silva, W., Sharar, S., Rajanna, V., ... Hammond, T.

complete list of authors

  • Bauman, B.; Gunhouse, R.; Jones, A.; Da Silva, W.; Sharar, S.; Rajanna, V.; Cherian, J.; Koh, J. I.; Hammond, T.

publication date

  • January 2018