HCC: Small: STAAR: Spatial Touch Audio Annotator and Reader for Individuals with Blindness or Severe Visual Impairment

abstract

  • The PI's goal in this research is to develop tools on a state-of-the-art platform (the Apple iPad) that will afford access to textual information for individuals who are blind or who suffer from severe visual impairments (IBSVI). The PI's approach is to use an embossed screen overlay to provide spatial and tactile correlates to text read aloud, and to engage the spatial cognition and memory resources of the target population for navigating through a document and annotating it if/as desired. The PI argues that from the invention of print media forward, information has been formulated and optimized for consumption by beings (people) with a dominant visual capability. This visuo-spatial bias is not well understood or studied in the context of information access by and delivery to the IBSVI community; most technological information aids funnel information to them as sequential aural streams, obviating the use of broader spatial cognitive resources. In this project the PI will develop a Spatial Touch Audio Annotator and Reader (STAAR) testbed to explore a multimodal alternative that enables the user to fuse spatial layout and informational content through touch location on a slate-type device and audio rendering of text to speech, respectively. STAAR will enable self-paced reading using a tactile overlay pattern on an iPad surface, which will be designed to provide tactile landmarks to help the user navigate the "page." STAAR will render the touched text chunk audibly. The use of touch gestures to enable contextualized highlighting and note-taking will also be investigated. The PI will study how the target population may employ spatial strategies and exploration to re-find and re-access information both in the act of reading and for recall after some time interval.

    Broader Impacts: A new generation of slate-type devices exemplified by the Apple iPad threatens to widen the accessibility gap between the IBSVI community and the majority of the population. By supporting the use of spatial cognitive and memory resources for both reading and contextualized annotation, the PI hopes to ameliorate this endemic barrier to participation. Project outcomes will contribute to our understanding of the role of space in information design and representation, and of the spatial cognition and memory resources needed for uptake of such information. They will also contribute to the domain of mobile computing, for the IBSVI community in particular but ultimately for the population in general, through the STAAR system, which will be designed and implemented from the ground up as a portable device on a state-of-the-art interaction form factor. The multimodal fusion of haptics and speech in STAAR will have implications for the design of such things as navigational aids and service delivery systems for the non-sighted as well as the sighted. The project will in addition provide unique cross-disciplinary educational and learning opportunities for undergraduate and graduate students. All software developed in this project will be released as open source.
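    The abstract does not specify how STAAR's touch-to-speech mapping is implemented, but the core interaction it describes (touch a region of the overlaid "page", hear the text chunk at that location) can be sketched on iOS with standard UIKit touch handling and AVFoundation speech synthesis. The TextChunk structure and ReaderView class below are hypothetical illustrations, not the project's actual code:

    ```swift
    import UIKit
    import AVFoundation

    // Hypothetical model of one text chunk laid out on the "page":
    // an on-screen rectangle (aligned with a cell of the tactile overlay)
    // plus the text it contains.
    struct TextChunk {
        let frame: CGRect
        let text: String
    }

    // Minimal reading view: touching a region speaks the chunk under the finger.
    class ReaderView: UIView {
        var chunks: [TextChunk] = []                      // populated from the document layout
        private let synthesizer = AVSpeechSynthesizer()   // standard iOS text-to-speech

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let point = touches.first?.location(in: self) else { return }
            // Find the chunk whose on-screen region (and overlay cell) was touched.
            guard let chunk = chunks.first(where: { $0.frame.contains(point) }) else { return }
            // Interrupt any ongoing speech so self-paced reading follows the finger.
            _ = synthesizer.stopSpeaking(at: .immediate)
            synthesizer.speak(AVSpeechUtterance(string: chunk.text))
        }
    }
    ```

    In a sketch like this, the tactile overlay provides the physical landmarks, while the chunk geometry registered in software determines what is spoken; contextualized highlighting and note-taking gestures would be layered on the same touch-to-chunk mapping.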

date/time interval

  • 2013 - 2018