Tactons in multitask environments: The interaction of presentation modality and processing code (Conference Paper)

abstract

  • Tactons, or vibrotactile icons, have been proposed as a means to communicate complex concepts to users and to support multitasking in environments involving numerous visual and/or auditory tasks and stimuli. This study investigated the role of processing code in the interpretation of tactons performed concurrently with visual tasks in such environments. In a driving simulation, participants decoded tactons composed of spatiotemporal patterns of vibrations, which require spatial processing, while interpreting one of two types of visual task stimuli requiring either spatial or categorical processing. Compared to single-task performance, the dual-task performance decrement was significantly larger when the tacton task was paired with the visual task requiring spatial (as opposed to categorical) processing. The findings are consistent with the assertion of Multiple Resource Theory that interference between concurrent tasks is greater when those tasks involve the same processing code. They illustrate that distributing task-related information across modalities is beneficial but not sufficient to avoid task interference. A direct implication of the findings is that spatiotemporal tactons should be avoided in environments that rely heavily on spatial processing resources, such as car cockpits or flight decks.

published proceedings

  • Proceedings of the Human Factors and Ergonomics Society Annual Meeting

author list (cited authors)

  • Ferris, T., Hameed, S., Penfold, R., & Rao, N.

citation count

  • 3

complete list of authors

  • Ferris, Thomas||Hameed, Shameem||Penfold, Robert||Rao, Nikhil

publication date

  • October 2007