Tactons in multitask environments: The interaction of presentation modality and processing code
Tactons, or vibrotactile icons, have been proposed as a means of communicating complex concepts to users and supporting multitasking in environments involving numerous visual and/or auditory tasks and stimuli. This study investigated the role of processing code in the interpretation of tactons while performing concurrent visual tasks in such environments. Participants decoded tactons composed of spatiotemporal patterns of vibrations - requiring spatial processing - and interpreted one of two types of visual task stimuli - requiring either spatial or categorical processing - in a driving simulation. Compared to single-task performance, there was a significantly larger dual-task performance decrement when the tacton task was paired with the visual task requiring spatial (as compared to categorical) processing. The findings are consistent with the assertion of Multiple Resource Theory that interference between concurrent tasks is greater when those tasks involve the same processing code. They illustrate that distributing task-related information across modalities is beneficial but not sufficient to avoid task interference. A direct implication of the findings is to avoid the use of spatiotemporal tactons in environments that rely heavily on spatial processing resources, such as car cockpits or flight decks.
Author list (cited authors)
Ferris, T., Hameed, S., Penfold, R., & Rao, N.