MS Project Defense - Emily Fujimoto

Tuesday, July 29, 2014 - 2:00pm
Harold Frank Hall 1132
Non-Visual Navigation Using Combined Audio Music and Haptic Cues
Matthew Turk (Chair) and Tobias Höllerer


While a great deal of work has been done exploring non-visual navigation interfaces using audio and haptic cues, little work has been done combining the two modalities. We investigate combining different state-of-the-art interfaces for communicating direction and distance information using vibrotactile and audio music cues, limiting ourselves to interfaces that are possible with current off-the-shelf smartphones. We use experimental logs, subjective task load questionnaires, and user comments to see how users’ perceived performance, objective performance, and acceptance of the system varied across the different combinations. Users’ perceived performance did not differ much between the unimodal and multimodal interfaces, but a few users commented that the multimodal interfaces added some cognitive load. Objective performance showed that some multimodal combinations resulted in significantly less direction or distance error than some of the unimodal ones, especially the purely haptic interface. Based on these findings we propose a few design considerations for multimodal haptic/audio navigation interfaces.

Everyone welcome!