The effects of spatial auditory and visual cues on mixed reality remote collaboration
Metadata only
Author
Date
2020-12
Type
Journal Article
Citations
Cited 8 times in Web of Science
Cited 22 times in Scopus
ETH Bibliography
yes
Abstract
Collaborative Mixed Reality (MR) technologies enable remote people to work together by sharing communication cues intrinsic to face-to-face conversations, such as eye gaze and hand gestures. While the role of visual cues has been investigated in many collaborative MR systems, the use of spatial auditory cues remains underexplored. In this paper, we present an MR remote collaboration system that shares both spatial auditory and visual cues between collaborators to help them complete a search task. Through two user studies in a large office, we found that compared to non-spatialized audio, the spatialized remote expert’s voice and auditory beacons enabled local workers to find small occluded objects with significantly stronger spatial perception. We also found that while the spatial auditory cues could indicate the spatial layout and a general direction to search for the target object, visual head frustum and hand gestures intuitively demonstrated the remote expert’s movements and the position of the target. Integrating visual cues (especially the head frustum) with the spatial auditory cues significantly improved the local worker’s task performance, social presence, and spatial perception of the environment. © 2020, Springer Nature Switzerland AG.
Publication status
published
Journal / series
Journal on Multimodal User Interfaces
Volume
Pages / Article No.
Publisher
Springer
Subject
Mixed reality; Augmented reality; Virtual reality; Remote collaboration; Spatial audio; Hand gesture
Organisational unit
03528 - Mattern, Friedemann (emeritus) / Mattern, Friedemann (emeritus)