A virtual reality experiment for improving the navigational recall: What can we learn from eye movements of high- and low-performing individuals?
OPEN ACCESS
Date
2018-01-14
Publication Type
Conference Paper
ETH Bibliography
yes
Abstract
In its broader scope, this paper is concerned with understanding how (visualization) designs of virtual environments (VE) interact with navigational memory. We optimized the design of a VE for route learning following specific visualization guidelines derived from previous literature, and tested it with a typical navigational recall task with 42 participants. The recall accuracies of our participants vary widely. We hypothesize that by comparatively analyzing the eye movements of high- and low-performing participants, we can better understand this variability and identify whether these two groups rely on different visual strategies. Such efforts inform visualization design, and in turn, better designs can assist people. Those who perform poorly in navigational tasks, for reasons such as lack of training or differences in visuospatial abilities, might especially benefit from such assistance. In this paper, we present our concept for a work-in-progress study and provide the relevant background.
Publication status
published
Book title
Eye Tracking for Spatial Research, Proceedings of the 3rd International Workshop
Publisher
ETH Zurich
Event
3rd International Workshop on Eye Tracking for Spatial Research
Subject
navigation; virtual environments; visual strategies