A virtual reality experiment for improving navigational recall: What can we learn from eye movements of high- and low-performing individuals?

Open access
Date
2018-01-14
Type
Conference Paper
ETH Bibliography
yes
Abstract
In its broader scope, this paper is concerned with understanding how (visualization) designs of virtual environments (VE) interact with navigational memory. We optimized the design of a VE for route learning following specific visualization guidelines derived from previous literature, and tested it with a typical navigational recall task with 42 participants. Recall accuracies of our participants vary widely. We hypothesize that by analyzing the eye movements of high- and low-performing participants in a comparative manner, we can better understand this variability and identify whether these two groups rely on different visual strategies. Such efforts inform visualization designs, and in turn, these designs can better assist people. Those who perform poorly in navigational tasks, for reasons such as lack of training or differences in visuospatial abilities, might especially benefit from such assistance. In this paper, we present our concept for a work-in-progress study and provide the relevant background.
Permanent link
https://doi.org/10.3929/ethz-b-000222473
Publication status
published
Book title
Eye Tracking for Spatial Research, Proceedings of the 3rd International Workshop
Publisher
ETH Zurich
Event
Subject
navigation; virtual environments; visual strategies