Abstract
Visual perception is strongly shaped by expectations, but it is poorly understood how such perceptual expectations are learned in our dynamic sensory environment. Here, we applied a Bayesian framework to investigate whether perceptual expectations are continuously updated from different aspects of ongoing experience. In two experiments, human observers performed an associative learning task in which rapidly changing expectations about the appearance of ambiguous stimuli were induced. We found that perception of ambiguous stimuli was biased by both learned associations and previous perceptual outcomes. Computational modeling revealed that perception was best explained by a model that continuously updated priors from associative learning and perceptual history and combined these priors with the current sensory information in a probabilistic manner. Our findings suggest that the construction of visual perception is a highly dynamic process that incorporates rapidly changing expectations from different sources in a manner consistent with Bayesian learning and inference.
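The probabilistic combination described above can be illustrated with a minimal sketch: a prior over two interpretations of an ambiguous stimulus is updated trial by trial from associative outcomes, then combined with the current sensory likelihood via Bayes' rule. The function names, the delta-rule update, and the learning rate are illustrative assumptions for exposition only; the paper's actual model is based on the Hierarchical Gaussian filter.

```python
# Illustrative sketch (not the paper's model): Bayesian combination of a
# learned prior with ambiguous sensory evidence for a binary percept
# (interpretation A vs. B). The delta-rule prior update and the
# learning rate of 0.1 are assumptions made for this example.

def update_prior(prior_a, outcome_a, learning_rate=0.1):
    """Delta-rule update of the prior probability of percept A,
    driven by the most recent perceptual outcome (1.0 = A, 0.0 = B)."""
    return prior_a + learning_rate * (outcome_a - prior_a)

def posterior_a(prior_a, likelihood_a, likelihood_b):
    """Bayes' rule: posterior probability of percept A given the
    likelihood of the sensory input under each interpretation."""
    evidence = prior_a * likelihood_a + (1 - prior_a) * likelihood_b
    return prior_a * likelihood_a / evidence

# A cue has come to predict percept A (prior 0.7), but the stimulus
# itself is fully ambiguous (equal likelihoods): the posterior then
# equals the prior, so perception is biased entirely by expectation.
prior = 0.7
post = posterior_a(prior, likelihood_a=0.5, likelihood_b=0.5)
prior = update_prior(prior, outcome_a=1.0)  # percept A was experienced
```

With fully ambiguous input, the learned prior alone determines the percept; with informative input, the likelihoods pull the posterior toward the sensory evidence, which is the precision-weighted trade-off the abstract refers to.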
Permanent link: https://doi.org/10.3929/ethz-b-000117239
Publication status: published
Journal / series: Frontiers in Human Neuroscience
Publisher: Frontiers Research Foundation
Subject: Visual perception; Bayesian brain; Bistable perception; Associative learning; Sensory memory; Hierarchical Gaussian filter
Organisational unit: 03955 - Stephan, Klaas E.


