Computational Principles Underlying Contextual Modulations in Visual Perception

Degree type
Doctor of Philosophy (PhD)
Graduate group
Psychology
Discipline
Psychiatry and Psychology
Subject
Bayesian inference
Categorical perception
Computational modeling
Orientation perception
Sensory adaptation
Visual perception
Funder
Grant number
License
Copyright date
2023
Distributor
Related resources
Author
Mao, Jiang
Contributor
Abstract

Perception is constantly modulated by context, including temporal, spatial, and structural context. The same stimulus may be perceived differently under different contexts. Studying the contextual modulation of perception may provide deep insight into the computational principles of the sensory system. The present thesis sheds light on the computational principles underlying these contextual modulations.

In Chapter 2, I test the efficient coding principle during sensory adaptation (temporal context). I measure the orientation discrimination threshold in human observers and extract the difference in coding accuracy under different adaptation states. I then compare the extracted coding accuracy with the image statistics of the retinal input of freely behaving human subjects under natural conditions, and with the Fisher information of a recurrent neural network trained on natural scene videos to predict the next frame while performing the same task as the human subjects. This comparison provides evidence for the efficient coding explanation of the adaptation effect, namely, that adaptation to the recent history of sensory input establishes efficient sensory representations for the next expected sensory input.

In Chapter 3, I present results on structural context in visual orientation perception. I propose a holistic matching model that assumes perception is a holistic inference process operating simultaneously at all levels of the representational hierarchy. Validation against multiple existing psychophysical datasets and data from a new psychophysical experiment demonstrates that, compared to previous models, our model provides a quantitatively accurate and detailed description of subjects' behavior, including categorical contextual effects that previous models have failed to account for even qualitatively. I also show that the model generalizes to other features and thus offers a universal explanation for categorical contextual modulation in low-level sensory perception.

Together, this thesis advances our understanding of visual perception under contextual modulation and provides insight into the underlying computational principles from a normative perspective on both the encoding and decoding processes of perception.
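The sketch below is a minimal, hypothetical illustration of the efficient coding relation referenced in Chapter 2, not the thesis code: under efficient coding, the square root of Fisher information is proportional to the stimulus prior p(θ), so the discrimination threshold scales as 1/p(θ). The cardinal-biased orientation prior and its constants are illustrative assumptions.

```python
# Minimal sketch (assumed, not the thesis code): efficient-coding link between
# stimulus statistics and orientation coding accuracy.
# Efficient coding: sqrt(Fisher information J) proportional to prior p(theta),
# so the discrimination threshold is proportional to 1 / p(theta).

import numpy as np

theta = np.linspace(0.0, np.pi, 181)      # orientation in radians, 1-degree steps

# Hypothetical natural-scene orientation prior: extra probability mass at
# cardinal orientations (0 and pi/2), as commonly reported for natural images.
prior = 1.0 + 0.5 * np.abs(np.cos(2.0 * theta))
prior /= np.trapz(prior, theta)           # normalize to a probability density

fisher_sqrt = prior                       # efficient coding: sqrt(J) ∝ p(theta)
threshold = 1.0 / fisher_sqrt             # discrimination threshold ∝ 1/sqrt(J)

# Predicted thresholds are lowest at cardinal orientations, mirroring the
# oblique effect in human orientation discrimination.
print("threshold at  0 deg: %.3f" % threshold[0])
print("threshold at 45 deg: %.3f" % threshold[45])
print("threshold at 90 deg: %.3f" % threshold[90])
```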

Advisor
Stocker, Alan A.
Date of degree
2023
Date Range for Data Collection (Start Date)
Date Range for Data Collection (End Date)
Digital Object Identifier
Series name and number
Volume number
Issue number
Publisher
Publisher DOI
Journal Issue
Comments
Recommended citation