Computational Mechanisms Underlying Perception of Visual Motion

Degree type
Doctor of Philosophy (PhD)
Graduate group
Psychology
Subject
Computation
Perception
Vision
Psychology
Copyright date
2022-10-05
Author
Chin, Benjamin Ming
Abstract

Motion is a fundamental property estimated by human sensory perception. When visual shapes and patterns change position over time, we perceive motion. Relating the properties of perceived motion (speed and direction) to the properties of visual stimuli is an important endeavor in vision science. Understanding this relationship requires understanding the computations that the visual system performs to extract motion information from visual stimuli. The present research sheds light on the nature of these computations.

In the first study, human performance in a speed discrimination task with naturalistic stimuli is compared to the performance of an ideal observer model whose computations have been optimized for discriminating speed across a large training set of naturalistic stimuli. Although human performance falls short of ideal observer performance because of internal noise, the remarkable finding is that the computations humans perform minimize, to the maximum possible extent, the performance limits imposed by external stimulus variability. In other words, humans perform computations that are optimal.

The second study focuses on how spatial frequency, a basic characteristic of visual patterns, affects the process by which the visual system integrates motion across time (temporal integration). A continuous target-tracking task demonstrates that higher spatial frequencies are associated with longer temporal integration periods. This result predicts a visual depth illusion when the left and right eyes are simultaneously presented with stimuli of different spatial frequencies. A second experiment using traditional forced-choice psychophysics confirms this prediction.

The third study explores how color affects estimates of spatial position during motion. We parameterize color in terms of L-cone and S-cone activity modulations in the eye. Using the same continuous target-tracking paradigm from Chapter 2, we demonstrate that position estimates for stimuli composed of pure S-cone modulations lag behind position estimates for stimuli composed of pure L-cone modulations. A key finding is that when L-cone and S-cone modulations are combined, processing lag is determined almost exclusively by the L-cone modulations.
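
The tracking analyses are only summarized above. As a rough, hypothetical illustration of how a processing lag can be read out of continuous target-tracking data, the sketch below (Python, assuming NumPy) cross-correlates a synthetic target trajectory with a delayed, noisy response and takes the peak of the cross-correlogram as the lag estimate; the sample rate, noise level, and lag are assumptions for illustration, and none of the code or parameters come from the dissertation.

    # Illustrative sketch only (not the dissertation's code or data): estimate a
    # processing lag from continuous target-tracking data by cross-correlating
    # the target's velocity with the observer's response velocity. The sample
    # rate, noise level, and lag below are assumptions chosen for illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    fs = 60.0                    # assumed tracking sample rate (Hz)
    n = 3600                     # one minute of tracking samples
    true_lag = 9                 # assumed lag of ~150 ms, in samples

    # Synthetic random-walk target and a delayed, noisy tracking response.
    target = np.cumsum(rng.standard_normal(n))
    response = np.empty(n)
    response[:true_lag] = target[0]
    response[true_lag:] = target[:-true_lag]
    response += 0.5 * rng.standard_normal(n)

    # Analyze velocities (first differences), a common choice for tracking data.
    tv = np.diff(target)
    rv = np.diff(response)

    def lag_correlation(x, y, k):
        """Correlation between x and y when y is assumed to lag x by k samples."""
        if k > 0:
            return np.corrcoef(x[:-k], y[k:])[0, 1]
        if k < 0:
            return np.corrcoef(x[-k:], y[:k])[0, 1]
        return np.corrcoef(x, y)[0, 1]

    # The peak of the cross-correlogram gives the estimated lag.
    lags = np.arange(-60, 61)
    corr = np.array([lag_correlation(tv, rv, k) for k in lags])
    est_lag = lags[np.argmax(corr)]
    print(f"estimated lag: {1000 * est_lag / fs:.0f} ms "
          f"(simulated lag: {1000 * true_lag / fs:.0f} ms)")

Under an analysis of this kind, a longer lag for S-cone-isolating stimuli would appear simply as a cross-correlogram peak at a longer delay than for L-cone-isolating stimuli.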

Advisor
Johannes D. Burge
Date of degree
2022-01-01