Date of Award

2015

Degree Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Graduate Group

Neuroscience

First Advisor

Yale E. Cohen

Abstract

Perceptual representations of auditory stimuli, called auditory streams or objects, are derived from the auditory system's ability to segregate and group stimuli based on their spectral, temporal, and spatial features. However, it remains unclear how the auditory system encodes these auditory streams at the level of the single neuron. To address this question directly, we first validated an animal model of auditory streaming. Specifically, we trained rhesus macaques to report their streaming percept using methodologies and controls similar to those presented in previous human studies. We found that the monkeys' behavioral reports were qualitatively consistent with those of human listeners. Next, we recorded from neurons in the primary auditory cortex (A1) while the monkeys simultaneously reported their streaming percepts. We found that A1 neurons had frequency-tuned responses that habituated, independent of frequency content, as the auditory sequence unfolded over time; and we report, for the first time, that the firing rates of A1 neurons were modulated by the monkeys' choices. This modulation increased with listening time and was independent of the frequency difference between consecutive tone bursts. Overall, our results suggest that A1 activity contributes to the sensory evidence underlying the segregation and grouping of acoustic stimuli into distinct auditory streams. However, because we observed choice-related activity based on firing rate alone, our data are partially at odds with Micheyl et al.'s (2005) prominent hypothesis, which argued that frequency-dependent habituation may be a coding mechanism for the streaming percept.
