COMPUTATION THROUGH NOISY DYNAMICS

Degree type
Doctor of Philosophy (PhD)
Graduate group
Neuroscience
Discipline
Neuroscience and Neurobiology
Subject
Computational Modelling
Dynamical Systems
Human Behavior
Copyright date
2022
Author
Brennan, Connor
Abstract

Biological systems are unique in the known universe for their ability to integrate environmental information to produce adaptive behavior. In vertebrates, this integration is mediated through the activity of many interconnected neurons. Recent advances in imaging technology have made it possible to simultaneously record most neurons in simple animals. However, even in the extreme case where all pertinent information about the system is known, it is not clear how to translate the microscopic biophysical details into the macroscopic algorithms used to integrate information. Here, we build upon the framework of computation through dynamics to elucidate the algorithms used in biological systems. Chapter 1 introduces computation through dynamics, with an emphasis on the application of dynamical systems theory to neuronal systems. The primary advancement of this work is that we consider systems constrained by noisy transmission of information between neuronal units. Random fluctuations cause the information stored in the network to degrade over time. It has been hypothesized that, in order to counteract these fluctuations, networks are forced to utilize discrete attractor dynamics. The dynamics of such systems manifest as one-dimensional trajectories that stabilize local perturbations in all directions other than the direction of flow. Each trajectory encodes the task-relevant inputs that the system has received. Switches between these trajectories represent reactions to environmental stimuli or the "forgetting" of stored information.
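The intuition behind noise-driven discretization can be illustrated with a small simulation, sketched below under assumed dynamics and parameters (this example is not taken from the dissertation): a value held on a continuous (line) attractor diffuses away under noise, whereas a value held by discrete attractors is repeatedly pulled back toward the nearest stable state.

```python
# Minimal sketch (not from the dissertation): contrast how noise degrades a
# memory held on a continuous (line) attractor versus one held by discrete
# attractors. All parameter values here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
dt, T, sigma = 0.01, 50.0, 0.3          # time step, duration, noise amplitude
steps = int(T / dt)

def simulate(drift, x0):
    """Euler-Maruyama integration of dx = drift(x) dt + sigma dW."""
    x = np.empty(steps)
    x[0] = x0
    for t in range(1, steps):
        noise = sigma * np.sqrt(dt) * rng.standard_normal()
        x[t] = x[t - 1] + drift(x[t - 1]) * dt + noise
    return x

# Line attractor: no restoring force, so noise accumulates as a random walk
# and the stored value drifts away from its initial condition.
line = simulate(lambda x: 0.0, x0=1.0)

# Discrete attractors: a double-well drift -dV/dx with wells at x = +/-1
# pulls the state back toward the nearest well, stabilizing it against noise.
discrete = simulate(lambda x: -(x**3 - x), x0=1.0)

print("final drift from x0=1, line attractor:    ", abs(line[-1] - 1.0))
print("final drift from x0=1, discrete attractor:", abs(discrete[-1] - 1.0))
```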

In Chapter 2, we make use of whole-brain imaging with single-neuron resolution in C. elegans. We show how such recordings can be used to model the nervous system at the behaviorally relevant scale. Our model predicts switches in locomotion up to 30 seconds prior to the event, and these predictions hold for individuals not used in model construction. The model also predicts behavioral dwell-time statistics, sequences of behaviors, and neuronal activation. Remarkably, our model succeeds despite consistent inter-individual differences in neuronal activation. Thus, our analytical framework reconciles consistent individual differences in neuronal activation with global dynamics that operate universally across individuals.
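The chapter's actual modeling pipeline is not reproduced here; as a purely hypothetical illustration of the general approach, the sketch below reduces simultaneous recordings to a low-dimensional trajectory and asks whether the current population state predicts an upcoming locomotion switch. The array names, shapes, labeling of the 30-second horizon, and logistic-regression readout are all assumptions.

```python
# Hypothetical sketch (not the chapter's pipeline): reduce simultaneous neural
# recordings to a few dimensions and ask whether the current population state
# predicts a locomotion switch occurring within the next 30 seconds.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

def predict_switches(activity, switch_within_30s):
    """activity: (time, neurons) traces; switch_within_30s: (time,) 0/1 labels."""
    latent = PCA(n_components=3).fit_transform(activity)   # low-dimensional trajectory
    clf = LogisticRegression(max_iter=1000).fit(latent, switch_within_30s)
    return clf.predict_proba(latent)[:, 1]                 # probability of an upcoming switch

# Toy usage with random data, just to show the expected shapes.
rng = np.random.default_rng(1)
proba = predict_switches(rng.normal(size=(500, 100)), rng.integers(0, 2, size=500))
print(proba.shape)  # (500,)
```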

In Chapter 3, we generalize the method used in Chapter 2 and validate it on diverse systems and scales, from single neurons in C. elegans to fMRI in humans. Despite their simplicity, these models accurately predict future neuronal activity and future decisions made by human participants. We use the models to compare the computational strategies of primates and of artificial systems trained on the same task, identifying the specific conditions under which the artificial agent learns the same strategy as the primate. The computational strategy extracted with our methodology also predicts specific errors on novel stimuli. These results show that our methodology is a powerful tool for studying the relationship between computation and neuronal activity across diverse systems.

In Chapter 4, we explicitly test our assumptions about the effects of noise using human working memory. Working memory is thought to be stored by discrete attractor dynamics, which stabilize memories against noise but introduce biases. Most studies of attractor dynamics in working memory have focused on only a few timescales, so it is unclear whether discrete attractor dynamics can successfully model working memory across all relevant timescales. Here, we show in a model-free fashion that discrete attractor dynamics fail to generalize across multiple timescales in a human visual working memory task. We introduce a novel model that combines discrete attractor dynamics with plasticity. This model successfully generalizes across timescales and correctly predicts intertrial interactions. In this combined model, plasticity mitigates the effect of noise, while discrete attractor dynamics introduce biases. Thus, discrete attractor dynamics alone are insufficient to model working memory, and their presence cannot be fully explained by their noise-stabilizing properties.
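As a rough illustration of how such a combined model could behave, the sketch below uses an assumed functional form (not the dissertation's fitted model): a remembered angle drifts toward one of several discrete attractors, which introduces bias, while a weak plasticity term pulls it toward the previous trial's stimulus, counteracting noise and producing intertrial dependence.

```python
# Illustrative sketch (assumed functional form, not the dissertation's fitted
# model): a remembered angle drifts toward one of n_attr discrete attractors
# (introducing bias), while a slow "plasticity" pull toward the previous
# trial's stimulus counteracts noise and produces intertrial dependence.
# The 8-attractor layout and all parameter values are assumptions.
import numpy as np

rng = np.random.default_rng(2)
n_attr = 8  # assumed number of discrete attractors, evenly spaced on the circle

def delay_period(theta0, prev_theta, T=5.0, dt=0.01,
                 k_attr=0.5, k_plast=0.1, sigma=0.2):
    """Evolve a remembered angle theta0 (radians) across a delay of T seconds."""
    theta = theta0
    for _ in range(int(T / dt)):
        pull_attr = -k_attr * np.sin(n_attr * theta)      # drift toward nearest discrete attractor
        pull_prev = k_plast * np.sin(prev_theta - theta)  # plasticity: pull toward last trial's item
        theta += (pull_attr + pull_prev) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return theta % (2 * np.pi)

# The report is biased toward the nearest attractor (~2*pi/8) and nudged toward prev_theta.
print(f"reported angle after delay: {delay_period(theta0=1.0, prev_theta=1.4):.2f} rad")
```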
Advisor
Proekt, Alex
Date of degree
2022