Learning and consolidation across memory systems in brains and machines

Degree type
PhD
Graduate group
Psychology
Discipline
Psychiatry and Psychology
Neuroscience and Neurobiology
Data Science
Subject
Consolidation
Hippocampus
Learning
Memory
Neural Networks
Replay
Copyright date
01/01/2025
Author
Zhou, Zhenglong
Abstract

How does the brain learn gracefully in ever-changing environments? This requires continually integrating new information without overwriting existing memories, while forming knowledge that generalizes to novel situations—abilities that still elude even the most advanced artificial systems. Classic models attribute this capacity to a division of labor between memory systems: the hippocampus rapidly encodes new experiences using pattern-separated representations that minimize interference and replays them offline in an interleaved fashion, allowing the neocortex to gradually extract structural regularities that support generalization. However, growing evidence challenges this simple dichotomy. Both the hippocampus and neocortex exhibit heterogeneous learning functions and dynamics. The hippocampus, traditionally associated with episodic encoding, also supports rapid generalization. Moreover, both structures show substantial variation in memory codes, plasticity, and functional specialization across subregions. Replay, too, displays dynamics that defy classic interpretations—it is far from a simple recapitulation of recent experience. This thesis aims to provide new ways of understanding memory systems and their interactions that account for these rich dynamics and heterogeneity. In Chapter 2, we show that, in a hippocampally dependent task, interleaved learning enables humans to rapidly form integrated, distributed representations that support generalization. This challenges the view that distributed representations only emerge slowly through cortical learning and offers a mechanistic account of how the brain supports fast generalization.
In Chapter 3, we show that when artificial neural networks incrementally learn a sequence of tasks, meta-learning of plasticity and sparsity enhances their computational efficiency and gives rise to heterogeneous learning systems characterized by a graded organization of plasticity and sparsity that mirrors patterns observed in the brain. This work suggests that distinctions across memory systems may themselves be learned and optimized for behavior. Finally, in Chapter 4, we propose a novel account of replay as a dynamic, context-guided mechanism for memory consolidation. This framework accounts for a wide range of replay phenomena, including replay's deviations from recent experience and the preferential reactivation of remote memories. Together, these findings offer new insight into how the brain orchestrates multiple memory codes and learning systems to support graceful learning over time.
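The interference-and-interleaving logic behind these chapters can be illustrated with a toy model. The sketch below is not from the thesis; it is a minimal, hypothetical demonstration in which a single linear unit trained sequentially on two items with a shared feature overwrites the first item (catastrophic interference), whereas interleaving the two items, as in interleaved replay, lets the same unit retain both. All item patterns, learning rates, and step counts are illustrative choices.

```python
# Illustrative sketch (not the thesis model): catastrophic interference in a
# single linear unit, and how interleaved training avoids it.

def sgd_step(w, x, target, lr=0.2):
    """One delta-rule update on a linear unit y = w . x."""
    y = sum(wi * xi for wi, xi in zip(w, x))
    err = target - y
    return [wi + lr * err * xi for wi, xi in zip(w, x)]

def error(w, x, target):
    """Absolute prediction error on one item."""
    y = sum(wi * xi for wi, xi in zip(w, x))
    return abs(target - y)

# Two items that share a feature (index 1), so their weight updates conflict.
item_a = ([1.0, 1.0, 0.0], 1.0)
item_b = ([0.0, 1.0, 1.0], 0.0)

# Blocked (sequential) training: item A to convergence, then item B.
w = [0.0, 0.0, 0.0]
for _ in range(200):
    w = sgd_step(w, *item_a)
for _ in range(200):
    w = sgd_step(w, *item_b)
blocked_err_a = error(w, *item_a)  # large: learning B overwrote A

# Interleaved training: alternate the two items on every pass.
w = [0.0, 0.0, 0.0]
for _ in range(200):
    w = sgd_step(w, *item_a)
    w = sgd_step(w, *item_b)
inter_err_a = error(w, *item_a)  # small: both items retained
inter_err_b = error(w, *item_b)

print(blocked_err_a, inter_err_a, inter_err_b)
```

Under blocked training the shared weight is pulled toward item B's target and the error on item A grows, while interleaving converges to weights that satisfy both items; this is the classic motivation for interleaved replay in complementary learning systems accounts.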

Advisor
Schapiro, Anna
Date of degree
2025