Research

Memory formation during natural behavior

Memory underpins many essential behaviors, from social communication to navigation, and is impaired in numerous disorders. Yet despite decades of research into its neural basis, there is still little consensus about which processes are engaged in vivo in natural settings, or how they act to store realistic, complex information structures, such as the arc of a conversation or a path through a new part of town.

I address this problem in fruit fly courtship, a natural social behavior in which the male sings a long, richly patterned acoustic song to court the female. In collaboration with the Murthy, Pillow, and Bialek groups at Princeton, I develop computational models that bridge neural and behavioral data to understand how the female neurally encodes, stores in memory, and responds to the entire history of the male’s song, which changes over tens of milliseconds but can last many minutes. This work suggests a simple, biologically plausible solution: the female continually transforms song via a bank of heterogeneous adaptation processes, then integrates the results into persistent neural activity, enabling the online formation of a multi-dimensional song representation with rich mnemonic information. Under this model, naturalistic song evokes a slow, Brownian-like evolution of the female’s internal neural state, continually increasing song resolution and producing a reservoir of internal timescales that can be sampled to drive multi-scale behavioral modulation. This yields both a more refined understanding of neural auditory processing in fly courtship and a possible general, scalable neuromorphic memory mechanism for remembering and processing extended streaming signals.
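The adaptation-then-integration scheme can be sketched in a few lines. The toy below is an illustration of the general mechanism, not the fitted model: a binary pulse train stands in for the song envelope, and all timescales are assumed values chosen only to span several orders of magnitude.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "song": a binary pulse train standing in for the male's song envelope.
T, dt = 2000, 1.0  # timesteps, ms per step
song = (rng.random(T) < 0.05).astype(float)

# Bank of heterogeneous adaptation processes: each channel high-pass
# filters the song with its own timescale (hypothetical values).
taus_adapt = np.logspace(1, 3, 8)      # 10 ms .. 1 s
taus_integrate = np.logspace(2, 4, 8)  # 100 ms .. 10 s

a = np.zeros(len(taus_adapt))          # adaptation states
m = np.zeros(len(taus_integrate))      # persistent (integrated) states
trace = np.zeros((T, len(m)))

for t in range(T):
    # Adaptation: each channel's state relaxes toward the current drive.
    a += dt * (song[t] - a) / taus_adapt
    adapted = song[t] - a              # adapted (high-passed) song
    # Leaky integration into slowly evolving persistent activity.
    m += dt * (adapted - m) / taus_integrate
    trace[t] = m
```

The integrated state `trace` drifts slowly as song accrues, so a downstream readout can sample different channels to access song history at different temporal resolutions.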

A second question, essential to human behaviors from learning to witness testimony, is how naturalistic memories are reconstructed. Working with Stefano Recanatesi at the University of Washington, I explore how the topological structure of pre-existing neural state spaces can be exploited to reconstruct sequentially organized memories, which in turn enables them to be stored with fast, simple, non-Hebbian plasticity mechanisms. One such state space is that of an internal world model, thought to play a central role in inference and prediction. Our central result is that if the world model state is transformed into a high-dimensional neural code via a fixed pattern-separating network, such as those involved in olfactory coding, complex naturalistic episodes can be represented as paths through the world model state space and stored in memory simply by each neuron integrating its own activity, a simple one-factor plasticity rule. Subsequently, the episode can be reconstructed by applying an odor-tracking algorithm to the world model state space to retrace the path stored in the integrated neural activity. This suggests a parsimonious, expressive mechanism for storing complex episodic memories that obviates the need for coincidence detection, and thus may rely on simpler, faster, and more robust cellular processes. Moreover, we posit that episodic memory may have evolved in part via the repurposing of olfactory and odor-tracking neural circuitry, which has important implications for how we reconstruct memories in everyday life.
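The store-and-retrace idea admits a compact sketch. The version below is a hypothetical toy, not the published model: the world model is a 10x10 grid with moves to the 8 neighbors, pattern separation is a random projection with a sparsifying threshold, and the leak rate and code sizes are assumed values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical world model: states on a 10x10 grid; moves to the
# 8 surrounding cells are the allowed transitions.
grid = [(i, j) for i in range(10) for j in range(10)]

# Fixed pattern-separating expansion, loosely analogous to olfactory-style
# expansion coding: random embedding, random projection, sparse threshold,
# yielding near-orthogonal codes for distinct states.
d, D = 20, 500
embed = {s: rng.standard_normal(d) for s in grid}
W = rng.standard_normal((D, d))

def code(s):
    h = W @ embed[s]
    return (h > np.quantile(h, 0.9)).astype(float)  # ~50 of 500 active

# An episode is a path through the state space.
path = [(i, i) for i in range(10)]

# One-factor storage: each neuron leakily integrates its own activity
# along the path, leaving a recency gradient (no coincidence detection).
leak, memory = 0.9, np.zeros(D)
for s in path:
    memory = leak * memory + code(s)

# Retrieval: odor-tracking-style greedy ascent from the start state,
# stepping to the unvisited neighbor whose code best matches the trace.
def neighbors(s):
    i, j = s
    return [(i + di, j + dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
            if (di, dj) != (0, 0) and 0 <= i + di < 10 and 0 <= j + dj < 10]

recalled = [path[0]]
for _ in range(len(path) - 1):
    cands = [n for n in neighbors(recalled[-1]) if n not in recalled]
    recalled.append(max(cands, key=lambda n: memory @ code(n)))

print(recalled)
```

Because storage is just each neuron summing its own activity, the "plasticity" here is one-factor; the sequential order is recovered at retrieval time by climbing the recency gradient through the world model's adjacency structure.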

Network dynamics and control

Animals and humans can learn highly precise motor behaviors, such as music production or complex sporting maneuvers. In general, producing these behaviors requires tight temporal coordination among multiple motor signals, but we lack a thorough understanding of how this is achieved in the face of noise. Working with Adrienne Fairhall at the University of Washington, I have developed a songbird-inspired model addressing how activity sequences generated in noisy neural networks can unfold over long timescales without accumulating timing errors. We show, at both a mechanistic and theoretical level, how a one-dimensional thalamic-like input to a sequence-generating motor area can quench timing errors, provided the input’s spatial structure reflects its timecourse. This mechanism requires no feedback from or direct interaction among motor subprocesses to keep them coordinated, and instead reflects a purely feed-forward control process, revealing a simple “first-order” model for precision timing control. Intuitively, thalamus could act primarily as a conductor to coordinate multiple downstream motor processes over the long timescales of complex learned skills, with feedback playing more of an auxiliary role.
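The error-quenching intuition can be illustrated with a one-variable caricature rather than the full network model: treat the sequence as a phase x(t) advancing at unit speed with timing noise, and let a feed-forward input whose profile encodes the intended timecourse pull x toward its target. All parameters below are assumed values for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Minimal sketch (not the network model): sequence phase x advances at
# speed v with noise; a feed-forward input pulls x toward the intended
# position v*t with gain k, acting as a "conductor."
dt, steps, v, sigma, k, trials = 0.01, 5000, 1.0, 0.5, 1.0, 200

def final_timing_error(gain):
    """Simulate many noisy sequences; return final phase errors."""
    x = np.zeros(trials)
    for t in range(steps):
        target = v * t * dt
        x += (v + gain * k * (target - x)) * dt
        x += sigma * np.sqrt(dt) * rng.standard_normal(trials)
    return x - v * steps * dt

err_free = final_timing_error(0.0)      # no input: Brownian error growth
err_quenched = final_timing_error(1.0)  # structured input: bounded error

print(err_free.std(), err_quenched.std())
```

Without the input, timing error diffuses and its spread grows with the square root of elapsed time; with it, the dynamics become Ornstein-Uhlenbeck-like and the error saturates, capturing why a purely feed-forward signal can keep long sequences on schedule.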

Finally, I am interested in how chaotic dynamics in random networks could subserve working memory and information processing. My work in this area suggests that when the units of a random network are multi-dimensional, akin to small neural assemblies with a divisive normalization nonlinearity, chaotic activity can stably persist across flexible subsets of these dimensions, creating a substrate that is microscopically dynamic yet macroscopically stable, suitable for combinatorially expressive working memory. By modifying the variances of synaptic strengths between different neuron groups, one can moreover program complex, slowly evolving sequential computations mediated by dynamic variances, rather than means, of input currents to individual neurons. This causes neurons to consistently produce Poisson-like spike trains at all firing rates during general computations, a core feature of neurons in many brain areas that has historically been challenging to model without invoking separate noise processes. This sheds light on how irregular neural dynamics during memory tasks can retain and transform flexible information stably enough to robustly support combinatorially organized behaviors.
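The ingredient of block-structured coupling variances is easy to demonstrate in a standard chaotic rate network. The sketch below uses classic tanh units, not the divisive-normalization assemblies described above, and all gains are assumed values: coupling standard deviations are set per group block, and the resulting activity stays bounded while fluctuating irregularly.

```python
import numpy as np

rng = np.random.default_rng(3)

# Random rate network whose coupling *variances* are structured by group:
# strong within-group variance (chaotic regime), weak cross-group variance.
N, groups = 200, 2
n = N // groups
g_within, g_between = 2.0, 0.3  # std scales per block (assumed values)

# Block-structured standard deviations for the coupling matrix.
stds = np.full((groups, groups), g_between / np.sqrt(N))
np.fill_diagonal(stds, g_within / np.sqrt(N))
S = np.kron(stds, np.ones((n, n)))
J = S * rng.standard_normal((N, N))

# Chaotic rate dynamics: dx/dt = -x + J tanh(x), Euler integration.
dt, steps = 0.05, 4000
x = rng.standard_normal(N)
history = np.empty((steps, N))
for t in range(steps):
    x += dt * (-x + J @ np.tanh(x))
    history[t] = x

# Activity is macroscopically bounded yet microscopically irregular.
late = history[2000:]
print(late.std())
```

Only the variances of `J`, not its mean, differ across blocks; in this regime each unit's input current is a fluctuating, roughly zero-mean signal, which is the basic picture behind Poisson-like irregularity without a separate noise process.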

Online talks

Path vectors: a simple neural code for flexible sequential memory (1:16:12)

Working memory for images atop irregular neural dynamics (13:37)