Research
Neural computation in the asynchronous irregular regime
One of the most widely observed features of brain activity in awake mammals is that neurons spike irregularly. This has been documented for decades across many brain areas and conditions and is the central motivation for modeling spike trains as Poisson processes. Yet we still do not know why the brain operates in this curious regime, largely because of the tremendous success of artificial neural networks that do not spike, much less irregularly, together with significant mathematical hurdles to understanding how networks can compute with irregular spikes. In a recent study, however, I found that spiking networks in which neurons of different cell types compete in small groups can produce Poisson-like irregular activity that strongly resembles mammalian brain activity, while also generating flexible macroscopic dynamics amenable to a concise mathematical theory. The theory shows that the emergent macroscopic dynamics have highly favorable properties, such as robust working memory at low firing rates and the ability to produce sequential activity that propagates at controllable speeds. In ongoing work I am investigating how these networks can instantiate complex computations such as pattern completion and time-series transformations, and how to fit them to neural recordings to gain specific, data-driven insights into neural computation through dynamics.
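The irregularity described above is conventionally quantified by the coefficient of variation (CV) of interspike intervals, which equals 1 for a Poisson process and 0 for clock-like firing. A minimal sketch of this statistic, using synthetic spike trains rather than any data or model from the study:

```python
import numpy as np

def isi_cv(spike_times):
    """Coefficient of variation of interspike intervals (ISIs).
    CV ~ 1 indicates Poisson-like irregular spiking; CV ~ 0 is clock-like."""
    isis = np.diff(np.sort(spike_times))
    return isis.std() / isis.mean()

rng = np.random.default_rng(0)

# Poisson-like spiking: exponentially distributed ISIs at 5 Hz
poisson_spikes = np.cumsum(rng.exponential(1 / 5.0, size=1000))

# Perfectly regular spiking: fixed 200 ms intervals
regular_spikes = np.arange(1000) * 0.2

print(isi_cv(poisson_spikes))  # close to 1
print(isi_cv(regular_spikes))  # close to 0
```

Real cortical spike trains in awake animals typically show CVs near or above 1, which is the empirical signature motivating the Poisson-process abstraction.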
Episodic memory and learning
Much of our day-to-day learning occurs not through the repeated trials typically studied in the laboratory, but after single experiences like a chat with a friend or a walk through a new part of town. These experiences, or episodes, often have rich patterning that we absorb almost immediately and that subsequently influences our future cognitive and behavioral processes. But how physical processes in the brain support episodic learning is not understood. In recent work, we showed how episodic memory and learning could be supported by much simpler neural plasticity mechanisms than previously thought, by relying on pre-existing structures, or priors, over the neural activity evoked by such experiences. In particular, we found that episodes corresponding to paths through a wide variety of complex state spaces could be accurately represented using plasticity rules that depend only on pre-synaptic activity (in contrast to the long-standing Hebbian theory of memory), such as those found in the hippocampal mossy fiber pathway, and subsequently reconstructed with an odor-tracking algorithm. A crucial consequence is that episodic memory reconstruction may strongly depend on one's priors---for instance, one's model of the world---which has wide implications for social institutions from witness testimony to psychotherapy. In ongoing work I am investigating how the neural codes for such episodes can be used for complex cognitive computations, and how similar biological plasticity rules can be leveraged for rapid learning in neuromorphic dynamical systems.
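The contrast with Hebbian learning can be made concrete with a toy sketch: a presynaptic-only rule tags each state visited along a path with a single scalar, regardless of postsynaptic activity, and a greedy trail-following readout recovers the path. Everything here, including the ring-shaped state space and the readout, is an illustrative assumption, not the actual rule or odor-tracking algorithm from the study:

```python
import numpy as np

def encode_path(path, n_states):
    """Presynaptic-only plasticity: every visit by an active presynaptic
    neuron increments its tag, independent of postsynaptic activity."""
    w = np.zeros(n_states)
    for state in path:
        w[state] += 1.0  # depends only on presynaptic activity
    return w

def reconstruct(w, adjacency, start, steps):
    """Greedy trail-following readout (a crude stand-in for odor tracking):
    repeatedly move to the most strongly tagged neighbor, consuming tags."""
    w = w.copy()
    state, out = start, [start]
    w[start] -= 1.0  # consume the start tag so the trail moves forward
    for _ in range(steps):
        tagged = [s for s in adjacency[state] if w[s] > 0]
        if not tagged:
            break
        state = max(tagged, key=lambda s: w[s])
        w[state] -= 1.0
        out.append(state)
    return out

# Toy state space: a ring of 6 states
adj = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
path = [0, 1, 2, 3]
w = encode_path(path, n_states=6)
print(reconstruct(w, adj, start=0, steps=3))  # [0, 1, 2, 3]
```

Note that the stored weights say nothing about the order of the path; recovering the sequence relies entirely on the prior embodied by the readout and the known state-space structure, which is the sense in which reconstruction depends on one's model of the world.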
Discovering neural computations from natural behavior data
With modern recording technology and tracking algorithms, collecting and pre-processing natural animal behavior data has never been easier. How can we best use these data to gain deeper insights into brain function? In a recent study we showed how to use purely behavioral data from unrestrained, freely moving animals to test candidate neural codes for naturalistic sensory stimuli, using a method we termed Natural Continuation. By formalizing neural coding models that encompass distinct hypotheses, fitting them to existing neural data, and then using them to simulate artificial neural activity alongside natural behavior data, we found that certain models' simulated activity could predict held-out behavior much better than others, sometimes even outperforming state-of-the-art deep networks. We applied our approach to the neural population codes for fruit fly social communication signals during courtship, revealing a previously unknown model for the neural coding of input sequences that generalizes naturally to other systems and has specific advantageous computational properties, such as the ability to retain fine-scale temporal information for long periods. This work thus shows how purely behavioral data collected in natural settings can be leveraged to generate insights into biologically plausible neural computations even before new neural data are collected---a crucial advance toward minimizing invasive animal surgeries and accelerating future neural experiments.
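The logic of comparing candidate codes by their ability to predict held-out behavior can be sketched in miniature. The two candidate codes, their parameters, and the synthetic stimulus/behavior data below are all illustrative assumptions standing in for the courtship-song stimuli, fly locomotion, and fitted models of the actual study:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(model, stim):
    """Two candidate neural codes for a time-varying stimulus:
    'instant' tracks the stimulus directly; 'integrator' leakily accumulates it."""
    if model == "instant":
        return stim.copy()
    r = np.zeros_like(stim)
    for t in range(1, len(stim)):
        r[t] = 0.9 * r[t - 1] + stim[t]
    return r

# Synthetic data: behavior driven by a noisy leaky integral of the stimulus
T = 2000
stim = rng.standard_normal(T)
behavior = simulate("integrator", stim) + 0.1 * rng.standard_normal(T)

def heldout_r2(activity, behavior, split=1000):
    """Fit a scalar linear readout on the first half; score R^2 on the second."""
    a, b = activity[:split], behavior[:split]
    gain = (a @ b) / (a @ a)
    resid = behavior[split:] - gain * activity[split:]
    return 1 - resid.var() / behavior[split:].var()

for model in ("instant", "integrator"):
    print(model, round(heldout_r2(simulate(model, stim), behavior), 3))
```

Here the integrator code predicts held-out behavior far better than the instantaneous code, correctly identifying the generative model from behavior alone; the real method plays the same game with models fit to existing neural recordings rather than hand-specified dynamics.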