DJ Strouse

Hertz Fellow: DJ Strouse

Princeton University

Area of Study

Theoretical Neuroscience

Fellowship Years

2013 - 2018

DJ Strouse is a PhD student in physics with Professor William Bialek at Princeton University, where he plans to continue his neuroscience research and branch out into other areas of biology. He is currently interested in prediction and inference in biological systems. His research aims to uncover the design principles of biological systems; in particular, he tries to understand organisms as solutions to the computational problems they face. By doing so, his and related research tries to provide “why” answers in a field traditionally focused on the “what” and “how.” This type of approach also often leads to the fascinating realization that many biological systems operate near optimality, that is, near limits set by basic physics or statistics. In all cases, it necessitates a tight coupling between experimental and theoretical work and, on the theory side, liberal borrowing of tools from other fields, including statistical physics, information theory, machine learning, and dynamical systems.

Most of DJ’s recent work has focused on theoretical neuroscience. With Bartlett Mel’s neural computation group at USC, he built a model of how synaptic plasticity and dendritic structure enable brains to rapidly and robustly encode information presented only once. With collaborators from Israel, Germany, and the US, he studied how olfactory information is represented in the mouse brain during sniffing. On a Churchill Scholarship with Máté Lengyel’s computational learning and memory group at Cambridge, he built models of the role that dendrites play in single-neuron computations. And as a side project, he formalized and explored the idea that sensory neural networks might be optimized to respond quickly to new stimuli through a mix of prediction and “anticipation”, i.e., sitting in network states that have fast transition times to states corresponding to likely stimuli.

2018 - Compression, clustering, and learning with the deterministic information bottleneck
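For readers unfamiliar with the thesis topic, the information bottleneck and its deterministic variant can be sketched as follows (a standard formulation, not taken from the thesis itself; X is the input, Y the target, and T the compressed representation):

```latex
% Information bottleneck: compress X into a representation T
% while preserving information about a relevant variable Y.
\min_{q(t \mid x)} \; I(X;T) \;-\; \beta \, I(T;Y)

% Deterministic information bottleneck: the compression cost I(X;T)
% is replaced by the entropy H(T), which favors hard (deterministic)
% assignments of inputs to clusters.
\min_{q(t \mid x)} \; H(T) \;-\; \beta \, I(T;Y)
```

The trade-off parameter β sets how much relevant information is retained relative to the cost of compression.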