Research
We study how reliable perception, memory, and action emerge from dynamic neural circuits as animals engage with a changing world. We focus on two questions: how neural computations stabilize (or drift) across days to months, and how closed-loop sensing during movement supports rapid choices in orienting and navigation.
Current Projects
- How feedback supports stable and flexible coding across cortical and subcortical visual circuits
- Circuit basis of active sensing and rapid orienting in natural vision
- How retrosplenial cortex (RSC) stabilizes context, space, and task variables during spatial learning
- How internal state and neuromodulators shape uncertainty-dependent dynamics across dorsal cortex
- Latent learning and cross-modal integration in cortical circuits
- mini2P-based imaging infrastructure with synchronized behavior and cameras for freely moving experiments
Focus Areas
- Long-term stabilization and drift across visual pathways
- Closed-loop sensing and action during natural vision and active orienting
- Spatial and contextual coding in retrosplenial–hippocampal circuits
- State-, neuromodulation-, and uncertainty-dependent dynamics in dorsal cortex
- Latent learning and cross-modal integration in cortical circuits
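One common way to quantify long-term stabilization versus drift is to correlate each cell's tuning curve across sessions recorded days apart. The sketch below is a minimal illustration of that idea on simulated data; the function name, population sizes, and noise levels are all hypothetical, not taken from our pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def tuning_correlation(day_a, day_b):
    """Per-cell Pearson correlation between tuning curves from two sessions.

    day_a, day_b: (n_cells, n_stimuli) arrays of trial-averaged responses.
    """
    a = day_a - day_a.mean(axis=1, keepdims=True)
    b = day_b - day_b.mean(axis=1, keepdims=True)
    num = (a * b).sum(axis=1)
    den = np.sqrt((a ** 2).sum(axis=1) * (b ** 2).sum(axis=1))
    return num / den

# Simulate a stable population (same tuning plus measurement noise)
# and a drifting one (tuning resampled between sessions).
n_cells, n_stim = 50, 12
day1 = rng.normal(size=(n_cells, n_stim))
stable_later = day1 + 0.3 * rng.normal(size=day1.shape)
drifting_later = rng.normal(size=day1.shape)

print(tuning_correlation(day1, stable_later).mean())    # high
print(tuning_correlation(day1, drifting_later).mean())  # near zero
```

Averaging this correlation over the population, or tracking its decay with inter-session interval, gives a simple scalar readout of representational drift.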
Methods
- Longitudinal optophysiology across synapses, cells, and circuits
- Miniature (mini2P) and bench-top two-photon imaging
- Custom closed-loop behavioral paradigms with multimodal head–eye–body and world tracking (ego- and exo-centric)
- Data-driven and hypothesis-driven computational modeling across spatial and temporal scales
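Multimodal tracking only works if imaging frames and camera frames are placed on a shared clock. As a sketch of the alignment step, the snippet below matches each imaging frame to its nearest camera frame by timestamp, flagging frames with no match within a tolerance; the clock rates and the 10 ms tolerance are illustrative assumptions, not a description of any specific rig.

```python
import numpy as np

def align_frames(imaging_ts, camera_ts, max_offset=0.010):
    """For each imaging timestamp, index the nearest camera timestamp.

    Returns an index array into camera_ts, with -1 where no camera
    frame lies within max_offset seconds (e.g. dropped frames).
    Assumes both timestamp arrays are sorted.
    """
    idx = np.searchsorted(camera_ts, imaging_ts)
    idx = np.clip(idx, 1, len(camera_ts) - 1)
    left, right = camera_ts[idx - 1], camera_ts[idx]
    nearest = np.where(imaging_ts - left <= right - imaging_ts, idx - 1, idx)
    ok = np.abs(camera_ts[nearest] - imaging_ts) <= max_offset
    return np.where(ok, nearest, -1)

# Hypothetical 30 Hz imaging clock and 100 Hz camera clock on a shared trigger.
imaging_ts = np.arange(0, 1, 1 / 30)
camera_ts = np.arange(0, 1, 1 / 100)
matches = align_frames(imaging_ts, camera_ts)
```

The same nearest-neighbor matching extends to eye, head, and body streams, with per-stream tolerances set by each sensor's frame interval.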
Directions
- Identify circuit mechanisms that keep neural codes stable yet flexible across days to months, and test their causal role in behavior.
- Develop closed-loop readout and model-based perturbation from synapses to circuits, coupling high-throughput recording to targeted manipulation through a fast, reproducible pipeline for training, inference, and direct model-based intervention.
- Link behavior and neural dynamics with predictive data-driven models trained on large multimodal datasets.
- Pair large-scale recordings with spatial transcriptomics to resolve cell-type dependence and task-/activity-linked gene expression changes.
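At its simplest, a closed-loop readout pipeline fits a decoder offline and then applies it frame by frame online, triggering a perturbation when the decoded variable crosses a criterion. The toy sketch below illustrates that structure with a linear least-squares decoder on simulated data; the variable names, data sizes, and threshold are hypothetical placeholders, not our actual models.

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Training (offline): fit a linear decoder on simulated recordings.
n_trials, n_cells = 200, 40
true_weights = rng.normal(size=n_cells)
activity = rng.normal(size=(n_trials, n_cells))
behavior = activity @ true_weights + 0.1 * rng.normal(size=n_trials)
w, *_ = np.linalg.lstsq(activity, behavior, rcond=None)

# --- Inference (online): decode each incoming frame, flag perturbations.
def closed_loop_step(frame, threshold=2.0):
    """Decode one activity frame; return (estimate, perturb-flag)."""
    estimate = frame @ w
    return estimate, bool(estimate > threshold)

frame = rng.normal(size=n_cells)
estimate, perturb = closed_loop_step(frame)
```

In practice the decode-and-trigger step must run within the imaging frame interval, which is what motivates the fast, reproducible training and inference pipeline above.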