Seminars

January 14, 2025: (4PM)- Ned Wingreen, Princeton University
Capillary Attraction Underlies Bacterial Collective Dynamics. “Water Is The Driving Force Of All Nature.” — Leonardo da Vinci.
Host: Eric Siggia
Collective motion of active matter occurs in many living systems, such as bacterial communities, epithelial cell populations, bird flocks, and fish schools. A remarkable example can be found in the soil-dwelling bacterium Myxococcus xanthus. Key to the life cycle of M. xanthus cells is the formation of collective groups: they feed on prey in swarms and aggregate upon starvation. However, the physical mechanisms that keep M. xanthus cells together remain unclear. I’ll present a computational model to explore the role that capillary forces play in bacterial collective dynamics. The modeling results, combined with experiments, show that water menisci forming around bacteria mediate strong capillary attraction between cells. The model accounts for a variety of previously observed phases of collective dynamics as the result of a competition between cell-cell capillary attraction and cell motility. Finally, I’ll discuss the large-scale self-organization of bacterial populations and highlight the importance of capillary force in this process. Together, these results suggest that cell-cell capillary attraction provides a generic mechanism underpinning bacterial collective dynamics.
January 28, 2025: (4PM)- Jason Kim, Cornell University
Generating Interpretable, Reliable, And Quantitative Models Of Emergent Behavior From High-Dimensional Data.
Host: Eric Siggia
Natural systems with emergent behaviors often organize along nonlinear low-dimensional subsets of high-dimensional spaces. For example, despite the tens of thousands of genes in the human genome, the principled study of genomics is fruitful because biological processes rely on coordinated organization along lower-dimensional subspaces of phenotypes. To uncover this organization, many dimensionality reduction techniques embed high-dimensional data in low-dimensional spaces by modeling local relationships between data points. However, these methods fail to directly model the subspaces in which the data reside, thereby limiting their ability to infer the biological processes that globally organize the data, and to generalize out-of-distribution. Here, we address this limitation by directly learning a nonlinear subspace that is well-behaved not only in regions where there are data, but also in regions where there are no data, by regularizing the curvature of manifolds generated by autoencoders, a method we coin “Γ-autoencoder.” We demonstrate its utility in a wide range of datasets, including bulk RNA-seq from healthy and cancer tissues, single-cell RNA-seq from cell differentiation, and neural activity from the mouse hippocampus. We discover the global biological programs that emerge as relevant variables, demonstrate superior predictions on data from completely unseen out-of-distribution classes, and consistently learn the same nonlinear subspaces across different random initializations. Broadly, we anticipate that direct modeling of the low-dimensional subspaces that generate and organize data through regularizing the curvature of generative models will enable more interpretable, generalizable, and consistent models in any high-dimensional system with emergent low-dimensional behavior.
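The curvature-regularization idea in the abstract can be illustrated with a toy sketch (this is an assumption-laden illustration, not the speaker's Γ-autoencoder: the `decoder` function, its parameter `w`, and the finite-difference penalty are all hypothetical stand-ins for the trained decoder and its regularizer):

```python
import numpy as np

def decoder(z, w):
    """Hypothetical toy decoder: maps a 1-D latent z to 2-D outputs.
    (Stands in for the decoder half of an autoencoder; w bends the manifold.)"""
    return np.stack([z, w * z**2], axis=-1)

def curvature_penalty(z, w, eps=1e-3):
    """Finite-difference estimate of the decoder's second derivative along the
    latent direction, averaged over sample points. Adding lam * this term to a
    reconstruction loss would penalize curved (poorly extrapolating) manifolds."""
    d2 = (decoder(z + eps, w) - 2 * decoder(z, w) + decoder(z - eps, w)) / eps**2
    return np.mean(np.sum(d2**2, axis=-1))

z = np.linspace(-1.0, 1.0, 50)
flat = curvature_penalty(z, w=0.0)   # linear decoder: zero curvature
bent = curvature_penalty(z, w=1.0)   # curved decoder: positive penalty
```

In a real training loop the total objective would be something like reconstruction loss plus a weighted curvature term; the flat decoder incurs no penalty, while the bent one does.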
February 4, 2025: (4PM)- Yoav Soen, Weizmann Institute of Science
Reorganizations In Complex Systems – Adaptation By Natural Improvisation.
Host: Orli Snir
The traditional view of adaptation focuses on selection of adaptive variations without regard to how these variations come about in the first place (no consideration of emergence). And yet, every single animal is constantly undergoing newly forming variations in its epigenome, microbiome, and even its somatic genome. Many of these variations appear in novel combinations that are unique to the individual. Since every new variation is potentially harmful, it is not clear how an individual can tolerate the large numbers of novel variations that form during its lifetime. We have previously hypothesized that every individual acquires new adaptations by undergoing stochastic variations under (existing) mechanistic constraints that suppress the likelihood of undergoing non-viable changes (i.e., reaching non-viable states). Our group is testing this hypothesis using experimental models of coping with severe stress conditions that mimic unforeseen challenges. Experimental work in progress provides substantial evidence in support of emergent adaptation by constrained exploration (“improvisation”) during the lifetime of individual flies, as well as during the generation time of individual cells in culture. The feasibility of emergent adaptation of this kind is further supported by theoretical models of coping with “unforeseen challenges” presented to specific classes of complex systems. I will describe the conceptual problem of emergent adaptation and its hypothesized solution, present the experimental findings, and discuss the implications for our view of evolution. If time permits, I will also present and discuss theoretical work in progress on emergent adaptation.
February 18, 2025: (4PM)- Gautam Reddy, Princeton University (Location: Smith Hall Annex, A-Level Physics Seminar Room)
Learning Spatial And Temporal Structure In Novel Environments.
Host: Eric Siggia
Learning involves forming associations between events that are separated in space and time. Classical theories of reinforcement learning (RL) explain many aspects of animal learning, but certain important puzzles remain unresolved. I will present two stories involving learning phenomena that are in apparent contradiction with established RL theory: (1) ‘a-ha’ moments while rodents learn to navigate maze-like environments, and (2) how animals measure the passage of time during classical conditioning.
February 25, 2025: (4PM)- Marcella Noorman, Howard Hughes Medical Institute (Janelia)
Maintaining And Updating Accurate Internal Representations Of Continuous Variables With A Handful Of Neurons.
Host: Nikolas Schonsheck
Many animals rely on persistent internal representations of continuous angular variables for working memory, motor control, and navigation. Theories have proposed that such representations are maintained by a class of recurrently connected networks called ring attractor networks. These networks rely on large numbers of neurons to maintain continuous and stable representations and to accurately integrate incoming signals. The head direction system of the fruit fly, however, seems to achieve these properties with a remarkably small network. These findings challenge our understanding of ring attractors and their putative implementation in neural circuits. In this talk, I will show analytically how small networks can overcome the constraints of their size to generate a ring attractor and are hence capable of stably maintaining an internal representation of a continuous, periodic variable. Further, I will show how ring attractors emerge in small threshold linear networks through the coordination of a discrete set of line attractors. More broadly, this work informs our understanding of the functional capabilities of small, discrete systems.
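A minimal sketch of the phenomenon the abstract describes — a small threshold-linear network sustaining a bump of activity whose position encodes an angle. All parameters here (8 neurons, cosine connectivity, the values of J0, J1, b) are illustrative choices, not the speaker's analysis:

```python
import numpy as np

# Toy threshold-linear ring network: 8 neurons with cosine connectivity
# (broad inhibition J0 < 0, local excitation J1 > 0) and a uniform drive b.
N = 8
theta = 2 * np.pi * np.arange(N) / N
J0, J1, b = -2.0, 3.0, 1.0
W = (J0 + J1 * np.cos(theta[:, None] - theta[None, :])) / N

def settle(center, steps=4000, dt=0.01):
    """Euler-integrate dx/dt = -x + [Wx + b]_+ from a bump seeded at `center`."""
    x = np.maximum(np.cos(theta - theta[center]), 0.0)
    for _ in range(steps):
        x += dt * (-x + np.maximum(W @ x + b, 0.0))
    return x

bump0 = settle(0)   # the bump persists where it was seeded...
bump2 = settle(2)   # ...and a rotated seed yields a rotated bump
```

Because the connectivity is rotation-symmetric, the bump is stable at any of the seeded positions — a discrete approximation of the continuous ring attractor, realized here with only a handful of units.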
March 4, 2025: (CANCELLED)- Hava Siegelmann, University of Massachusetts, Amherst
AI for Autonomous Agents: Sequence AI and Peer Cooperative Lifelong Learning.
Host: Marcelo Magnasco
Why are drones still mainly human-controlled, with such limited autonomy? First, drones operate under significant constraints, including limited computational power, energy capacity, and communication bandwidth. Reinforcement learning fails to maintain optimal performance under such constraints. We propose sequence-AI algorithms that significantly improve compute and energy efficiency. Key features include rapid onboard responses, adaptability to dynamic environmental changes, robustness to missing inputs, minimization of sensor usage, the ability to use cheaper sensors to greater effect, and the possibility of using cheaper hardware while maintaining peak effectiveness. The second issue is the need for communication and cooperation among drones. Distributed AI is known to suffer from an explosion of communication needs, which realistic swarms of drones cannot support. We propose a cooperative AI in which the agents are lifelong learners. On the go, they are able to update, learn from failures, and become more expert with experience. This paradigm enables collaborative AI without explosive communication, as well as a great reduction in the required labeled data (teacher), since the agents peer-teach each other. We suggest that these two directions of research will advance us toward true, safe autonomy.
March 11, 2025: (4PM)- Xaq Pitkow, Carnegie Mellon University
Principles For Control When Computation Is Costly.
Host: Nikolas Schonsheck
Thinking is hard. Sometimes it seems better just to hack a solution than to plan it carefully. Here we develop this idea quantitatively, defining a version of stochastic control that accounts for the computational costs of inference. We apply this to Linear Quadratic Gaussian (LQG) control with an added internal cost on information. This creates a trade-off: an agent can obtain more utility overall by sacrificing some task performance, if doing so saves enough mental effort during inference. We discover that the rational strategy that solves the joint inference and control problem goes through phase transitions depending on the task demands, switching from a costly but optimal inference to a family of suboptimal inferences, each interpretable as misestimating the structure of the world. In all cases, the agent moves more to think less. This work provides a foundation for a new type of rational computation that could be used by both brains and machines under strong energy constraints.
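The trade-off the abstract describes can be caricatured in a scalar LQG loop. This sketch does not implement the talk's formal information cost; it simply contrasts a costly optimal inference (a steady-state Kalman filter) with a "cheap" inference that ignores observations, showing the task-performance price of not thinking. All parameters are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Scalar linear system: x' = a*x + u + w, observation y = x + v.
# a < 1 so the no-inference controller still yields a bounded cost.
a, q, r = 0.9, 0.1, 0.1     # dynamics gain, process / observation noise variances
T = 20000

def run(use_filter):
    """Average quadratic state cost under the control law u = -a * x_hat.
    use_filter=True: steady-state Kalman update (costly inference);
    use_filter=False: pure prediction, observations ignored (cheap inference)."""
    p = q
    for _ in range(200):                      # iterate the scalar Riccati equation
        p = a * a * p * r / (p + r) + q
    kgain = p / (p + r)                       # steady-state Kalman gain
    x, xh, cost = 0.0, 0.0, 0.0
    for _ in range(T):
        u = -a * xh                           # drive the estimated state to zero
        x = a * x + u + rng.normal(0, np.sqrt(q))
        y = x + rng.normal(0, np.sqrt(r))
        xh = a * xh + u                       # prediction step
        if use_filter:
            xh += kgain * (y - xh)            # measurement update
        cost += x * x
    return cost / T

cost_kalman = run(True)    # good task performance, effortful inference
cost_blind = run(False)    # degraded performance, no inference effort
```

Adding an explicit internal cost for running the filter, as in the talk, would make the choice between these two regimes (and intermediate misestimating filters) a single rational optimization.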
March 25, 2025: (4PM)- Nathan Lord, University of Pittsburgh (Location: Smith Hall Annex, A-Level Physics Seminar Room)
Mechanisms of Robust Developmental Patterning.
Host: Amy Shyer/Alan Rodrigues
Embryos often communicate instructions to their cells using diffusible signaling molecules called morphogens. In textbook models, morphogens diffuse from a localized source to form a concentration gradient, and target cells select fates by measuring the local morphogen concentration. However, natural patterning systems often incorporate numerous co-factors and extensive signaling feedback, suggesting that embryos require additional means of control to generate reliable patterns. This talk will present our recent results that illuminate how additional regulatory features enable robust pattern formation by the morphogen Nodal in zebrafish embryogenesis. Using a series of mutant embryos engineered to have feedback-compromised patterning systems, we demonstrate that simple ligand diffusion and capture is sufficient to explain the formation of normal Nodal signaling patterns. We further show that negative feedback on signaling, though dispensable under normal circumstances, is required to correct perturbations. Finally, I will present a new optogenetic patterning platform that enables scalable and flexible manipulation of Nodal signaling patterns in live zebrafish embryos.
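The textbook gradient mechanism mentioned at the start of the abstract can be reproduced numerically. This is a generic diffusion-plus-degradation sketch with arbitrary parameters, not a model of Nodal in zebrafish: a morphogen produced at one boundary relaxes to an exponential profile with decay length sqrt(D/k).

```python
import numpy as np

D, k = 1.0, 0.01               # diffusion and degradation rates (arbitrary units)
L, dt, steps = 100, 0.2, 20000
c = np.zeros(L)                # concentration profile on a 1-D line of cells

for _ in range(steps):
    lap = np.zeros(L)
    lap[1:-1] = c[:-2] - 2 * c[1:-1] + c[2:]   # discrete Laplacian
    c += dt * (D * lap - k * c)                # diffusion minus degradation
    c[0] = 1.0                                 # constant source at the boundary
    c[-1] = 0.0                                # distant sink

# estimate the gradient's decay length from the relaxed profile
lam_est = (30 - 10) / np.log(c[10] / c[30])
lam_theory = np.sqrt(D / k)                    # analytic decay length
```

Target cells reading out position from such a gradient would compare the local concentration against thresholds; the talk's point is that real patterning systems layer feedback on top of this bare mechanism to achieve robustness.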
April 1, 2025: (4PM)- Carina Curto, Brown University
Architectural Constraints On Recurrent Network Dynamics.
Host: Nikolas Schonsheck
Recurrent neural networks are capable of producing a wide variety of nonlinear dynamics, including fixed point attractors, limit cycles, quasiperiodic attractors, and chaos. How does the network architecture enable and constrain these dynamics? This question is key to understanding the role of connectomes in neural computation: while a connectome cannot fully determine the function of a network, it creates and constrains the space of possibilities. We explore these constraints in the context of threshold-linear networks, a family of toy models that are simple enough to be studied mathematically while exhibiting the full range of nonlinear dynamics. We study the bifurcation theory as a function of both synaptic weights and neuromodulation and find that different architectures provide different types of constraints. Mathematically, this can be understood via the combinatorial geometry of certain hyperplane arrangements associated to the model.
April 8, 2025: (4PM)- Edouard Hannezo, Institute of Science and Technology Austria
Robustness Of Morphogenesis Via Mechanical Feedbacks.
Host: Amy Shyer/Alan Rodrigues
A central question in biology is how genetic information is integrated across many length scales to shape and pattern cells, organs, and organisms. Theoretical biophysics has proven instrumental in proposing minimal conceptual frameworks to understand the self-organizing potential of living matter, as well as in identifying key predictions that can be verified experimentally. However, a key feature of multicellular development is not simply the emergence of increasingly complex shapes and forms, but the fact that this process is robust and reproducible. In this talk, I will present two recent works from our group on understanding how checkpoints for robustness can emerge from simple mechanical principles. First, in the context of intestinal organoid morphogenesis, we show how mechano-sensitive feedbacks can give rise to mechanical bistability, rendering morphogenesis robust to subsequent mechanical perturbations once it is complete. Second, in the context of early mammalian embryogenesis, we show how mechanical compaction can buffer developmental variability and allow embryos to converge towards robust shapes.
April 15, 2025: (4PM)- Brent Doiron, University of Chicago
Shunting Conductance At Stimulus Onset Quenches Neuronal Variability.
Host: Nikolas Schonsheck
A wealth of experimental studies show that the trial-to-trial variability of neuronal activity is quenched during stimulus-evoked responses. This fact has helped ground a popular view that the variability of spiking activity can be decomposed into two components. The first is due to irregular spike timing conditioned on the firing rate of a neuron (i.e., a Poisson process), and the second is the trial-to-trial variability of the firing rate itself. Quenching of the variability of the overall response is assumed to be a reflection of a suppression of firing rate variability. Network models have explained this phenomenon through a variety of circuit mechanisms. However, in all cases, from the vantage of a neuron embedded within the network, quenching of its response variability is inherited from its synaptic input. We analyze in vivo whole-cell recordings from principal cells in layer (L) 2/3 of mouse visual cortex. While the variability of the membrane potential is quenched upon stimulation, the variability of the excitatory and inhibitory currents afferent to the neuron is amplified. This discord complicates the simple inheritance assumption that underpins network models of neuronal variability. We propose and validate an alternative (yet not mutually exclusive) mechanism for the quenching of neuronal variability. We show how an increase in synaptic conductance in the evoked state shunts the transfer of current to the membrane potential, formally decoupling changes in their trial-to-trial variability. The ubiquity of conductance-based neuronal transfer, combined with the simplicity of our model, provides an appealing framework. In particular, it shows how the dependence of cellular properties upon neuronal state is a critical, yet often ignored, factor. Further, our mechanism does not require a decomposition of variability into spiking and firing rate components, thereby challenging a long-held view of neuronal activity.
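The shunting argument can be sketched with a passive membrane driven by noisy current. All parameters here are illustrative (this is not the recorded data or the speaker's fitted model): in the "evoked" condition the input variance is amplified fourfold, yet the fourfold-larger conductance divides the current by a larger factor and shortens the membrane time constant, so the voltage variance still drops.

```python
import numpy as np

rng = np.random.default_rng(1)

def variances(g, sigma, C=1.0, tau_i=5.0, dt=0.1, steps=300000, burn=5000):
    """Simulate dV/dt = (-g*V + I)/C with I an Ornstein-Uhlenbeck current of
    std `sigma` and correlation time `tau_i`; return (Var(V), Var(I))."""
    v, i = 0.0, 0.0
    vs, js = [], []
    for t in range(steps):
        i += dt * (-i / tau_i) + sigma * np.sqrt(2 * dt / tau_i) * rng.normal()
        v += dt * (-g * v + i) / C
        if t >= burn:                 # discard the transient
            vs.append(v)
            js.append(i)
    return np.var(vs), np.var(js)

var_v_spont, var_i_spont = variances(g=0.05, sigma=1.0)    # "spontaneous" state
var_v_evoked, var_i_evoked = variances(g=0.20, sigma=2.0)  # "evoked" state
```

Here the input current variance is amplified in the evoked state while the membrane potential variance is quenched, with no change in the trial-to-trial structure of the input: the decoupling comes purely from the conductance increase.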
April 22, 2025: (4PM)- Max Wilson, University of California, Santa Barbara
To Be Announced.
Host: Amy Shyer/Alan Rodrigues
To come.
April 29, 2025: (4PM)- Elias Barriga, Technical University Dresden
Mechanoelectrical Actuation Of Tissue Morphogenesis.
Host: Amy Shyer/Alan Rodrigues
To come.
May 6, 2025: (4PM)- Jasna Brujic, New York University (Location: Smith Hall Annex, A-Level Physics Seminar Room)
To Be Announced.
Host: Jialong Jiang
To come.
September 16, 2025: (4PM)- Hava Siegelmann, University of Massachusetts, Amherst
To Be Announced.
Host: Marcelo Magnasco
To come.
September 23, 2025: (4PM)- Mariela Petkova, Harvard University
Exploring The Tension Between Fidelity And Variability In Biology From Genetic To Neural Networks.
Host: Avi Flamholz
Biological systems constantly navigate a delicate balance between reproducibility and variability. Developmental processes exemplify remarkable precision (each person has five fingers), but neural circuits in the brain must maintain a flexible architecture to enable adaptive behavior. In this talk, I explore this tension through two complementary lenses: small genetic networks guiding precise cell identities in fly development, and large neural networks guiding animal behavior. Specific examples include how electric fish subtract self-generated electrical signals to accurately detect prey, and how zebrafish swim upstream in complete darkness by integrating local water-flow rotations in a neural implementation of Stokes’ theorem. By performing quantitative measurements of gene expression and neural connectivity and relating them to the biological function of each network, I demonstrate that simple rules can emerge even from the most intricate networks.
September 30, 2025: (4PM)- Suckjoon Jun, University of California, San Diego
To Be Announced.
Host: Avi Flamholz
To come.
October 14, 2025: (4PM)
To Be Announced.
Host: TBD
To come.
October 28, 2025: (4PM)- Frederick A. Matsen, Fred Hutchinson Cancer Research Center
To Be Announced.
Host: Gabriel Victora
To come.
November 4, 2025: (4PM)
To Be Announced.
Host: TBD
To come.
November 11, 2025: (4PM)- Yufeng Shen, Columbia University
To Be Announced.
Host: Li Zhao
To come.
November 18, 2025: (4PM)
To Be Announced.
Host: TBD
To come.
December 2, 2025: (4PM)
To Be Announced.
Host: TBD
To come.
December 9, 2025: (4PM)
To Be Announced.
Host: TBD
To come.
December 16, 2025: (4PM)
To Be Announced.
Host: TBD
To come.

 

Past Seminars

Click here for past seminars from the Center for Studies in Physics and Biology.



Contact Us

Center for Studies in Physics and Biology
The Rockefeller University
1230 York Avenue
New York, NY 10065