Past Seminars

January 9, 2024: (Special Time: 10AM, Physics Fellow Candidate) – Dominic Skinner, Northwestern University
Statistical Physics Of Embryonic Transcriptomes Reveals Map Of Cellular Interactions.
Host: Eric Siggia
Starting from one totipotent cell, complex organisms form through a series of differentiation events, resulting in a multitude of cell types. In the ascidian embryo, differentiation happens early. For instance, by the 32-cell stage there are at least 10 transcriptomically distinct cell states. Moreover, cells coordinate within the embryo to differentiate in an extremely precise spatial pattern. Using recent single-cell sequencing data of early ascidian embryos, we leverage natural variation together with techniques from statistical physics to investigate development at the level of a complete interconnected embryo. After robustly identifying distinct transcriptomic states or cell types, a statistical analysis reveals correlations within embryos and across cell types beyond mean expression levels. From these intra-embryo correlations, we infer minimal networks of cell-cell interactions using regularization and spin-glass-like models of interacting systems, revealing spatial connections that are of key importance in development.
January 10, 2024: (Special Time: 12:00PM) – Uri Alon, Weizmann Institute of Science
Mathematical Essence Of Aging.
Host: Stan Leibler
Aging shows nearly universal quantitative patterns. We explain them using a stochastic ODE for damage production and removal, deduced from experiments on damage dynamics in mice and in individual bacteria, the latter performed by our own lab. This simple model explains a wide range of phenomena in human aging and age-related diseases, as well as in model organisms. It pinpoints core molecular and cellular drivers of aging, and suggests interventions that, at least in mice, can compress the relative sick span (the fraction of lifespan during which an individual is disabled).
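The damage model described above can be caricatured as a first-passage problem: damage is produced at a rate that rises with age, removed in proportion to its current level, and death occurs when damage crosses a threshold. A minimal sketch of this idea is below; the parameter values (eta, beta, eps, Xc) are illustrative placeholders, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for a damage variable X(t): production rises
# linearly with age, removal is proportional to X, and fluctuations are
# white noise. Death occurs when X first crosses the threshold Xc.
eta, beta, eps, Xc = 0.001, 0.15, 0.02, 1.0
dt, t_max, n_individuals = 0.1, 400.0, 500

def lifespan():
    X, t = 0.0, 0.0
    while t < t_max:
        noise = np.sqrt(2 * eps * dt) * rng.standard_normal()
        X += (eta * t - beta * X) * dt + noise
        X = max(X, 0.0)          # damage cannot be negative
        t += dt
        if X >= Xc:              # first-passage time = death
            return t
    return t_max

spans = np.array([lifespan() for _ in range(n_individuals)])
print(f"median lifespan: {np.median(spans):.1f}")
```

Collecting first-passage times over many simulated individuals yields a survival curve whose hazard rises steeply with age, the qualitative signature such saturating-removal models are meant to capture.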
January 11, 2024: (Special Time: 10AM, Physics Fellow Candidate) – Daniel Barabasi, Harvard University
Nature Over Nurture: How Complex Computations Emerge From Developmental Priors.
Host: Eric Siggia
An important challenge of modern neuroscience is to unveil the mechanisms shaping the wiring of the connectome, answering the difficult question of how the brain wires itself. Neuronal systems display a high degree of wiring reproducibility, such that multiple circuits and architectural features appear to be identical within a species, invariants that network models are unable to explain. This is because the architectures of neural circuits aren’t fully learned, as recent advances in systems neuroscience and AI would have us believe, nor is our wiring directly determined by our DNA; instead, our genome provides assembly rules that cells use to self-organize into a functional brain, much like how the simple rules of cellular automata generate complex emergent behavior. To illustrate how complex computations can emerge from seemingly noisy, self-assembling processes, I derive a neurodevelopmental encoding of artificial neural networks that considers the weight matrix of a neural network to be emergent from well-studied rules of neuronal compatibility. Rather than updating the network’s weights directly, we improve task fitness by updating the neurons’ wiring rules, thereby mirroring evolutionary selection on brain development. We find that our model (1) provides sufficient representational power for high accuracy on ML benchmarks while also compressing parameter count, and (2) can act as a regularizer, selecting simple circuits that provide stable and adaptive performance on metalearning tasks. Finally, I will discuss how such physical models of directed, self-assembling systems can both (1) advance developmental understanding and (2) provide a fresh perspective on how evolution balances nature and nurture in neural circuits.
January 16, 2024: (Special Time: 10AM, Physics Fellow Candidate) – Francois Bourassa, McGill University
Theory Of Antigen Encoding And Cross-Receptor Interactions In T Cell Immunotherapy.
Host: Eric Siggia
The mechanisms connecting early T cell receptor (TCR) activation to complex T cell responses in the immune system have not been fully elucidated. Understanding these processes quantitatively is, however, crucial for fine-tuning immunotherapy treatments against cancer. To systematically map out T cell activation, the lab of Grégoire Altan-Bonnet has developed a robotic platform which tracks over days the dynamics of messenger proteins, called cytokines, produced by T cells to communicate with other cells. We found a low-dimensional representation of high-dimensional cytokine dynamics in which trajectories are ordered according to antigen strength. We termed this property “antigen encoding” and quantified it using information theory and nonlinear dynamical equations. We then leveraged these insights to disentangle cross-receptor interactions in chimeric antigen receptor (CAR) T cells used in cancer immunotherapy. In particular, we developed an adaptive model of receptor proofreading to explain antagonism (i.e. inhibition) of CAR activation by weak TCR stimulation. Our model predictions quantitatively matched experimental data, enabling us to engineer antagonism to reduce CAR T cell toxicity against healthy tissues.
January 18, 2024: (Special Time: 10AM, Physics Fellow Candidate) – Jialong Jiang, California Institute of Technology
Revealing Regulatory Network Organization Through Single-Cell Perturbation Profiling And Maximum Entropy Models.
Host: Eric Siggia
Gene regulatory networks control cellular information processing and response to signals and environmental changes. Perturbations are widely used in genetics to decode gene interactions, yet extracting regulatory network models from large-scale single-cell perturbation profiling remains a significant challenge. We developed a framework, D-SPIN, that constructs regulatory network models from single-cell data collected across thousands of perturbations. Using the maximum entropy principle, D-SPIN identifies a unified regulatory network model of the single-cell population and the effects of each perturbation at the gene-program level. D-SPIN enables accurate network reconstruction and provides a reasoning framework for how cell states are constructed through pairwise interactions between gene programs. Using genome-wide Perturb-seq data, D-SPIN reveals different strategies of homeostasis regulation in the cancer cell stress response. Using drug profiling data, D-SPIN dissects responses in heterogeneous cell populations to elucidate how drug combinations induce novel cell states through additive recruitment of gene programs. Moreover, D-SPIN facilitates perturbation design by finding sloppy model parameters and informative perturbations to pin down these parameters. The framework extends to a wide range of applications, including signaling responses of immune populations and identifying cell-cell interaction networks from spatial transcriptomics profiling. In conclusion, D-SPIN provides a computational framework for constructing interpretable models of gene regulatory networks to reveal principles of cellular information processing and physiological control, and strategies for designing perturbations for efficient network inference.
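The maximum entropy principle behind this style of network inference can be illustrated at toy scale. The sketch below (hypothetical sizes and learning rate; not the D-SPIN code) fits an Ising-like pairwise model to binary "gene program" states by iteratively matching the data's means and pairwise correlations, which is exactly the fixed-point condition the maximum entropy distribution must satisfy.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

# Toy "gene program" activity: binary +/-1 states for n programs.
n, n_samples = 4, 2000
true_h = rng.normal(0, 0.5, n)
true_J = np.triu(rng.normal(0, 0.5, (n, n)), 1)

states = np.array(list(itertools.product([-1, 1], repeat=n)))

def boltzmann(h, J):
    # P(s) ~ exp(h.s + sum_{i<j} J_ij s_i s_j), enumerated exactly.
    E = states @ h + np.einsum('si,ij,sj->s', states, J, states)
    p = np.exp(E)
    return p / p.sum()

# Sample "data" from the true model, then fit h, J by gradient ascent
# so the fitted model reproduces the data's means and pair correlations.
p_true = boltzmann(true_h, true_J)
data = states[rng.choice(len(states), size=n_samples, p=p_true)]
m_data = data.mean(0)
C_data = np.triu((data[:, :, None] * data[:, None, :]).mean(0), 1)

h, J = np.zeros(n), np.zeros((n, n))
for _ in range(2000):
    p = boltzmann(h, J)
    m_model = states.T @ p
    C_model = np.triu(np.einsum('s,si,sj->ij', p, states, states), 1)
    h += 0.1 * (m_data - m_model)       # match first moments
    J += 0.1 * (C_data - C_model)       # match second moments

print(np.abs(m_data - states.T @ boltzmann(h, J)).max())
```

Because the log-likelihood is concave in (h, J), this moment-matching ascent converges; at real scale the exact enumeration is replaced by sampling or mean-field approximations.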
February 15, 2024: (Special Time: 10AM, Physics Fellow Candidate) – Nikolas Schonsheck, University of Delaware
Detecting And Learning Cyclic Structures In Neural Population Coding.
Host: Eric Siggia
Cyclic structures are a class of mesoscale features ubiquitous in both experimental stimuli and the activity of neural populations encoding them. Important examples include encoding of head direction, grid cells in spatial navigation, and orientation tuning in visual cortex. While cyclic structures are difficult to detect and analyze with classical methods, tools from the mathematical field of algebraic topology have proven to be particularly effective in understanding cyclic structures. Recently, work of Yoon et al. develops a topological framework to match cyclic coding patterns in distinct populations that encode the same information. We leverage this framework to study the efficacy of Hebbian learning rules in propagating cyclic structures through neural systems. Our primary results are 1) feedforward networks with connections drawn from inhibitory-biased random distributions do not reliably propagate cyclic features of neural coding 2) updating network connections with a biologically realistic Hebbian learning rule modeling spike timing dependent plasticity robustly constructs networks that do propagate cyclic features and 3) under biologically plausible parameter choices, the inhibition and propagation of such features can be modulated by the size of the output neuron population.
February 27, 2024: (4PM) – Noam Shental, Open University of Israel
High-Resolution Microbial Profiling Of Novel Niches And A Pan-Microbiome Knowledge Base.
Host: Orli Snir
I will present two computational tools for microbiome research recently developed by my group – the first allows high-resolution microbial profiling of novel niches (https://www.biorxiv.org/content/10.1101/2023.09.03.556087v1, under review), and the second is a bacterial knowledge base that has collected more than 1.5 million sequence-to-phenotype associations and allows extracting pan-microbiome biological insights (https://academic.oup.com/nar/article/51/13/6593/7199329).
March 5, 2024: (4PM) – Liat Shenhav, New York University
It’s About Time: Ecological And Eco-Evolutionary Dynamics Across The Scales.
Host: Bertrand Ottino-Loffler
Complex microbial communities play a vital role across many domains of life, from the female reproductive tract, through the oceans, to the plant rhizosphere. The study of these communities offers great opportunities for biological discovery, due to the ease of their measurement, the ability to perturb them, and their rapidly evolving nature. Yet their complex composition, dynamic nature, and intricate interactions with multiple other systems make it difficult to extract robust and reproducible patterns from these ecosystems. To uncover their latent properties, I develop models that combine longitudinal data analysis and statistical learning, and which draw from principles of community ecology, complexity theory and evolution. I will briefly present methods for decomposition of microbial dynamics at an ecological scale (Shenhav et al., Nature Methods; Martino & Shenhav et al., Nature Biotechnology). Using these methods, we found significant differences in the trajectories of the infant microbiome in the first years of life as a function of early-life exposures, namely mode of delivery and breastfeeding. I will then show how incorporating eco-evolutionary considerations allowed us to detect signals of purifying selection across ecosystems. I will demonstrate how interactions between evolution and ecology played a vital role in shaping microbial communities and the standard genetic code (Shenhav & Zeevi, Science; Liao & Shenhav, Nature Communications). Inspired by these discoveries, I am expanding the scope beyond the microbiome, modeling multi-layered data on human milk composition. I will present results from an ongoing study in which I am building integrative models of nasal, gut and milk microbiota, combined with human milk components, to predict infant respiratory health. I found that the temporal dynamics of microbiota in the first year of life, mediated by milk composition, predict the development of chronic respiratory disease later in childhood.
These models, designed to identify robust spatiotemporal patterns, would help us better understand the nature and impact of complex ecosystems like the microbiome and human milk from the time of formation and throughout life.
March 19, 2024: (4PM) – Mason Porter, University of California, Los Angeles
Topological Data Analysis Of Spatial Systems.
Host: Bertrand Ottino-Loffler
I will discuss topological data analysis (TDA), which uses ideas from topology to quantify the “shape” of data. I will focus in particular on persistent homology (PH), which one can use to find “holes” of different dimensions in data sets. I will start by introducing these ideas and will discuss a series of examples of TDA of spatial systems. The examples that I’ll discuss include voting data, the locations of polling sites, the spread of COVID-19, and the webs of spiders under the influence of various drugs.
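As a concrete illustration of the zero-dimensional part of persistent homology (a sketch of the general technique, not material from the talk): every point starts as its own connected component, and as the distance threshold grows, components merge; each merge is the "death" of a component. Union-find over the sorted edges of the Vietoris-Rips filtration computes this directly. Detecting higher-dimensional holes, such as the H1 loop of the circular point cloud below, requires a dedicated library such as Ripser or GUDHI.

```python
import numpy as np

rng = np.random.default_rng(2)

# Points sampled near a circle: one connected component should persist
# to infinity, and all other components die as the threshold grows.
theta = rng.uniform(0, 2 * np.pi, 30)
pts = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(30, 2))

parent = list(range(len(pts)))
def find(i):
    # Union-find root lookup with path compression.
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

# Sort all pairwise edges by length; each merge of two components is
# the death of one component (all points are born at threshold 0).
d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
edges = sorted((d[i, j], i, j) for i in range(len(pts)) for j in range(i))
deaths = []
for length, i, j in edges:
    ri, rj = find(i), find(j)
    if ri != rj:
        parent[ri] = rj
        deaths.append(length)   # one component dies at this threshold

# n points -> n-1 merges; the last surviving component never dies.
print(len(deaths), max(deaths))
```

Plotting the birth-death pairs (here all births are 0) gives the H0 persistence diagram; long-lived features are the "shape" signal, short-lived ones are noise.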
March 26, 2024: (4PM) – Stefano Di Talia, Duke University
Encoding Tissue Size And Shape During Vertebrate Regeneration.
Host: Woonyung Hur
Some animals have a remarkable ability to regenerate appendages and other damaged organs. I will focus on our attempts to reveal novel quantitative principles for the control of regeneration in zebrafish. I will describe how signaling waves and long-range gradients are used to control tissue growth and ensure that tissues grow back to their correct size and shape.
April 2, 2024: (4PM) – Terry Hwa, University of California, San Diego
Quantitative Rules Govern Protein Expression And Activity Across The Bacterial Phylogeny.
Host: Eric Siggia
Distinct bacterial species thrive under distinct growth conditions. Even species sharing similar optimal conditions can grow at vastly different rates; e.g., Vibrio natriegens grows more than 50% faster than E. coli and B. subtilis in the same common growth media at 37°C. What do the super-fast growers do differently? Quantitative proteomics reveals surprisingly rigid programs of proteome allocation across bacteria irrespective of phylogeny, distinguished mainly by the speed of their enzymes and by their metabolic orientations.
April 16, 2024: (4PM) – Michail Tsodyks, Institute for Advanced Studies
Studying Human Memory for Random and Meaningful Material: A Comparative Study.
Host: Merav Stern
We consider recognition and recall experiments on random lists of words versus meaningful narratives. A mathematical model based on a specific recall algorithm for random lists established a universal relation between the number of words retained in memory and the number that can, on average, be recalled, characterized by a square-root scaling. This relation is expressed by an analytical expression with no free parameters and was confirmed experimentally to surprising precision in online experiments. To extend this research to meaningful narratives, we took advantage of recently developed large language models that can generate meaningful text and respond to instructions in plain English with no additional training necessary. We developed a pipeline for designing large-scale memory experiments and analyzing the obtained results. We performed online memory experiments with a large number of participants and collected recognition and recall data for narratives of different lengths. We found that both recall and recognition performance scale linearly with narrative length. Furthermore, to investigate the role of narrative comprehension in memory, we repeated these experiments using scrambled versions of the presented stories. We found that even though recall performance declined significantly, recognition remained largely unaffected. Interestingly, recalls in this condition seem to follow the original narrative order rather than the scrambled presentation, pointing to a contextual reconstruction of the story in memory.
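The random-list recall model can be sketched as a deterministic walk on a random similarity matrix: retrieval jumps from the current item to its most similar neighbor (or the second most similar, to avoid immediately bouncing back), and the walk eventually enters a cycle; the number of distinct items visited is the number recalled. This simplified rendering (my own toy parameters and trial counts, not the authors' code) reproduces the sub-linear, roughly square-root growth of recall with the number of retained items.

```python
import numpy as np

rng = np.random.default_rng(4)

def n_recalled(N):
    # Symmetric random similarity matrix between the N retained items.
    S = rng.normal(size=(N, N)); S = S + S.T
    np.fill_diagonal(S, -np.inf)
    prev, cur = 0, int(np.argmax(S[0]))
    visited, seen = {prev, cur}, set()
    # Deterministic walk: go to the most similar item, avoiding an
    # immediate return to the previous one; stop when the walk cycles.
    while (prev, cur) not in seen:
        seen.add((prev, cur))
        order = np.argsort(S[cur])[::-1]
        nxt = int(order[0]) if order[0] != prev else int(order[1])
        prev, cur = cur, nxt
        visited.add(cur)
    return len(visited)

means = {N: np.mean([n_recalled(N) for _ in range(50)])
         for N in (32, 128, 512)}
print(means)   # quadrupling N roughly doubles recall, not quadruples it
```

A fourfold increase in list size yielding only a roughly twofold increase in recall is the square-root signature; the analytical version of this relation has no free parameters.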
April 23, 2024: (4PM) – Danny Abrams, Northwestern University
Careful Or Colorful? The Evolution Of Animal Ornaments.
Host: Bertrand Ottino-Loffler
Extravagant and costly ornaments (e.g., deer antlers or peacock feathers) are found throughout the animal kingdom. Charles Darwin was the first to suggest that female courtship preferences drive ornament development through sexual selection. In this talk I will describe a minimal mathematical model for the evolution of animal ornaments and will show that even a greatly simplified model makes nontrivial predictions for the types of ornaments we expect to find in nature.
April 30, 2024: (4PM) – Vikram Gadagkar, Columbia University
Neural Mechanisms Of Performance Evaluation In Singing Birds.
Host: Philip Kidd
Many behaviors are learned through trial and error by matching performance to internal goals, yet neural mechanisms of performance evaluation remain poorly understood. We recorded basal ganglia–projecting dopamine neurons in singing zebra finches as we controlled perceived song quality with distorted auditory feedback. Dopamine activity was suppressed after distorted syllables, consistent with worse-than-predicted performance, and activated when a predicted distortion did not occur, consistent with better-than-predicted performance. Thus, dopaminergic error signals can evaluate behaviors that are learned, not for reward, but by matching performance to internal goals. We then developed new computational methods to show that spontaneous dopamine activity correlated with natural song variations, demonstrating that dopamine can evaluate natural behavior unperturbed by experimental events such as cues, distortions, or rewards. Attending to mistakes while practicing alone provides opportunities for learning, but self-evaluation during audience-directed performance could distract from ongoing execution. It remains unknown how animals switch between, and process errors during, practice and performance modes. When male zebra finches transitioned from singing alone to singing female-directed courtship song, singing-related error signals were reduced or gated off and dopamine neurons were instead activated by female calls. Dopamine neurons can thus dynamically re-tune from self-evaluation to social feedback during courtship.
May 9, 2024: (4PM) – Arjun Karuvally, University of Massachusetts-Amherst
Hidden Traveling Waves in Artificial Recurrent Neural Networks Encode Working Memory.
Host: Marcelo Magnasco
Traveling waves are integral to brain function and are hypothesized to be crucial for short-term information storage. This study introduces a theoretical model based on traveling wave dynamics within a lattice structure to simulate neural working memory. We theoretically analyze the model’s capacity to represent state and temporal information, which is vital for encoding the recent history in history-dependent dynamical systems. In addition to enabling robust short-term memory storage, our analysis reveals that these dynamics can alleviate the vanishing gradient problem, which poses a significant challenge in the practical training of recurrent neural architectures. We explore the model’s application under two boundary conditions: linear and non-linear, the latter driven by self-attention mechanisms. Experimental findings show that randomly initialized and backpropagation-trained recurrent neural networks (RNNs) naturally exhibit linear traveling wave dynamics, suggesting a potential working memory mechanism within these networks. This mechanism remains concealed within the high-dimensional state space of the RNN and becomes apparent through a specific basis transformation proposed by our model. In contrast, the non-linear scenario aligns with the autoregressive loops of attention-based transformers, which drive the AI revolution. The results highlight the profound impact of traveling waves on artificial intelligence, improving our understanding of existing black-box neural computation and offering a foundational theory for future enhancements in neural network design.
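The simplest version of this mechanism is a linear RNN whose recurrent weights implement a pure shift: activity travels one lattice slot per step, so the hidden state is a verbatim buffer of the recent input history. The sketch below also shows how a change of basis can hide the wave inside an apparently dense weight matrix (the random matrix P here is my own illustrative construction, not the paper's specific transformation).

```python
import numpy as np

n = 8
W = np.eye(n, k=-1)            # shift operator: activity travels down the lattice
b = np.zeros(n); b[0] = 1.0    # inputs enter at slot 0

h = np.zeros(n)
for x in [0.5, -1.0, 2.0, 0.25]:
    h = W @ h + b * x          # wave advances, newest input written in front

print(h[:4])                   # most recent first: [0.25, 2.0, -1.0, 0.5]

# The same dynamics in a rotated basis look like a dense, unstructured
# recurrent matrix: h' = P h evolves under W' = P W P^-1.
rng = np.random.default_rng(5)
P = rng.normal(size=(n, n))
W_hidden = P @ W @ np.linalg.inv(P)

x_new = 1.5
h_next = W @ h + b * x_new                            # wave picture
h_hidden_next = W_hidden @ (P @ h) + (P @ b) * x_new  # dense picture
print(np.allclose(P @ h_next, h_hidden_next))         # True: same dynamics
```

Undoing the basis change is what exposes the wave; conversely, a trained RNN may carry exactly this structure without it being visible in the raw weights.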
May 14, 2024: (4PM) – Jorn Dunkel, Massachusetts Institute of Technology
Quantitative Model Inference For Living Matter.
Host: Eric Siggia
Recent advances in live-imaging techniques provide dynamical data ranging from the cellular to the organism scale. Notwithstanding such experimental progress, quantitative theoretical models often remain lacking, even for moderately complex classes of biological systems. Here, I will summarize our ongoing efforts to implement computational frameworks for inferring predictive dynamical equations from multi-scale imaging data. As specific examples, we will consider models for cell locomotion, neural dynamics, mosquito flight behavior, and collective animal swarming.
September 17, 2024: (4PM) – Baruch Barzel, Bar-Ilan University
Network GPS – Navigating Network Dynamics.
Host: Merav Stern
In the past two decades we have made significant advances in mapping the structure of social, biological and technological networks. The challenge that remains is to translate everything we know about network structure into its observed dynamics. In essence, whether it’s communicable diseases, genetic regulation, or the spread of failures in an infrastructure network, these dynamics boil down to patterns of information spread in the network. It all begins with a local perturbation, such as a sudden disease outbreak or a local power failure, which then propagates to impact all other nodes. The challenge is that the resulting spatio-temporal propagation patterns are diverse and unpredictable – indeed, a zoo of spreading patterns – and seem only loosely connected to the network structure. We show that we can tame this zoo by exposing a systematic translation of network structural elements into their dynamic outcomes, allowing us to navigate networks and, most importantly, to expose a deep universality behind their seemingly diverse dynamics. Along the way, we predict how long it takes for viruses to spread between countries, which metabolites contribute most to a system’s information flow, and how to resuscitate a collapsed microbial network back into functionality.
October 15, 2024: (4PM) – Bertrand Ottino-Loffler, Rockefeller University
On Possible Indicators Of Negative Selection In Germinal Centers.
Host: Merav Stern
A central feature of vertebrate immune response is affinity maturation, wherein antibody-producing B cells undergo evolutionary selection in microanatomical structures called germinal centers, which form in secondary lymphoid organs upon antigen exposure. While it has been shown that the median B cell affinity dependably increases over the course of maturation, the exact logic behind this evolution remains vague. Three potential selection methods include encouraging the reproduction of high affinity cells (“birth/positive selection”), encouraging cell death in low affinity cells (“death/negative selection”), and adjusting the mutation rate based on cell affinity (“mutational selection”). While all three forms of selection would lead to a net increase in affinity, different selection methods may lead to distinct statistical dynamics. We present a tractable model of selection and analyze proposed signatures of negative selection. Given the simplicity of the model, such signatures should be stronger here than in real systems. However, we find a number of intuitively appealing metrics — such as preferential ancestry ratios, terminal node counts, and mutation count skewness — require nuance to properly interpret.
October 22, 2024: (4PM) – Karen E. Kasza, Columbia University
Stress Management: Dissecting How Epithelial Tissues Flow And Fold Inside Developing Embryos.
Host: Eric Siggia
During embryonic development, groups of cells reorganize into functional tissues with complex form and structure. Tissue reorganization can be rapid and dramatic, often occurring through striking embryo-scale flows or folds that are mediated by the coordinated actions of hundreds or thousands of cells. These types of tissue movements can be driven by internal forces generated by the cells themselves or by external forces. While much is known about the molecules involved in these cell and tissue movements, it is not yet clear how these molecules work together to coordinate cell behaviors, give rise to emergent tissue mechanics, and generate coherent tissue movements at the embryo scale. To gain mechanistic insight into this problem, my lab develops and uses optogenetic technologies for manipulating mechanical activities of cells in the developing Drosophila embryo. First, I will discuss how mechanical forces are regulated in space and time to drive tissue flows that rapidly and symmetrically elongate the head-to-tail body axis of the embryo. Second, I will discuss some of our recent findings on the biological and physical mechanisms underlying distinct modes of generating curvature and folds in epithelial tissue sheets.
October 29, 2024: (4PM) – Francis Corson, Ecole Normale Superieure
Spatial And Temporal Order In The Developing Drosophila Eye.
Host: Eric Siggia
There are many instances in development where a regular arrangement of cell fates self-organizes through cell-cell interactions, yet the dynamics by which these patterns arise, and the underlying logic, often remain elusive. In the developing Drosophila eye, regular rows of light-receiving units emerge in the wake of a traveling differentiation front to form a crystal-like array. The propagation of this pattern is thought to proceed by templating, with inhibitory signaling from each row providing a negative template for the next, but its dynamics had not been directly observed. Live imaging reveals unanticipated oscillations of the proneural factor Atonal, associated with pulses of Notch signaling activity. Our observations inform a new relay model for eye patterning, in which dynamic signaling from row n triggers differentiation at row n+2, conveying both spatial and temporal information to propagate crystal-like order.
November 12, 2024: (4PM) – Wilten Nicola, University of Calgary
Learning And Memory In Stress Circuits.
Host: Merav Stern
The hypothalamic stress response is initiated by the corticotropin-releasing hormone (CRH) neurons of the paraventricular nucleus (PVN-CRH). As these neurons act as the final neural controller of the stress response, they are uniquely suited to adapting their responses to novel information in potentially stressful environments. Here, with both computational modelling and in vivo one-photon/miniscope recordings of PVN-CRH neurons, we show that these neurons change their tuning properties in novel environments via simple supervised learning rules in the absence or presence of threats or rewards. These changes persist across days and can be induced with only a single exposure to an environment paired with either an aversive stimulus (foot shock) or a reward (Nutella). This work was performed in collaboration with the lab of Jaideep Bains [1].
[1] Füzesi, T., Rasiah, N.P., Rosenegger, D.G., Rojas-Carvajal, M., Chomiak, T., Daviu, N., Molina, L.A., Simone, K., Sterley, T.L., Nicola, W. and Bains, J.S., 2023. Hypothalamic CRH neurons represent physiological memory of positive and negative experience. Nature Communications, 14(1), p.8522.
December 10, 2024: (4PM) – Anne Churchland, University of California, Los Angeles
Movements And Engagement During Perceptual Decision-Making.
Host: Merav Stern
Switching between cognitive states is a natural tendency, even for trained experts. To test how cognitive state impacts the relationship between neural activity and behavior, we measured cortex-wide neural activity during decision-making in mice. Task variables and instructed movements elicited similar neural responses regardless of state, but the neural activity associated with spontaneous, uninstructed movements became highly variable during disengagement. Surprisingly, this heightened variability was not due to an increase in movements: behavioral videos showed equally frequent movements in both cognitive states. But while the movement frequency remained similar, movement timing changed: as animals slipped into disengagement, their movements became erratically timed. These idiosyncratic movements were a strong predictor of task performance and drove the increased variance that we observed in the neural activity. Taken together, our results argue that the temporal structure of movement patterns constitutes an embodied signature of cognitive state with profound impacts on neural activity.
December 17, 2024: (4PM) – Giancarlo La Camera, Stony Brook University
Neural Mechanisms of Strategy-dependent Decision-making in the Prefrontal Cortex.
Host: Merav Stern
The ability to make decisions according to context is a hallmark of intelligent behavior. The prefrontal cortex (PFC) is known for processing contextual information, but many questions remain open. This is especially the case for “strategic” behavior, where the context follows from abstract rules rather than dedicated input cues. In this work, we investigate the neural basis of two strategies, called the “repeat-stay” and “change-shift” strategies, respectively. These strategies have been observed in monkeys performing certain types of context-dependent tasks; in the task studied here, one of three targets is chosen based on an instruction stimulus and the outcome of previous trials. The same stimulus may instruct different decisions and the same decision may result from different stimuli, requiring the ability to develop strategic rules that span multiple trials. We found that PFC activity makes sharp transitions across latent neural states encoding task variables such as strategy, decision, action, reward, and previous-trial decisions. We compared two models able to perform the same task: a recurrent neural network (RNN) trained via backpropagation through time, and a multi-modular spiking network (MMSN) containing realistic ingredients of real cortical networks. Both models successfully attain levels of performance comparable to the monkeys’; however, the RNN seems to learn specific combinations of task conditions while the MMSN adopts the abstract strategies. The MMSN also reproduces the sequence of sharp transients observed in the PFC data, and explains some behavioral errors as the consequence of temporally misplaced transitions. In summary, the spiking network’s modular architecture suggests possible mechanisms for storing information across trials and subserving strategic behavior in complex tasks.

Contact Us

Center for Studies in Physics and Biology
The Rockefeller University
1230 York Avenue
New York, NY 10065