Event Detail (Archived)

Flexible Multitask Computation in Recurrent Networks Utilizes Shared Dynamical Motifs

Center for Studies in Physics and Biology

  • This event already took place in December 2023
  • Carson Family Auditorium (CRC)

Event Details

Type
Special Seminar Series
Speaker(s)
David Sussillo, Ph.D., adjunct professor, electrical engineering, Stanford University; senior research scientist, Meta
Speaker bio(s)

Flexible computation is a hallmark of intelligent behavior. Yet little is known about how neural networks contextually reconfigure for different computations. Humans are able to perform a new task without extensive training, presumably through the composition of elementary processes that were previously learned. Cognitive scientists have long hypothesized the possibility of a compositional neural code, in which complex neural computations are built from constituent components; however, the neural substrate underlying this structure remains elusive in both biological and artificial neural networks. Here we identified an algorithmic neural substrate for compositional computation through the study of multitasking artificial recurrent neural networks. Dynamical systems analyses of these networks revealed learned computational strategies that mirrored the modular subtask structure of the task set used for training. Dynamical motifs such as attractors, decision boundaries, and rotations were reused across different task computations. In summary, we present a conceptual framework that establishes dynamical motifs as a fundamental unit of computation, intermediate between the neuron and the network. As more whole-brain imaging studies record neural activity from multiple specialized systems simultaneously, the framework of dynamical motifs will guide questions about specialization and generalization across brain regions.
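The dynamical systems analyses mentioned in the abstract typically involve locating fixed points of a trained RNN's state-update function by minimizing the "speed" of the dynamics. The following is a minimal sketch of that idea, not the speaker's actual code: the weights here are random stand-ins for a trained network, and all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
# Random, small-gain recurrent weights standing in for trained parameters
W = 0.5 * rng.standard_normal((n, n)) / np.sqrt(n)
b = np.zeros(n)

def step(h):
    """One step of a vanilla tanh RNN with no input: F(h) = tanh(W h + b)."""
    return np.tanh(W @ h + b)

def speed(h):
    """Speed objective q(h) = 0.5 * ||F(h) - h||^2, which is zero exactly at a fixed point."""
    d = step(h) - h
    return 0.5 * d @ d

def speed_grad(h):
    """Analytic gradient of q: (diag(1 - F(h)^2) W - I)^T (F(h) - h)."""
    F = step(h)
    J = (1 - F**2)[:, None] * W - np.eye(n)  # Jacobian of F(h) - h
    return J.T @ (F - h)

# Gradient descent on q from a random initial state converges to a
# fixed (or very slow) point of the dynamics.
h = rng.standard_normal(n)
for _ in range(2000):
    h -= 0.1 * speed_grad(h)

print(speed(h))  # near zero: h is (approximately) a fixed point
```

Once such points are found, linearizing the dynamics around them (via the Jacobian of `step`) reveals the attractors, decision boundaries, and rotational structure that the abstract refers to as dynamical motifs.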

Open to
Tri-Institutional
