Recurrent neural networks (RNNs) are widely used models in neuroscience whose architecture is inspired by the abundance of lateral connections in cortex. The basic computation in an RNN is a linear summation of inputs followed by a pointwise nonlinearity. Outside neuroscience, similar models are used to describe ecological, transcriptional, and other biological networks. Real biological networks, however, exhibit many cooperative effects, such as dendritic gating, neuromodulation, glia–neuron coupling, and transcription-factor dimerization. We ask how recurrent dynamics change when explicit three-body (quadratic) interactions are included. We introduce the Three-Body Recurrent Neural Network (TBRNN) and obtain four main results. First, we prove that TBRNNs are universal approximators for open dynamical systems. Second, we extend low-rank RNN theory to this setting, both to infer connectivity and to design functionality. Third, training on canonical neuroscience tasks reveals solution families that differ in their geometry from those found by standard RNNs, indicating that higher-order interactions reorganize the accessible dynamical regimes rather than merely re-parameterizing pairwise models. Fourth, we develop a practical model-comparison procedure that detects three-body signatures directly from trajectories, providing a way to infer when higher-order structure is present in recorded systems. Altogether, our work expands the space of dynamical models in neuroscience while also allowing other fields of biology to benefit from insights developed in neuroscience.
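To make the contrast concrete, here is a minimal NumPy sketch of the two update rules described above: a standard rate RNN (linear summation plus pointwise nonlinearity) and the same dynamics augmented with a three-body (quadratic) interaction term. The specific parameterization, the tensor `T`, the `tanh` nonlinearity, and the Euler step size are illustrative assumptions, not the talk's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10  # number of units (arbitrary for illustration)

# Pairwise connectivity, as in a standard RNN.
W = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))
# Assumed three-body tensor: T[i, j, k] couples units j and k onto unit i.
T = rng.normal(scale=1.0 / N, size=(N, N, N))

def rnn_step(x, u, dt=0.1):
    """Standard rate RNN: linear summation of inputs, pointwise nonlinearity."""
    return x + dt * (-x + np.tanh(W @ x + u))

def tbrnn_step(x, u, dt=0.1):
    """Same dynamics with an added quadratic (three-body) interaction term."""
    quadratic = np.einsum("ijk,j,k->i", T, x, x)  # sum_{j,k} T[i,j,k] x_j x_k
    return x + dt * (-x + np.tanh(W @ x + quadratic + u))

# Run both networks from the same initial condition with no external input.
x_rnn = x_tb = rng.normal(size=N)
u = np.zeros(N)
for _ in range(100):
    x_rnn = rnn_step(x_rnn, u)
    x_tb = tbrnn_step(x_tb, u)
```

The only change between the two models is the extra `einsum` term, which is one simple way to express how pairs of units can jointly gate a third, in the spirit of the cooperative effects listed in the abstract.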