Abstract
Neural computations can be framed as dynamical processes, whereby the structure of the dynamics within a neural network is a direct reflection of the computations that the network performs. A key step in generating mechanistic interpretations within this computation-through-dynamics framework is to establish the link between network connectivity, dynamics, and computation. This link is only partly understood. Recent work has focused on producing algorithms for engineering artificial recurrent neural networks (RNNs) with dynamics targeted to a specific goal manifold. Some of these algorithms require only a set of vectors tangent to the target manifold to be computed, and thus provide a general method that can be applied to a diverse set of problems. Nevertheless, computing such vectors for an arbitrary manifold in a high-dimensional state space remains highly challenging, which in practice limits the applicability of this approach. Here we demonstrate how topology and differential geometry can be leveraged to simplify this task, by first computing tangent vectors on a low-dimensional topological manifold and then embedding these in state space. The simplicity of this procedure greatly facilitates the creation of manifold-targeted RNNs, as well as the process of designing task-solving on-manifold dynamics. This new method should enable the application of network-engineering approaches to a wide set of problems in neuroscience and machine learning. Furthermore, our account of how fundamental concepts from differential geometry map onto different aspects of neural dynamics further demonstrates how the language of differential geometry can enrich the conceptual framework for describing neural dynamics and computation.
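To make the core construction concrete, the following minimal sketch (our own illustration, not code from the paper) shows the general idea for the simplest case: tangent vectors are computed on a one-dimensional ring manifold parameterized by an angle, and a smooth embedding map carries both the points and their tangent vectors into a high-dimensional state space via the embedding's Jacobian (the pushforward). The embedding map phi, the drift rate omega, and all variable names are illustrative assumptions, not the paper's notation.

```python
import numpy as np

# Illustrative example: tangent vectors on a 1-D ring manifold,
# pushed forward into an N-dimensional state space.
N = 100                      # state-space dimension
rng = np.random.default_rng(0)

# Smooth embedding phi: theta -> R^N, here a random linear image of the
# unit circle (an arbitrary choice; any smooth embedding would do).
A = rng.normal(size=(N, 2)) / np.sqrt(N)

def phi(theta):
    """Embed the ring point theta into state space."""
    return A @ np.array([np.cos(theta), np.sin(theta)])

def phi_jacobian(theta):
    """d(phi)/d(theta): pushforward of the intrinsic tangent d/dtheta."""
    return A @ np.array([-np.sin(theta), np.cos(theta)])

# On-manifold dynamics are designed in intrinsic coordinates, e.g. a
# constant drift dtheta/dt = omega; the corresponding state-space
# velocity vector is the Jacobian times the intrinsic velocity.
omega = 0.5
thetas = np.linspace(0.0, 2 * np.pi, 32, endpoint=False)
points = np.stack([phi(t) for t in thetas])                      # points on the embedded ring
tangents = np.stack([phi_jacobian(t) * omega for t in thetas])   # target velocities at those points

print(points.shape, tangents.shape)  # (32, 100) (32, 100)
```

The resulting (point, tangent vector) pairs are exactly the kind of training data that manifold-targeting RNN engineering methods consume; the benefit of this route is that the hard geometric work happens in the one-dimensional intrinsic coordinate, not in the 100-dimensional state space.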