Opening the black box: low-dimensional dynamics in high-dimensional recurrent neural networks

D Sussillo, O Barak - Neural computation, 2013 - direct.mit.edu
Recurrent neural networks (RNNs) are useful tools for learning nonlinear relationships
between time-varying inputs and outputs with complex temporal dependencies. Recently …
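
The paper's core technique is to find approximate fixed points of a trained network by numerically minimizing the scalar speed function q(x) = ½‖F(x)‖², where ẋ = F(x) is the RNN's autonomous dynamics; local minima with q ≈ 0 are fixed (or slow) points around which the dynamics can be linearized. A minimal sketch of that procedure (the random tanh network and all parameters here are illustrative, not from the paper's experiments):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N = 50
J = rng.normal(0, 1.2 / np.sqrt(N), (N, N))  # illustrative recurrent weights

def F(x):
    """Autonomous RNN dynamics: x_dot = F(x) = -x + J @ tanh(x)."""
    return -x + J @ np.tanh(x)

def q(x):
    """Speed function q(x) = 0.5 * ||F(x)||^2; q ~ 0 marks a fixed point."""
    return 0.5 * np.sum(F(x) ** 2)

# Minimize q starting from states sampled along trajectories; each low-q
# minimum is a candidate point at which to linearize the dynamics.
res = minimize(q, rng.normal(0, 1, N), method="L-BFGS-B")
print("candidate point found, q =", res.fun)
```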

Teaching recurrent neural networks to infer global temporal structure from local examples

JZ Kim, Z Lu, E Nozari, GJ Pappas… - Nature Machine …, 2021 - nature.com
The ability to store and manipulate information is a hallmark of computational systems.
Whereas computers are carefully engineered to represent and perform mathematical …
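
The trained systems in this line of work are reservoir computers, asked to extrapolate the global structure of a dynamical system from local example trajectories. Purely for orientation, here is a generic echo-state sketch of that model class (the driving signal, sizes, and hyperparameters are stand-ins, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 300, 2000
Win = rng.uniform(-0.5, 0.5, (N, 1))
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

# Drive the reservoir with a training signal and collect states.
u = np.sin(0.05 * np.arange(T))[:, None]  # stand-in for a local example
x = np.zeros(N); X = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + Win @ u[t])
    X[t] = x

# Ridge-regression readout trained for one-step prediction.
Y, Xr = u[1:], X[:-1]
Wout = np.linalg.solve(Xr.T @ Xr + 1e-6 * np.eye(N), Xr.T @ Y)

# Close the loop: feed the prediction back so the reservoir runs autonomously.
for _ in range(100):
    y = x @ Wout
    x = np.tanh(W @ x + Win @ y)
```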

On the difficulty of learning chaotic dynamics with RNNs

J Mikhaeil, Z Monfared… - Advances in Neural …, 2022 - proceedings.neurips.cc
Recurrent neural networks (RNNs) are widespread machine learning tools for modeling
sequential and time series data. They are notoriously hard to train because their loss …
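
The training difficulty the abstract alludes to has a concrete dynamical cause: for a chaotic target system, the Jacobian products that appear in backpropagation through time grow like exp(λ_max·T), with λ_max the maximal Lyapunov exponent, so long-horizon loss gradients explode. A toy illustration with the logistic map, whose Lyapunov exponent is ln 2:

```python
import numpy as np

# For a chaotic map x' = 4x(1 - x), the one-step derivative is 4 - 8x, and
# the product of these derivatives (the BPTT Jacobian chain) grows ~ 2**T.
x, prod = 0.3, 1.0
for T in range(1, 61):
    prod *= abs(4 - 8 * x)      # |d x_{t+1} / d x_t|
    x = 4 * x * (1 - x)
    if T % 20 == 0:
        print(f"T={T:2d}  |dx_T/dx_0| ~ {prod:.3e}  (2**T = {2.0**T:.3e})")
```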

Approximation and optimization theory for linear continuous-time recurrent neural networks

Z Li, J Han, E Weinan, Q Li - Journal of Machine Learning Research, 2022 - jmlr.org
We perform a systematic study of the approximation properties and optimization dynamics of
recurrent neural networks (RNNs) when applied to learn input-output relationships in …
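
The object of study is the linear continuous-time RNN. In generic notation (a sketch; the paper's symbols may differ), the hidden state and readout are

\[
\frac{\mathrm{d}h(t)}{\mathrm{d}t} = W\,h(t) + U\,x(t), \qquad \hat{y}(t) = c^{\top} h(t), \qquad h(0) = 0,
\]

so the network realizes the convolution \( \hat{y}(t) = \int_0^t c^{\top} e^{W(t-s)}\, U\, x(s)\, \mathrm{d}s \), and approximating a target input-output relationship reduces to approximating its memory kernel by functions of the form \( c^{\top} e^{W s} U \), which decay exponentially when \( W \) is stable.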

Neural circuits as computational dynamical systems

D Sussillo - Current opinion in neurobiology, 2014 - Elsevier
Highlights: Many cortical circuits can be viewed as computational dynamical systems. A new
tool to help us understand cortical dynamics is the optimized recurrent neural network …

Gradient-based learning drives robust representations in recurrent neural networks by balancing compression and expansion

M Farrell, S Recanatesi, T Moore, G Lajoie… - Nature Machine …, 2022 - nature.com
Neural networks need the right representations of input data to learn. Here we ask how
gradient-based learning shapes a fundamental property of representations in recurrent …
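
Compression and expansion of representations can be quantified by an effective dimensionality of the hidden states; the participation ratio of the covariance spectrum is one common such measure (used here purely as an illustration, not necessarily the paper's exact metric):

```python
import numpy as np

def participation_ratio(H):
    """Effective dimensionality of hidden states H (time x units):
    PR = (sum_i lam_i)**2 / sum_i lam_i**2 over covariance eigenvalues."""
    lam = np.linalg.eigvalsh(np.cov(H.T))
    return lam.sum() ** 2 / (lam ** 2).sum()

# Close to the unit count for isotropic states; drops as representations
# compress onto fewer dimensions.
H = np.random.default_rng(2).normal(size=(1000, 128))  # stand-in states
print(participation_ratio(H))
```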

Coupled oscillatory recurrent neural network (coRNN): An accurate and (gradient) stable architecture for learning long time dependencies

TK Rusch, S Mishra - arXiv preprint arXiv:2010.00951, 2020 - arxiv.org
Circuits of biological neurons, such as in the functional parts of the brain, can be modeled as
networks of coupled oscillators. Inspired by the ability of these systems to express a rich set …
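
coRNN discretizes a network of coupled, damped, driven oscillators, y'' = σ(W y + W2 y' + V u + b) − γ y − ε y'. A sketch of one step, assuming the implicit-explicit (IMEX) discretization described in the paper (sizes and constants here are illustrative):

```python
import numpy as np

def cornn_step(y, z, u, W, W2, V, b, dt=0.01, gamma=1.0, eps=1.0):
    """One coRNN step for hidden state y and its velocity z = y'.
    The damping term eps*z is treated implicitly, hence the 1/(1 + dt*eps)."""
    z = (z + dt * (np.tanh(W @ y + W2 @ z + V @ u + b) - gamma * y)) / (1 + dt * eps)
    y = y + dt * z
    return y, z
```

Keeping the eigenvalues of the linearized update near the unit circle is what gives the architecture its gradient stability over long time horizons.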

full-FORCE: A target-based method for training recurrent networks

B DePasquale, CJ Cueva, K Rajan, GS Escola… - PloS one, 2018 - journals.plos.org
Trained recurrent networks are powerful tools for modeling dynamic neural computations.
We present a target-based method for modifying the full connectivity matrix of a recurrent …
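
The target-based idea is to drive a separate target-generating network with the desired output signal, record the summed current each unit receives, and then fit the learner's full recurrent matrix so its units receive matching currents. A simplified sketch (batch ridge regression here in place of the paper's recursive least squares; the toy signals and sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
N, T = 100, 500
J_D = rng.normal(0, 1.5 / np.sqrt(N), (N, N))      # target-generating network
w_in = rng.normal(0, 1, (N, 1)); u_in = rng.normal(0, 1, (T, 1))
w_fb = rng.normal(0, 1, (N, 1)); f_out = np.sin(0.1 * np.arange(T))[:, None]

# Run the driven network: it is handed the task output f(t) through w_fb,
# and J_D r + w_fb f is recorded as the per-unit target current.
x = rng.normal(0, 0.5, N)
R, F_tgt = np.zeros((T, N)), np.zeros((T, N))
for t in range(T):
    r = np.tanh(x); R[t] = r
    F_tgt[t] = J_D @ r + w_fb @ f_out[t]
    x += 0.1 * (-x + F_tgt[t] + w_in @ u_in[t])

# Fit the learner's full recurrent matrix J so that J @ r(t) matches the
# recorded target currents (least squares over the whole run).
J = np.linalg.solve(R.T @ R + 1e-3 * np.eye(N), R.T @ F_tgt).T
```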

RNNs incrementally evolving on an equilibrium manifold: A panacea for vanishing and exploding gradients?

A Kag, Z Zhang, V Saligrama - International Conference on …, 2020 - openreview.net
Recurrent neural networks (RNNs) are particularly well-suited for modeling long-term
dependencies in sequential data, but are notoriously hard to train because the error …
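
The snippet is truncated before the method, but the title suggests the construction: hidden states are kept on (or near) an equilibrium manifold of an underlying ODE, sidestepping the exponential growth or decay of backpropagated errors. One plausible minimal reading, offered only as a sketch, relaxes the previous state a few damped fixed-point iterations toward the equilibrium h* = tanh(U h* + W x_t + b) for the current input:

```python
import numpy as np

def ernn_step(h, x, U, W, b, alpha=0.5, K=5):
    """Move the previous state toward the equilibrium of the current input
    via K damped fixed-point iterations (a hypothetical minimal reading of
    the equilibrium-manifold idea, not the paper's exact scheme)."""
    for _ in range(K):
        h = (1 - alpha) * h + alpha * np.tanh(U @ h + W @ x + b)
    return h
```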

AntisymmetricRNN: A dynamical system view on recurrent neural networks

B Chang, M Chen, E Haber, EH Chi - arXiv preprint arXiv:1902.09689, 2019 - arxiv.org
Recurrent neural networks have gained widespread use in modeling sequential data.
Learning long-term dependencies using these models remains difficult, though, due to …
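
The dynamical-system view in the title is concrete: the recurrent matrix is built as W − Wᵀ − γI, antisymmetric up to a small damping term, so the underlying ODE's Jacobian has eigenvalues near the imaginary axis and forward dynamics neither explode nor die out. A sketch of one forward-Euler step of this construction (step size and damping values illustrative):

```python
import numpy as np

def antisymmetric_rnn_step(h, x, W, V, b, eps=0.1, gamma=0.01):
    """Euler step h <- h + eps * tanh((W - W.T - gamma*I) @ h + V @ x + b).
    W - W.T is antisymmetric (purely imaginary spectrum); gamma adds slight
    damping so the discretized dynamics stay stable."""
    A = W - W.T - gamma * np.eye(W.shape[0])
    return h + eps * np.tanh(A @ h + V @ x + b)
```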