Deep neural networks using a single neuron: folded-in-time architecture using feedback-modulated delay loops
Deep neural networks are among the most widely applied machine learning tools showing
outstanding performance in a broad range of tasks. We present a method for folding a deep …
Fully decoupled neural network learning using delayed gradients
Training neural networks with backpropagation (BP) requires a sequential passing of
activations and gradients. This has been recognized as the lockings (i.e., the forward …
CORNN: Convex optimization of recurrent neural networks for rapid inference of neural dynamics
Advances in optical and electrophysiological recording technologies have made it possible
to record the dynamics of thousands of neurons, opening up new possibilities for interpreting …
Dissecting Neural ODEs
Continuous deep learning architectures have recently re-emerged as Neural Ordinary
Differential Equations (Neural ODEs). This infinite-depth approach theoretically bridges the …
Neural delay differential equations
Neural Ordinary Differential Equations (NODEs), a framework of continuous-depth neural
networks, have been widely applied, showing exceptional efficacy in coping with some …
SubFlow: A dynamic induced-subgraph strategy toward real-time DNN inference and training
We introduce SubFlow, a dynamic adaptation and execution strategy for a deep neural
network (DNN), which enables real-time DNN inference and training. The goal of SubFlow is …
Dynamic neural networks: A survey
Dynamic neural networks are an emerging research topic in deep learning. Compared to static
models, which have fixed computational graphs and parameters at the inference stage …
Opening the black box: low-dimensional dynamics in high-dimensional recurrent neural networks
D Sussillo, O Barak - Neural computation, 2013 - direct.mit.edu
Recurrent neural networks (RNNs) are useful tools for learning nonlinear relationships
between time-varying inputs and outputs with complex temporal dependencies. Recently …
TDSNN: From deep neural networks to deep spike neural networks with temporal-coding
Continuous-valued deep convolutional networks (DNNs) can be converted into accurate
rate-coding based spike neural networks (SNNs). However, the substantial computational …
Evolving artificial neural networks with feedback
S Herzog, C Tetzlaff, F Wörgötter - Neural Networks, 2020 - Elsevier
Neural networks in the brain are dominated by feedback connections, which sometimes make
up more than 60% of all connections and most often have small synaptic weights. Different from this, little is known …