Deep neural networks using a single neuron: folded-in-time architecture using feedback-modulated delay loops

F Stelzer, A Röhm, R Vicente, I Fischer… - Nature …, 2021 - nature.com
Deep neural networks are among the most widely applied machine learning tools showing
outstanding performance in a broad range of tasks. We present a method for folding a deep …

Fully decoupled neural network learning using delayed gradients

H Zhuang, Y Wang, Q Liu, Z Lin - IEEE transactions on neural …, 2021 - ieeexplore.ieee.org
Training neural networks with backpropagation (BP) requires a sequential passing of
activations and gradients. This has been recognized as the lockings (i.e., the forward …

CORNN: Convex optimization of recurrent neural networks for rapid inference of neural dynamics

F Dinc, A Shai, M Schnitzer… - Advances in Neural …, 2023 - proceedings.neurips.cc
Advances in optical and electrophysiological recording technologies have made it possible
to record the dynamics of thousands of neurons, opening up new possibilities for interpreting …

Dissecting Neural ODEs

S Massaroli, M Poli, J Park… - Advances in Neural …, 2020 - proceedings.neurips.cc
Continuous deep learning architectures have recently re-emerged as Neural Ordinary
Differential Equations (Neural ODEs). This infinite-depth approach theoretically bridges the …

Neural delay differential equations

Q Zhu, Y Guo, W Lin - arXiv preprint arXiv:2102.10801, 2021 - arxiv.org
Neural Ordinary Differential Equations (NODEs), a framework of continuous-depth neural
networks, have been widely applied, showing exceptional efficacy in coping with some …

SubFlow: A dynamic induced-subgraph strategy toward real-time DNN inference and training

S Lee, S Nirjon - 2020 IEEE Real-Time and Embedded …, 2020 - ieeexplore.ieee.org
We introduce SubFlow, a dynamic adaptation and execution strategy for a deep neural
network (DNN), which enables real-time DNN inference and training. The goal of SubFlow is …

Dynamic neural networks: A survey

Y Han, G Huang, S Song, L Yang… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Dynamic neural networks are an emerging research topic in deep learning. Compared to static
models, which have fixed computational graphs and parameters at the inference stage …

Opening the black box: low-dimensional dynamics in high-dimensional recurrent neural networks

D Sussillo, O Barak - Neural computation, 2013 - direct.mit.edu
Recurrent neural networks (RNNs) are useful tools for learning nonlinear relationships
between time-varying inputs and outputs with complex temporal dependencies. Recently …

TDSNN: From deep neural networks to deep spike neural networks with temporal-coding

L Zhang, S Zhou, T Zhi, Z Du, Y Chen - … of the AAAI conference on artificial …, 2019 - aaai.org
Continuous-valued deep convolutional neural networks (DNNs) can be converted into accurate
rate-coding based spike neural networks (SNNs). However, the substantial computational …

Evolving artificial neural networks with feedback

S Herzog, C Tetzlaff, F Wörgötter - Neural Networks, 2020 - Elsevier
Neural networks in the brain are dominated by feedback connections, sometimes comprising more than
60% of all connections, which most often have small synaptic weights. Different from this, little is known …