Brain-inspired learning in artificial neural networks: a review

S Schmidgall, R Ziaei, J Achterberg, L Kirsch… - APL Machine …, 2024 - pubs.aip.org
Artificial neural networks (ANNs) have emerged as an essential tool in machine learning,
achieving remarkable success across diverse domains, including image and speech …

How connectivity structure shapes rich and lazy learning in neural circuits

YH Liu, A Baratin, J Cornford, S Mihalas… - arXiv preprint arXiv …, 2023 - arxiv.org
In theoretical neuroscience, recent work leverages deep learning tools to explore how certain attributes of a network critically influence its learning dynamics. Notably, initial weight …
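The rich-versus-lazy distinction this entry refers to is often made concrete through an output-scaling argument: multiplying a network's initialization-centered output by a large factor alpha, with the step size rescaled by 1/alpha^2, keeps the weights close to their initial values (lazy, kernel-like learning), while a small alpha lets them move substantially (rich, feature learning). The sketch below is a minimal NumPy illustration of that contrast on a toy regression task; it is not code from the cited paper, and the width, learning rate, and alpha values are arbitrary choices for illustration.

```python
# Minimal sketch (assumptions, not code from the cited paper): contrasting
# "lazy" and "rich" learning via output scaling. Scaling the
# initialization-centered output by alpha and the step size by 1/alpha**2
# keeps weights near their initialization for large alpha (lazy regime),
# while small alpha lets them move substantially (rich regime).
import numpy as np

# Toy 1-D regression data.
X = np.linspace(-1.0, 1.0, 64)[:, None]      # shape (64, 1)
y = np.sin(3.0 * X[:, 0])                    # shape (64,)

def relative_weight_change(alpha, width=256, steps=1000, lr=0.2, seed=0):
    """Train a 2-layer ReLU net with output scaling alpha and return
    ||W1 - W1_init|| / ||W1_init|| for the hidden-layer weights."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 1.0, size=(1, width))
    b1 = np.zeros(width)
    W2 = rng.normal(0.0, 1.0 / np.sqrt(width), size=(width,))
    W1_init = W1.copy()

    # Freeze the output at initialization so the model is
    # f(x) = alpha * (net(x) - net_init(x)), the standard lazy-training setup.
    f0 = np.maximum(X @ W1 + b1, 0.0) @ W2
    eta = lr / alpha**2                       # rescaled step size
    n = len(X)

    for _ in range(steps):
        h = np.maximum(X @ W1 + b1, 0.0)      # hidden activations, (64, width)
        f = alpha * (h @ W2 - f0)             # scaled, centered output
        err = f - y                           # residuals for the MSE loss
        dW2 = alpha * h.T @ err / n
        dh = alpha * np.outer(err, W2) * (h > 0.0)
        dW1 = X.T @ dh / n
        db1 = dh.sum(axis=0) / n
        W1 -= eta * dW1
        b1 -= eta * db1
        W2 -= eta * dW2

    return np.linalg.norm(W1 - W1_init) / np.linalg.norm(W1_init)

for alpha in (1.0, 10.0, 100.0):
    change = relative_weight_change(alpha)
    print(f"alpha = {alpha:>5}: relative hidden-weight change = {change:.5f}")
```

Under this construction the reported relative change in the hidden-layer weights shrinks roughly as 1/alpha for large alpha, which is the signature of the lazy regime.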

Biologically-plausible backpropagation through arbitrary timespans via local neuromodulators

YH Liu, S Smith, S Mihalas… - Advances in Neural …, 2022 - proceedings.neurips.cc
The spectacular successes of recurrent neural network models where key parameters are
adjusted via backpropagation-based gradient descent have inspired much thought as to …

Representational drift as a result of implicit regularization

A Ratzon, D Derdikman, O Barak - eLife, 2024 - elifesciences.org
Recent studies show that, even in constant environments, the tuning of single neurons
changes over time in a variety of brain regions. This representational drift has been …

Evolutionary algorithms as an alternative to backpropagation for supervised training of Biophysical Neural Networks and Neural ODEs

J Hazelden, YH Liu, E Shlizerman… - arXiv preprint arXiv …, 2023 - arxiv.org
Training networks consisting of biophysically accurate neuron models could allow for new
insights into how brain circuits can organize and solve tasks. We begin by analyzing the …

Transition to chaos separates learning regimes and relates to measure of consciousness in recurrent neural networks

D Mastrovito, YH Liu, L Kusmierz, E Shea-Brown… - bioRxiv, 2024 - ncbi.nlm.nih.gov
Recurrent neural networks exhibit chaotic dynamics when the variance in their connection strengths exceeds a critical value. Recent work indicates connection variance also modulates …
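The transition referenced here goes back to classical results on random rate networks: with recurrent weights drawn i.i.d. with variance g^2/N, activity decays to a fixed point for gain g < 1 and becomes chaotic for g > 1. The following is a minimal sketch of that transition (not the cited paper's model or its consciousness-related measures), checking whether two nearby trajectories diverge; the network size, gains, and integration settings are arbitrary choices for illustration.

```python
# Minimal sketch (assumptions, not the cited paper's code): the classic
# transition to chaos in a random rate network dx/dt = -x + J @ tanh(x),
# with J_ij ~ N(0, g^2 / N). For g < 1 activity decays to a fixed point;
# for g > 1 nearby trajectories diverge (chaotic regime).
import numpy as np

def trajectory_divergence(g, n=500, t_max=50.0, dt=0.05, eps=1e-6, seed=0):
    """Simulate two nearby trajectories and return the log of their final
    separation relative to the (approximate) initial perturbation size.
    Positive values indicate divergence, i.e. chaos."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(n), size=(n, n))    # variance g^2 / n
    x1 = rng.normal(0.0, 1.0, size=n)
    x2 = x1 + eps * rng.normal(0.0, 1.0, size=n)         # tiny perturbation

    for _ in range(int(t_max / dt)):
        x1 = x1 + dt * (-x1 + J @ np.tanh(x1))           # Euler integration
        x2 = x2 + dt * (-x2 + J @ np.tanh(x2))

    sep = np.linalg.norm(x2 - x1)
    return np.log(sep / (eps * np.sqrt(n)))

for g in (0.5, 0.9, 1.5, 2.5):
    print(f"g = {g:.1f}: log separation growth = {trajectory_divergence(g):+.2f}")
```

For gains below 1 the logged separation growth is negative (the perturbation shrinks as activity decays), while above 1 it turns positive, marking the chaotic regime.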

How Initial Connectivity Shapes Biologically Plausible Learning in Recurrent Neural Networks

W Liu, X Zhang, YH Liu - arXiv preprint arXiv:2410.11164, 2024 - arxiv.org
The impact of initial connectivity on learning has been extensively studied in the context of
backpropagation-based gradient descent, but it remains largely underexplored in …

Deep learning frameworks for modeling how neural circuits learn

YH Liu - 2024 - search.proquest.com
The brain's prowess in learning and adapting remains an enigma, particularly in its approach to the 'temporal credit assignment' problem. How do neural circuits determine …

Feedback control guides credit assignment in recurrent neural networks

K Kaleb, B Feulner, JA Gallego, C Clopath - The Thirty-eighth Annual … - openreview.net
How do brain circuits learn to generate behaviour? While significant strides have been
made in understanding learning in artificial neural networks, applying this knowledge to …