A high-bias, low-variance introduction to machine learning for physicists

P Mehta, M Bukov, CH Wang, AGR Day, C Richardson… - Physics Reports, 2019 - Elsevier
Machine Learning (ML) is one of the most exciting and dynamic areas of modern
research and application. The purpose of this review is to provide an introduction to the core …

Statistical mechanics of deep learning

Y Bahri, J Kadmon, J Pennington… - Annual Review of …, 2020 - annualreviews.org
The recent striking success of deep neural networks in machine learning raises profound
questions about the theoretical principles underlying their success. For example, what can …

Beyond neural scaling laws: beating power law scaling via data pruning

B Sorscher, R Geirhos, S Shekhar… - Advances in …, 2022 - proceedings.neurips.cc
Widely observed neural scaling laws, in which error falls off as a power of the training set
size, model size, or both, have driven substantial performance improvements in deep …
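
As a rough, hypothetical illustration of the power-law form the snippet refers to (not code from the paper), a scaling exponent can be estimated by a linear fit in log-log space; the dataset sizes, errors, and exponent below are made up:

```python
# Minimal sketch (not from the paper): fit the power-law form E(N) ~ a * N**(-alpha)
# that neural scaling laws describe, using synthetic test-error "measurements".
import numpy as np

rng = np.random.default_rng(0)
train_sizes = np.array([1e3, 3e3, 1e4, 3e4, 1e5])                 # hypothetical dataset sizes
test_error = 2.0 * train_sizes ** -0.35 + rng.normal(0, 1e-3, 5)  # synthetic measurements

# Linear fit in log-log space: log E = log a - alpha * log N
slope, intercept = np.polyfit(np.log(train_sizes), np.log(test_error), 1)
print(f"fitted exponent alpha ~ {-slope:.2f}, prefactor a ~ {np.exp(intercept):.2f}")
```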

High-dimensional dynamics of generalization error in neural networks

MS Advani, AM Saxe, H Sompolinsky - Neural Networks, 2020 - Elsevier
We perform an analysis of the average generalization dynamics of large neural networks
trained using gradient descent. We study the practically-relevant “high-dimensional” regime …

Spectrum dependent learning curves in kernel regression and wide neural networks

B Bordelon, A Canatar… - … Conference on Machine …, 2020 - proceedings.mlr.press
We derive analytical expressions for the generalization performance of kernel regression as
a function of the number of training samples using theoretical methods from Gaussian …
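
As an empirical counterpart to the analytical learning curves described in the snippet (this is not the paper's Gaussian-process calculation), one could simply measure kernel ridge regression test error at increasing sample sizes; the target function, kernel width, and ridge strength below are arbitrary choices:

```python
# Minimal sketch: empirical learning curve for kernel ridge regression,
# i.e. test error as a function of the number of training samples n.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
d = 10
w = rng.standard_normal(d)

def target(X):
    """Hypothetical smooth target function."""
    return np.sin(X @ w)

X_test = rng.standard_normal((500, d))
y_test = target(X_test)

for n in [20, 50, 100, 200, 500]:
    X_train = rng.standard_normal((n, d))
    model = KernelRidge(alpha=1e-3, kernel="rbf", gamma=0.1).fit(X_train, target(X_train))
    mse = np.mean((model.predict(X_test) - y_test) ** 2)
    print(f"n = {n:4d}   test MSE = {mse:.4f}")
```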

Generalisation error in learning with random features and the hidden manifold model

F Gerace, B Loureiro, F Krzakala… - International …, 2020 - proceedings.mlr.press
We study generalised linear regression and classification for a synthetically generated
dataset encompassing different problems of interest, such as learning with random features …

Double trouble in double descent: Bias and variance(s) in the lazy regime

S d'Ascoli, M Refinetti, G Biroli… - … on Machine Learning, 2020 - proceedings.mlr.press
Deep neural networks can achieve remarkable generalization performances while
interpolating the training data. Rather than the U-curve emblematic of the bias-variance …
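
To make the interpolation threshold mentioned in the snippet concrete, here is a sketch (not the paper's bias/variance decomposition) of lazy-regime behaviour: ridge regression on fixed random ReLU features, sweeping the number of features p through p = n, where test error typically peaks before descending again; the teacher, widths, and tiny ridge are illustrative:

```python
# Minimal double-descent sketch: (almost) ridgeless regression on fixed random
# ReLU features; only the second layer is trained, mimicking the lazy regime.
import numpy as np

rng = np.random.default_rng(0)
n, d, n_test = 100, 20, 1000
w_star = rng.standard_normal(d) / np.sqrt(d)                  # hypothetical linear teacher

X, X_test = rng.standard_normal((n, d)), rng.standard_normal((n_test, d))
y, y_test = X @ w_star, X_test @ w_star

for p in [10, 50, 90, 100, 110, 200, 1000]:
    W = rng.standard_normal((d, p)) / np.sqrt(d)              # fixed random first layer
    F, F_test = np.maximum(X @ W, 0), np.maximum(X_test @ W, 0)
    a = np.linalg.solve(F.T @ F + 1e-6 * np.eye(p), F.T @ y)  # small ridge for stability
    mse = np.mean((F_test @ a - y_test) ** 2)
    print(f"p = {p:4d}   test MSE = {mse:.4f}")
```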

On simplicity and complexity in the brave new world of large-scale neuroscience

P Gao, S Ganguli - Current Opinion in Neurobiology, 2015 - Elsevier
Technological advances have dramatically expanded our ability to probe multi-neuronal
dynamics and connectivity in the brain. However, our ability to extract a simple conceptual …

Accurate estimation of neural population dynamics without spike sorting

EM Trautmann, SD Stavisky, S Lahiri, KC Ames… - Neuron, 2019 - cell.com
A central goal of systems neuroscience is to relate an organism's neural activity to behavior.
Neural population analyses often reduce the data dimensionality to focus on relevant activity …

Statistical mechanics of deep linear neural networks: The backpropagating kernel renormalization

Q Li, H Sompolinsky - Physical Review X, 2021 - APS
The groundbreaking success of deep learning in many real-world tasks has triggered an
intense effort to theoretically understand the power and limitations of deep learning in the …