Overview frequency principle/spectral bias in deep learning

ZQJ Xu, Y Zhang, T Luo - Communications on Applied Mathematics and …, 2024 - Springer
Understanding deep learning is increasingly important as it penetrates more and more into
industry and science. In recent years, a research line from Fourier analysis sheds light on …

A unifying tutorial on approximate message passing

OY Feng, R Venkataramanan, C Rush… - … and Trends® in …, 2022 - nowpublishers.com
Over the last decade or so, Approximate Message Passing (AMP) algorithms have become
extremely popular in various structured high-dimensional statistical problems. Although the …

Generalisation error in learning with random features and the hidden manifold model

F Gerace, B Loureiro, F Krzakala… - International …, 2020 - proceedings.mlr.press
We study generalised linear regression and classification for a synthetically generated
dataset encompassing different problems of interest, such as learning with random features …

Modeling the influence of data structure on learning in neural networks: The hidden manifold model

S Goldt, M Mézard, F Krzakala, L Zdeborová - Physical Review X, 2020 - APS
Understanding the reasons for the success of deep neural networks trained using stochastic
gradient-based methods is a key open problem for the nascent theory of deep learning. The …

Modern applications of machine learning in quantum sciences

A Dawid, J Arnold, B Requena, A Gresch… - arXiv preprint arXiv …, 2022 - arxiv.org
In these Lecture Notes, we provide a comprehensive introduction to the most recent
advances in the application of machine learning methods in quantum sciences. We cover …

The Gaussian equivalence of generative models for learning with shallow neural networks

S Goldt, B Loureiro, G Reeves… - Mathematical and …, 2022 - proceedings.mlr.press
Understanding the impact of data structure on the computational tractability of learning is a
key challenge for the theory of neural networks. Many theoretical works do not explicitly …

Bayes-optimal learning of deep random networks of extensive-width

H Cui, F Krzakala, L Zdeborová - … Conference on Machine …, 2023 - proceedings.mlr.press
We consider the problem of learning a target function corresponding to a deep, extensive-
width, non-linear neural network with random Gaussian weights. We consider the asymptotic …

Continual learning in the teacher-student setup: Impact of task similarity

S Lee, S Goldt, A Saxe - International Conference on …, 2021 - proceedings.mlr.press
Continual learning, the ability to learn many tasks in sequence, is critical for artificial
learning systems. Yet standard training methods for deep networks often suffer from …

Dynamics of stochastic gradient descent for two-layer neural networks in the teacher-student setup

S Goldt, M Advani, AM Saxe… - Advances in neural …, 2019 - proceedings.neurips.cc
Deep neural networks achieve stellar generalisation even when they have enough
parameters to easily fit all their training data. We study this phenomenon by analysing the …

Learning Gaussian mixtures with generalized linear models: Precise asymptotics in high-dimensions

B Loureiro, G Sicuro, C Gerbelot… - Advances in …, 2021 - proceedings.neurips.cc
Generalised linear models for multi-class classification problems are one of the fundamental
building blocks of modern machine learning tasks. In this manuscript, we characterise the …