A systematic review of graph neural network in healthcare-based applications: Recent advances, trends, and future directions
Graph neural network (GNN) is a formidable deep learning framework that enables the
analysis and modeling of intricate relationships present in data structured as graphs. In …
Neural networks trained with SGD learn distributions of increasing complexity
The uncanny ability of over-parameterised neural networks to generalise well has been
explained using various "simplicity biases". These theories postulate that neural networks …
Are Gaussian data all you need? The extents and limits of universality in high-dimensional generalized linear estimation
In this manuscript we consider the problem of generalized linear estimation on Gaussian
mixture data with labels given by a single-index model. Our first result is a sharp asymptotic …
Universality laws for gaussian mixtures in generalized linear models
A recent line of work in high-dimensional statistics working under the Gaussian mixture
hypothesis has led to a number of results in the context of empirical risk minimization …
Graph-based approximate message passing iterations
C Gerbelot, R Berthier - Information and Inference: A Journal of …, 2023 - academic.oup.com
Approximate message passing (AMP) algorithms have become an important element of high-
dimensional statistical inference, mostly due to their adaptability and concentration …
High-dimensional asymptotics of denoising autoencoders
H Cui, L Zdeborová - Advances in Neural Information …, 2024 - proceedings.neurips.cc
We address the problem of denoising data from a Gaussian mixture using a two-layer non-
linear autoencoder with tied weights and a skip connection. We consider the high …
Fluctuations, bias, variance & ensemble of learners: Exact asymptotics for convex losses in high-dimension
From the sampling of data to the initialisation of parameters, randomness is ubiquitous in
modern Machine Learning practice. Understanding the statistical fluctuations engendered …
On double-descent in uncertainty quantification in overparametrized models
Uncertainty quantification is a central challenge in reliable and trustworthy machine
learning. Naive measures such as last-layer scores are well-known to yield overconfident …
Asymptotics of feature learning in two-layer networks after one gradient-step
In this manuscript we investigate the problem of how two-layer neural networks learn
features from data, and improve over the kernel regime, after being trained with a single …
Multinomial logistic regression: Asymptotic normality on null covariates in high-dimensions
This paper investigates the asymptotic distribution of the maximum-likelihood estimate
(MLE) in multinomial logistic models in the high-dimensional regime where dimension and …