A survey on statistical theory of deep learning: Approximation, training dynamics, and generative models

N Suh, G Cheng - Annual Review of Statistics and Its Application, 2024 - annualreviews.org
In this article, we review the literature on statistical theories of neural networks from three
perspectives: approximation, training dynamics, and generative models. In the first part …

Score approximation, estimation and distribution recovery of diffusion models on low-dimensional data

M Chen, K Huang, T Zhao… - … Conference on Machine …, 2023 - proceedings.mlr.press
Diffusion models achieve state-of-the-art performance in various generation tasks. However,
their theoretical foundations fall far behind. This paper studies score approximation …
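
A minimal sketch of the denoising score matching objective that such score approximation results concern, assuming a PyTorch setup; the toy score network and the noise schedule below are illustrative placeholders, not the paper's construction.

import torch
import torch.nn as nn

# Toy score network: maps (noisy 2-D sample, time) to an estimated score.
score_net = nn.Sequential(nn.Linear(3, 64), nn.SiLU(), nn.Linear(64, 2))

def dsm_loss(x0, t):
    """Denoising score matching at noise level t in (0, 1].

    For the forward process x_t = a * x0 + s * eps with eps ~ N(0, I),
    the conditional score of x_t given x0 is -eps / s, which the network
    is trained to match in mean squared error.
    """
    a = torch.sqrt(1 - t)                        # signal scale (toy schedule)
    s = torch.sqrt(t)                            # noise scale
    eps = torch.randn_like(x0)
    xt = a * x0 + s * eps
    inp = torch.cat([xt, t.expand(len(x0), 1)], dim=1)
    return ((score_net(inp) - (-eps / s)) ** 2).mean()

x0 = torch.randn(128, 2)                         # stand-in data batch
t = torch.rand(1) * 0.99 + 0.01                  # avoid t = 0
print(dsm_loss(x0, t).item())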

How well generative adversarial networks learn distributions

T Liang - Journal of Machine Learning Research, 2021 - jmlr.org
This paper studies the rates of convergence for learning distributions implicitly with the
adversarial framework and Generative Adversarial Networks (GANs), which subsume …
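
The adversarial framework evaluates a fit through an integral probability metric, d_F(P, Q) = sup_{f in F} E_P[f] - E_Q[f], over a discriminator class F. A minimal PyTorch sketch of estimating this quantity from samples, with an illustrative bounded network standing in for F:

import torch
import torch.nn as nn

# Discriminator class F: a small network with tanh-bounded output.
f = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1), nn.Tanh())
opt = torch.optim.Adam(f.parameters(), lr=1e-2)

p = torch.randn(500, 1)                # samples from P
q = torch.randn(500, 1) + 0.5          # samples from Q (mean-shifted)

# Approximate sup_{f in F} E_P[f] - E_Q[f] by gradient ascent over F.
for _ in range(200):
    gap = f(p).mean() - f(q).mean()
    opt.zero_grad()
    (-gap).backward()                  # ascend the mean-discrepancy gap
    opt.step()

with torch.no_grad():
    print("estimated IPM:", (f(p).mean() - f(q).mean()).item())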

An error analysis of generative adversarial networks for learning distributions

J Huang, Y Jiao, Z Li, S Liu, Y Wang, Y Yang - Journal of Machine Learning …, 2022 - jmlr.org
This paper studies how well generative adversarial networks (GANs) learn probability
distributions from finite samples. Our main results establish the convergence rates of GANs …
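
For concreteness, a minimal GAN training loop on a finite sample, of the kind such convergence analyses study; the architectures, sample size, and optimizer settings are illustrative placeholders.

import torch
import torch.nn as nn

# Generator pushes low-dimensional noise forward; discriminator separates
# real from generated samples.
G = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
g_opt = torch.optim.Adam(G.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

real = 2 * torch.randn(256, 1) + 1     # finite sample from the target law
for _ in range(100):
    fake = G(torch.randn(256, 2))
    # Discriminator step: classify real vs. generated samples.
    d_loss = (bce(D(real), torch.ones(256, 1))
              + bce(D(fake.detach()), torch.zeros(256, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()
    # Generator step: make generated samples look real to D.
    g_loss = bce(D(fake), torch.ones(256, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

print("final losses:", d_loss.item(), g_loss.item())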

Statistical guarantees for generative models without domination

N Schreuder, VE Brunel… - Algorithmic Learning …, 2021 - proceedings.mlr.press
In this paper, we introduce a convenient framework for studying (adversarial) generative
models from a statistical perspective. It consists in modeling the generative device as a …
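
The framework evaluates a generative device, a map pushing low-dimensional noise forward, by a distance such as Wasserstein-1 between its output law and the truth. A small sketch of that empirical evaluation in one dimension; the map g is a hypothetical stand-in, not the paper's model:

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)

def g(u):                        # hypothetical smooth generative map
    return 2.0 * np.tanh(u)

target = rng.normal(size=2000)                  # samples from the truth
generated = g(rng.uniform(-3, 3, size=2000))    # push-forward samples

# Empirical one-dimensional Wasserstein-1 distance between the two laws.
print("empirical W1:", wasserstein_distance(target, generated))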

Deep conditional generative learning: Model and error analysis

J Chang, Z Ding, Y Jiao, R Li, JZ Yang - arXiv e-prints, 2024 - ui.adsabs.harvard.edu
We introduce an Ordinary Differential Equation (ODE)-based deep generative
method for learning a conditional distribution, named the Conditional Föllmer Flow. Starting …
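
Such ODE-based samplers integrate a learned velocity field from noise to data. A schematic Euler discretization in PyTorch; the untrained field v below is a placeholder for the trained network, not the Conditional Föllmer Flow construction itself.

import torch
import torch.nn as nn

# Placeholder velocity field v(x, t, y) for a conditional flow.
v = nn.Sequential(nn.Linear(4, 64), nn.Tanh(), nn.Linear(64, 2))

@torch.no_grad()
def sample(y, n_steps=100):
    """Euler-discretized sampler for dx/dt = v(x, t, y), x(0) ~ N(0, I)."""
    x = torch.randn(len(y), 2)                  # initial noise
    dt = 1.0 / n_steps
    for k in range(n_steps):
        t = torch.full((len(y), 1), k * dt)
        x = x + dt * v(torch.cat([x, t, y], dim=1))
    return x

y = torch.randn(8, 1)                           # conditioning variable
print(sample(y).shape)                          # torch.Size([8, 2])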

On the capacity of deep generative networks for approximating distributions

Y Yang, Z Li, Y Wang - Neural networks, 2022 - Elsevier
We study the efficacy and efficiency of deep generative networks for approximating
probability distributions. We prove that neural networks can transform a low-dimensional …
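
In one dimension the transforming map is explicit: the quantile function pushes a uniform source onto the target, and the network's job is to approximate such a map. A worked NumPy/SciPy instance with the inverse CDF standing in for the trained network:

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Push a low-dimensional (here 1-D) uniform source onto a Gaussian target
# via the quantile transform; a network would approximate this map.
z = rng.uniform(size=5000)          # source samples
x = norm.ppf(z)                     # push-forward samples
print(x.mean(), x.std())            # approximately 0 and 1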

On statistical rates and provably efficient criteria of latent diffusion transformers (DiTs)

JYC Hu, W Wu, Z Song, H Liu - arXiv preprint arXiv:2407.01079, 2024 - arxiv.org
We investigate the statistical and computational limits of latent Diffusion Transformers (DiTs)
under the low-dimensional linear latent space assumption. Statistically, we study the …
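
A small NumPy illustration of the low-dimensional linear latent space assumption (ambient data x = U z with orthonormal U and latent dimension far below the ambient one); the dimensions below are arbitrary, and the sketch only shows that the subspace is visible in the spectrum.

import numpy as np

rng = np.random.default_rng(0)

D, d0, n = 64, 4, 1000
U, _ = np.linalg.qr(rng.normal(size=(D, d0)))   # orthonormal columns
Z = rng.normal(size=(n, d0))                    # latent samples
X = Z @ U.T                                     # data in a d0-dim subspace

# Only d0 singular values are (numerically) nonzero.
s = np.linalg.svd(X, compute_uv=False)
print(np.round(s[:6], 2))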

Classification logit two-sample testing by neural networks for differentiating near manifold densities

X Cheng, A Cloninger - IEEE transactions on information theory, 2022 - ieeexplore.ieee.org
The recent success of generative adversarial networks and variational learning suggests
that training a classification network may work well in addressing the classical two-sample …
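
The classification approach trains a network to tell the two samples apart and reads off a test statistic from its logits. A minimal scikit-learn sketch with a linear model standing in for the neural network; sample splitting and threshold calibration, which a valid test needs, are omitted.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

p = rng.normal(0.0, 1.0, size=(500, 5))   # sample one
q = rng.normal(0.2, 1.0, size=(500, 5))   # sample two (mean-shifted)
X = np.vstack([p, q])
y = np.r_[np.zeros(500), np.ones(500)]

# Train a classifier and use the gap in mean logits as the statistic.
clf = LogisticRegression().fit(X, y)
logits = X @ clf.coef_.ravel() + clf.intercept_[0]
stat = logits[y == 1].mean() - logits[y == 0].mean()
print("logit-gap statistic:", stat)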

Double generative adversarial networks for conditional independence testing

C Shi, T Xu, W Bergsma, L Li - Journal of Machine Learning Research, 2021 - jmlr.org
In this article, we study the problem of high-dimensional conditional independence testing, a
key building block in statistics and machine learning. We propose an inferential procedure …
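
A heavily simplified sketch of the generate-then-compare idea behind such tests: draw pseudo-copies of X from an estimated conditional law given Z and compare a dependence statistic on the observed data with its distribution over the pseudo-copies. The linear Gaussian sampler below is a stand-in for the fitted GAN generators, and the statistic and calibration differ from the paper's procedure.

import numpy as np

rng = np.random.default_rng(0)

n = 2000
Z = rng.normal(size=n)
X = Z + 0.1 * rng.normal(size=n)
Y = Z + 0.1 * rng.normal(size=n)        # X independent of Y given Z

def sample_x_given_z(z):                # stand-in conditional generator
    return z + 0.1 * rng.normal(size=len(z))

# Compare the observed statistic with its null distribution under
# resampling X from the (estimated) conditional law given Z.
t_obs = abs(np.corrcoef(X, Y)[0, 1])
t_null = np.array([abs(np.corrcoef(sample_x_given_z(Z), Y)[0, 1])
                   for _ in range(200)])
print("p-value:", (t_null >= t_obs).mean())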