A survey on statistical theory of deep learning: Approximation, training dynamics, and generative models
In this article, we review the literature on statistical theories of neural networks from three
perspectives: approximation, training dynamics, and generative models. In the first part …
Score approximation, estimation and distribution recovery of diffusion models on low-dimensional data
Diffusion models achieve state-of-the-art performance in various generation tasks. However,
their theoretical foundations fall far behind. This paper studies score approximation …
How well generative adversarial networks learn distributions
T Liang - Journal of Machine Learning Research, 2021 - jmlr.org
This paper studies the rates of convergence for learning distributions implicitly with the
adversarial framework and Generative Adversarial Networks (GANs), which subsume …
An error analysis of generative adversarial networks for learning distributions
This paper studies how well generative adversarial networks (GANs) learn probability
distributions from finite samples. Our main results establish the convergence rates of GANs …
Statistical guarantees for generative models without domination
N Schreuder, VE Brunel… - Algorithmic Learning …, 2021 - proceedings.mlr.press
In this paper, we introduce a convenient framework for studying (adversarial) generative
models from a statistical perspective. It consists in modeling the generative device as a …
Deep conditional generative learning: Model and error analysis
Abstract We introduce an Ordinary Differential Equation (ODE) based deep generative
method for learning a conditional distribution, named the Conditional Föllmer Flow. Starting …
On the capacity of deep generative networks for approximating distributions
We study the efficacy and efficiency of deep generative networks for approximating
probability distributions. We prove that neural networks can transform a low-dimensional …
On statistical rates and provably efficient criteria of latent diffusion transformers (DiTs)
We investigate the statistical and computational limits of latent Diffusion Transformers (DiTs)
under the low-dimensional linear latent space assumption. Statistically, we study the …
Classification logit two-sample testing by neural networks for differentiating near manifold densities
X Cheng, A Cloninger - IEEE Transactions on Information Theory, 2022 - ieeexplore.ieee.org
The recent success of generative adversarial networks and variational learning suggests
that training a classification network may work well in addressing the classical two-sample …
Double generative adversarial networks for conditional independence testing
In this article, we study the problem of high-dimensional conditional independence testing, a
key building block in statistics and machine learning. We propose an inferential procedure …