Deep ReLU neural networks in high-dimensional approximation
D Dũng - Neural Networks, 2021 - Elsevier
We study the computation complexity of deep ReLU (Rectified Linear Unit) neural networks
for the approximation of functions from the Hölder–Zygmund space of mixed smoothness …
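Results of this kind rest on explicit ReLU constructions. A minimal sketch (my illustration, not this paper's construction; it is the standard Yarotsky-style argument for approximating f(x) = x² on [0, 1], with hypothetical helper names `hat` and `approx_square`) showing how composing a ReLU "sawtooth" gives error decaying geometrically in depth:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Sawtooth "hat" on [0, 1] built from three ReLUs:
    # g(x) = 2x on [0, 1/2] and 2 - 2x on [1/2, 1].
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

def approx_square(x, depth):
    # Yarotsky's identity: x^2 = x - sum_{s>=1} g^{(s)}(x) / 4^s,
    # where g^{(s)} is the s-fold composition of the hat function.
    # Truncating after `depth` terms gives error O(4^{-depth}).
    out, g = x.copy(), x.copy()
    for s in range(1, depth + 1):
        g = hat(g)           # one more composed (deeper) layer
        out -= g / 4.0**s
    return out

x = np.linspace(0.0, 1.0, 1001)
for depth in (2, 4, 8):
    err = np.max(np.abs(approx_square(x, depth) - x**2))
    print(f"depth={depth}: max error = {err:.2e}")
```

Each extra term costs one more layer of fixed width, which is the depth-versus-accuracy trade-off that complexity bounds like the ones above quantify.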
Simultaneous approximation of a smooth function and its derivatives by deep neural networks with piecewise-polynomial activations
This paper investigates the approximation properties of deep neural networks with
piecewise-polynomial activation functions. We derive the required depth, width, and sparsity …
Function and derivative approximation by shallow neural networks
Y Li, S Lu - arXiv preprint arXiv:2407.05078, 2024 - arxiv.org
We investigate a Tikhonov regularization scheme specifically tailored for shallow neural
networks within the context of solving a classic inverse problem: approximating an unknown …
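For orientation, a generic Tikhonov-regularized fit of a one-hidden-layer ReLU network looks as follows (a crude sketch under my own assumptions, not the paper's tailored scheme: the penalty weight `lam`, the helpers `net` and `train`, and the sin target are all hypothetical stand-ins):

```python
import numpy as np

rng = np.random.default_rng(0)

def net(params, x):
    # Shallow (one hidden layer) ReLU network with m neurons.
    W, b, c = params
    return np.maximum(x[:, None] * W + b, 0.0) @ c

def loss(params, x, y, lam):
    # Tikhonov-regularized least squares: data misfit plus
    # lam * squared norm of all weights (a generic form; the
    # paper's scheme may weight or structure the penalty differently).
    W, b, c = params
    r = net(params, x) - y
    return np.mean(r**2) + lam * (W @ W + b @ b + c @ c)

def train(x, y, m=20, lam=1e-3, lr=0.05, steps=2000, eps=1e-5):
    # Finite-difference gradient descent: crude but dependency-free.
    theta = rng.normal(scale=0.5, size=3 * m)
    unpack = lambda t: (t[:m], t[m:2 * m], t[2 * m:])
    for _ in range(steps):
        g = np.empty_like(theta)
        f0 = loss(unpack(theta), x, y, lam)
        for i in range(len(theta)):
            t = theta.copy(); t[i] += eps
            g[i] = (loss(unpack(t), x, y, lam) - f0) / eps
        theta -= lr * g
    return unpack(theta)

x = np.linspace(-1, 1, 50)
y = np.sin(np.pi * x)            # stand-in target; the paper's setting
params = train(x, y)             # is an inverse problem, not direct fitting
print(np.max(np.abs(net(params, x) - y)))
```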
Computation complexity of deep ReLU neural networks in high-dimensional approximation
D Dũng, VK Nguyen, MX Thao - arXiv preprint arXiv:2103.00815, 2021 - arxiv.org
The purpose of the present paper is to study the computation complexity of deep ReLU
neural networks to approximate functions in Hölder–Nikol'skii spaces of mixed smoothness …
Collocation approximation by deep neural ReLU networks for parametric elliptic PDEs with lognormal inputs
D Dũng - arXiv preprint arXiv:2111.05504, 2021 - arxiv.org
We obtained convergence rates of the collocation approximation by deep ReLU neural
networks of solutions to elliptic PDEs with lognormal inputs, parametrized by …
Approximation theory of tree tensor networks: Tensorized univariate functions
We study the approximation of univariate functions by combining tensorization of functions
with tensor trains (TTs), a commonly used type of tensor networks (TNs). Lebesgue $L^p$ …
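To make the tensorization idea concrete: a function sampled on a dyadic grid of 2^L points can be reshaped into an L-way tensor indexed by the binary digits of x and compressed by sequential truncated SVDs (the standard TT-SVD algorithm). A minimal sketch, with hypothetical helper names `tensor_train` and `tt_eval`, and not this paper's analysis:

```python
import numpy as np

def tensor_train(values, max_rank):
    # `values` holds f on a grid of 2**L points; reshape into an
    # L-way tensor with one binary index per mode, then compress by
    # sweeping left to right with rank-truncated SVDs.
    L = int(np.log2(len(values)))
    cores, mat, rank = [], values.reshape(1, -1), 1
    for _ in range(L - 1):
        mat = mat.reshape(rank * 2, -1)
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(U[:, :r].reshape(rank, 2, r))
        mat = s[:r, None] * Vt[:r]
        rank = r
    cores.append(mat.reshape(rank, 2, 1))
    return cores

def tt_eval(cores, bits):
    # Contract the train at one grid point given its binary digits.
    v = np.ones((1,))
    for core, b in zip(cores, bits):
        v = v @ core[:, b, :]
    return v[0]

L = 10
x = np.arange(2**L) / 2**L
cores = tensor_train(np.exp(x), max_rank=4)   # smooth f -> low TT ranks
bits = [int(b) for b in format(37, f"0{L}b")]
print(tt_eval(cores, bits), np.exp(x[37]))    # should nearly agree
```

The point is that smoothness of f translates into fast decay of the singular values at every split, so a few ranks per core already reproduce the grid values; Lebesgue-space error bounds of the kind studied above quantify this.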
Differentiable neural networks with RePU activation: with applications to score estimation and isotonic regression
We study the properties of differentiable neural networks activated by rectified power unit
(RePU) functions. We show that the partial derivatives of RePU neural networks can be …
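A minimal sketch of the property the snippet alludes to (my illustration, with hypothetical names `repu` and `repu_grad`, not the paper's construction): the rectified power unit σ_p(x) = max(0, x)^p is (p−1)-times continuously differentiable, and its derivative is again a scaled RePU of lower order, so derivatives of RePU networks stay within the same network class:

```python
import numpy as np

def repu(x, p=2):
    # Rectified power unit: sigma_p(x) = max(0, x)**p.
    # For p >= 2 it is (p-1)-times continuously differentiable at 0,
    # unlike ReLU (the p = 1 case), which is not differentiable there.
    return np.maximum(x, 0.0) ** p

def repu_grad(x, p=2):
    # d/dx sigma_p(x) = p * max(0, x)**(p-1): a scaled RePU of order p-1,
    # so differentiating a RePU network yields another RePU-type network.
    return p * np.maximum(x, 0.0) ** (p - 1)

x = np.linspace(-1.0, 1.0, 5)     # [-1, -0.5, 0, 0.5, 1]
print(repu(x))                    # [0, 0, 0, 0.25, 1]
print(repu_grad(x))               # [0, 0, 0, 1.0, 2.0]
```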
Theoretical Insights into CycleGAN: Analyzing Approximation and Estimation Errors in Unpaired Data Generation
In this paper, we focus on analyzing the excess risk of the unpaired data generation model,
called CycleGAN. Unlike classical GANs, CycleGAN not only transforms data between two …
Deep ReLU neural network approximation in Bochner spaces and applications to parametric PDEs
D Dũng, DT Pham - Journal of Complexity, 2023 - Elsevier
We investigate non-adaptive methods of deep ReLU neural network approximation in
Bochner spaces $L_2(U^\infty, X, \mu)$ of functions on $U^\infty$ taking values in a separable Hilbert …