Low-rank tensor methods for partial differential equations
M Bachmayr - Acta Numerica, 2023 - cambridge.org
Low-rank tensor representations can provide highly compressed approximations of
functions. These concepts, which essentially amount to generalizations of classical …
Learning with tree tensor networks: complexity estimates and model selection
A. Practical implementation: In this section, we present some practical aspects for the
implementation of our model selection strategy, that is, a method for the calibration of the …
Active learning of tree tensor networks using optimal least squares
In this paper, we propose new learning algorithms for approximating high-dimensional
functions using tree tensor networks in a least-squares setting. Given a dimension tree or …
Weighted sparsity and sparse tensor networks for least squares approximation
Approximation of high-dimensional functions is a problem in many scientific fields that is
only feasible if advantageous structural properties, such as sparsity in a given basis, can be …
Breaking the Curse of Dimensionality with Distributed Neural Computation
We present a theoretical approach to overcome the curse of dimensionality using a neural
computation algorithm which can be distributed across several machines. Our modular …
Some Thoughts on Compositional Tensor Networks
R Schneider, M Oster - Multiscale, Nonlinear and Adaptive Approximation …, 2024 - Springer
In these notes we present some first ideas on the composition of tensor trains for the use in
scientific computing. We discuss the relation to deep neural networks and the potential role …
Piecewise Polynomial Tensor Network Quantum Feature Encoding
This work introduces a novel method for embedding continuous variables into quantum
circuits via piecewise polynomial features, utilizing low-rank tensor networks. Our approach …
Learning high-dimensional probability distributions using tree tensor networks
We consider the problem of the estimation of a high-dimensional probability distribution from
iid samples of the distribution using model classes of functions in tree-based tensor formats …
Approximation Rates and VC-Dimension Bounds for (P)ReLU MLP Mixture of Experts
Abstract Mixture-of-Experts (MoEs) can scale up beyond traditional deep learning models by
employing a routing strategy in which each input is processed by a single “expert” deep …
Analysis of variational quantum algorithms for differential equations in the presence of quantum noise: application to the stationary Gross-Pitaevskii equation
MF Serret - 2024 - theses.hal.science
Variational quantum algorithms (VQAs) have been proposed for solving partial differential
equations on quantum computers. This thesis focuses on analyzing VQAs for the stationary …