Improving Lipschitz-constrained neural networks by learning activation functions

S Ducotterd, A Goujon, P Bohra, D Perdios… - Journal of Machine Learning Research, 2024 - jmlr.org
Lipschitz-constrained neural networks have several advantages over unconstrained ones
and can be applied to a variety of problems, making them a topic of attention in the deep …
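For context (standard background, not part of the truncated snippet): a network f : ℝ^d → ℝ^m is L-Lipschitz when
\[ \| f(x) - f(y) \| \le L \, \| x - y \| \quad \text{for all } x, y \in \mathbb{R}^d , \]
and a common way to enforce such a constraint is to make every layer 1-Lipschitz, since the Lipschitz constant of a composition is bounded by the product of the per-layer constants.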

Ridges, neural networks, and the Radon transform

M Unser - Journal of Machine Learning Research, 2023 - jmlr.org
A ridge is a function that is characterized by a one-dimensional profile (activation) and a
multidimensional direction vector. Ridges appear in the theory of neural networks as …
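As a brief illustration of the definition quoted in the snippet (notation chosen here, not taken from the paper), a ridge with profile ρ : ℝ → ℝ and direction w ∈ ℝ^d is the function
\[ r(x) = \rho\bigl(w^\top x\bigr), \qquad x \in \mathbb{R}^d , \]
so each neuron x ↦ ρ(w^⊤x + b) of a shallow network is a shifted ridge.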

From kernel methods to neural networks: A unifying variational formulation

M Unser - Foundations of Computational Mathematics, 2023 - Springer
The minimization of a data-fidelity term and an additive regularization functional gives rise to
a powerful framework for supervised learning. In this paper, we present a unifying …
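A generic instance of such a formulation (illustrative notation, not the paper's own) is
\[ \min_{f} \; \sum_{i=1}^{N} E\bigl(f(x_i), y_i\bigr) + \lambda \, R(f) , \]
where E is the data-fidelity (loss) term, R the additive regularization functional, and λ > 0 a trade-off parameter; per the title, the choice of R is what spans the range from kernel methods to neural networks.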

Distributional Extension and Invertibility of the k-Plane Transform and Its Dual

R Parhi, M Unser - SIAM Journal on Mathematical Analysis, 2024 - SIAM
We investigate the distributional extension of the k-plane transform in ℝ^d and of related operators.
We parameterize the k-plane domain as the Cartesian product of the Stiefel manifold of …
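For orientation, a standard way to write the k-plane transform (notation mine, not quoted from the paper): parameterizing a k-plane in ℝ^d as {Θu + y : u ∈ ℝ^k}, with Θ an orthonormal k-frame and y in the orthogonal complement of its range, the transform integrates f over that plane,
\[ (\mathrm{P}_k f)(\Theta, y) = \int_{\mathbb{R}^k} f(\Theta u + y)\, \mathrm{d}u , \]
recovering the X-ray transform for k = 1 and the Radon transform for k = d − 1.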

On the effect of initialization: The scaling path of 2-layer neural networks

S Neumayer, L Chizat, M Unser - Journal of Machine Learning Research, 2024 - jmlr.org
In supervised learning, the regularization path is sometimes used as a convenient
theoretical proxy for the optimization path of gradient descent initialized from zero. In this …
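Here the regularization path refers to the λ-indexed family of solutions (generic notation, not the paper's),
\[ \lambda \;\mapsto\; \hat f_\lambda \in \arg\min_{f} \; \sum_{i=1}^{N} \ell\bigl(f(x_i), y_i\bigr) + \lambda \, R(f) , \]
traced out as the regularization strength λ > 0 varies, whereas the optimization path is the trajectory of the gradient-descent iterates initialized from zero.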

Deep Networks are Reproducing Kernel Chains

TJ Heeringa, L Spek, C Brune - arXiv preprint arXiv:2501.03697, 2025 - arxiv.org
Identifying an appropriate function space for deep neural networks remains a key open
question. While shallow neural networks are naturally associated with Reproducing Kernel …

[Book] On ridge splines, neural networks, and variational problems in Radon-domain BV spaces

R Parhi - 2022 - search.proquest.com
Deep neural networks are not well understood mathematically and their success in many
science and engineering applications is usually only backed by empirical evidence. In this …