KAN: Kolmogorov-Arnold Networks

Z Liu, Y Wang, S Vaidya, F Ruehle, J Halverson… - arXiv preprint arXiv …, 2024 - arxiv.org
Inspired by the Kolmogorov-Arnold representation theorem, we propose Kolmogorov-Arnold
Networks (KANs) as promising alternatives to Multi-Layer Perceptrons (MLPs). While MLPs …
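For context, a standard statement of the Kolmogorov-Arnold representation theorem referenced in this abstract (paraphrased here, not quoted from the paper) is that every continuous function f : [0,1]^n -> R admits the decomposition
  f(x_1, \dots, x_n) = \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right)
for continuous univariate functions \Phi_q and \phi_{q,p}.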

Automated categorization of multiclass welding defects using the X-ray image augmentation and convolutional neural network

D Say, S Zidi, SM Qaisar, M Krichen - Sensors, 2023 - mdpi.com
The detection of weld defects using X-rays is an important task in industry. It requires
trained specialists with the expertise to conduct a timely inspection, which is costly and …

What kinds of functions do deep neural networks learn? Insights from variational spline theory

R Parhi, RD Nowak - SIAM Journal on Mathematics of Data Science, 2022 - SIAM
We develop a variational framework to understand the properties of functions learned by
fitting deep neural networks with rectified linear unit (ReLU) activations to data. We propose …

When deep learning meets polyhedral theory: A survey

J Huchette, G Muñoz, T Serra, C Tsay - arXiv preprint arXiv:2305.00241, 2023 - arxiv.org
Over the past decade, deep learning has become the prevalent methodology for predictive
modeling thanks to the remarkable accuracy of deep neural networks in tasks such as …

Three Decades of Activations: A Comprehensive Survey of 400 Activation Functions for Neural Networks

V Kunc, J Kléma - arXiv preprint arXiv:2402.09092, 2024 - arxiv.org
Neural networks have proven to be a highly effective tool for solving complex problems in
many areas of life. Recently, their importance and practical usability have further been …

Learning weakly convex regularizers for convergent image-reconstruction algorithms

A Goujon, S Neumayer, M Unser - SIAM Journal on Imaging Sciences, 2024 - SIAM
We propose to learn non-convex regularizers with a prescribed upper bound on their weak-
convexity modulus. Such regularizers give rise to variational denoisers that minimize a …

A neural-network-based convex regularizer for inverse problems

A Goujon, S Neumayer, P Bohra… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
The emergence of deep-learning-based methods to solve image-reconstruction problems
has enabled a significant increase in quality. Unfortunately, these new methods often lack …

Improving Lipschitz-constrained neural networks by learning activation functions

S Ducotterd, A Goujon, P Bohra, D Perdios… - Journal of Machine …, 2024 - jmlr.org
Lipschitz-constrained neural networks have several advantages over unconstrained ones
and can be applied to a variety of problems, making them a topic of attention in the deep …

Scalable Bayesian uncertainty quantification with data-driven priors for radio interferometric imaging

TI Liaudat, M Mars, MA Price, M Pereyra… - RAS Techniques …, 2024 - academic.oup.com
Next-generation radio interferometers like the Square Kilometre Array have the potential to
unlock scientific discoveries thanks to their unprecedented angular resolution and …

ExSpliNet: An interpretable and expressive spline-based neural network

D Fakhoury, E Fakhoury, H Speleers - Neural Networks, 2022 - Elsevier
In this paper we present ExSpliNet, an interpretable and expressive neural network model.
The model combines ideas of Kolmogorov neural networks, ensembles of probabilistic trees …