Rethinking Lipschitz neural networks and certified robustness: A Boolean function perspective
Designing neural networks with bounded Lipschitz constant is a promising way to obtain
certifiably robust classifiers against adversarial examples. However, the relevant progress …
Three Decades of Activations: A Comprehensive Survey of 400 Activation Functions for Neural Networks
V Kunc, J Kléma - arXiv preprint arXiv:2402.09092, 2024 - arxiv.org
Neural networks have proven to be a highly effective tool for solving complex problems in
many areas of life. Recently, their importance and practical usability have further been …
Learning weakly convex regularizers for convergent image-reconstruction algorithms
We propose to learn non-convex regularizers with a prescribed upper bound on their weak-
convexity modulus. Such regularizers give rise to variational denoisers that minimize a …
A neural-network-based convex regularizer for inverse problems
The emergence of deep-learning-based methods to solve image-reconstruction problems
has enabled a significant increase in quality. Unfortunately, these new methods often lack …
Improving Lipschitz-constrained neural networks by learning activation functions
Lipschitz-constrained neural networks have several advantages over unconstrained ones
and can be applied to a variety of problems, making them a topic of attention in the deep …
On the number of regions of piecewise linear neural networks
Many feedforward neural networks (NNs) generate continuous and piecewise-linear
(CPWL) mappings. Specifically, they partition the input domain into regions on which the …
A neural-network-based convex regularizer for image reconstruction
The emergence of deep-learning-based methods for solving inverse problems has enabled
a significant increase in reconstruction quality. Unfortunately, these new methods often lack …
Provably convergent plug-and-play quasi-Newton methods
Plug-and-Play (PnP) methods are a class of efficient iterative methods that aim to combine
data fidelity terms and deep denoisers using classical optimization algorithms, such as ISTA …
ABBA neural networks: Coping with positivity, expressivity, and robustness
We introduce ABBA networks, a novel class of (almost) nonnegative neural networks, which
are shown to possess a series of appealing properties. In particular, we demonstrate that …
Compact: Approximating complex activation functions for secure computation
Secure multi-party computation (MPC) techniques can be used to provide data privacy when
users query deep neural network (DNN) models hosted on a public cloud. State-of-the-art …