A closer look at accuracy vs. robustness
Current methods for training robust networks lead to a drop in test accuracy, which has led
prior works to posit that a robustness-accuracy tradeoff may be inevitable in deep learning …
Certified robustness via dynamic margin maximization and improved Lipschitz regularization
To improve the robustness of deep classifiers against adversarial perturbations, many
approaches have been proposed, such as designing new architectures with better …
Robustness in deep learning: The good (width), the bad (depth), and the ugly (initialization)
We study the average robustness notion in deep neural networks in (selected) wide and
narrow, deep and shallow, as well as lazy and non-lazy training settings. We prove that in …
DECODE: Deep confidence network for robust image classification
Recent years have witnessed the success of deep convolutional neural networks for image
classification and many related tasks. It should be pointed out that the existing training …
Rethinking Lipschitz neural networks and certified robustness: A Boolean function perspective
Designing neural networks with bounded Lipschitz constant is a promising way to obtain
certifiably robust classifiers against adversarial examples. However, the relevant progress …
Direct parameterization of Lipschitz-bounded deep networks
R Wang, I Manchester - International Conference on …, 2023 - proceedings.mlr.press
This paper introduces a new parameterization of deep neural networks (both fully-connected
and convolutional) with guaranteed $\ell^2$ Lipschitz bounds, i.e., limited sensitivity to input …
A systematic review of robustness in deep learning for computer vision: Mind the gap?
Deep neural networks for computer vision are deployed in increasingly safety-critical and
socially-impactful applications, motivating the need to close the gap in model performance …
The robustness of deep networks: A geometrical perspective
A Fawzi, SM Moosavi-Dezfooli… - IEEE Signal Processing …, 2017 - ieeexplore.ieee.org
Deep neural networks have recently shown impressive classification performance on a
diverse set of visual tasks. When deployed in real-world (noise-prone) environments, it is …
Almost-orthogonal layers for efficient general-purpose lipschitz networks
B Prach, CH Lampert - European Conference on Computer Vision, 2022 - Springer
It is a highly desirable property for deep networks to be robust against small input changes.
One popular way to achieve this property is by designing networks with a small Lipschitz …
Why robust generalization in deep learning is difficult: Perspective of expressive power
It is well-known that modern neural networks are vulnerable to adversarial examples. To
mitigate this problem, a series of robust learning algorithms have been proposed. However …