Recent advances in algorithmic high-dimensional robust statistics
I Diakonikolas, DM Kane - arXiv preprint arXiv:1911.05911, 2019 - arxiv.org
Learning in the presence of outliers is a fundamental problem in statistics. Until recently, all
known efficient unsupervised learning algorithms were very sensitive to outliers in high …
Poisoning attacks and countermeasures in intelligent networks: Status quo and prospects
Over the past years, the emergence of intelligent networks empowered by machine learning
techniques has brought great benefits to different aspects of human life. However, using …
Spectral signatures in backdoor attacks
A recent line of work has uncovered a new form of data poisoning: so-called backdoor
attacks. These attacks are particularly dangerous because they do not affect a network's …
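The "spectral signature" in this entry refers to the observation that backdoored training examples tend to stand out along the top singular direction of the learned feature representations. The snippet does not spell out the procedure; the following is a minimal sketch of that general idea (not the paper's exact algorithm), assuming NumPy and a synthetic feature matrix in which a small shifted cluster plays the role of poisoned samples:

```python
import numpy as np

def spectral_outlier_scores(feats):
    """Score each sample by its squared projection onto the top
    singular direction of the centered feature matrix."""
    centered = feats - feats.mean(axis=0)
    # Top right-singular vector of the centered features.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return (centered @ vt[0]) ** 2

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, size=(95, 10))
poisoned = rng.normal(0.0, 1.0, size=(5, 10)) + 6.0  # shifted cluster
feats = np.vstack([clean, poisoned])

scores = spectral_outlier_scores(feats)
flagged = np.argsort(scores)[-5:]  # the 5 highest-scoring samples
```

In this synthetic setup the flagged indices recover the shifted cluster; in practice the scores are computed per class on a network's penultimate-layer activations.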
Gradient descent with early stopping is provably robust to label noise for overparameterized neural networks
Modern neural networks are typically trained in an over-parameterized regime where the
parameters of the model far exceed the size of the training data. Such neural networks in …
Learning with bad training data via iterative trimmed loss minimization
Y Shen, S Sanghavi - International conference on machine …, 2019 - proceedings.mlr.press
In this paper, we study a simple and generic framework to tackle the problem of learning
model parameters when a fraction of the training samples are corrupted. Our approach is …
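The "iterative trimmed loss" framework alternates between fitting the model on a kept subset and re-selecting the samples with smallest loss. A minimal sketch of that loop for least-squares regression (an illustrative instance, with hypothetical parameter names, not the authors' exact implementation), assuming NumPy:

```python
import numpy as np

def iterative_trimmed_least_squares(X, y, keep_frac=0.8, iters=10):
    """Alternate: fit least squares on the kept subset, then keep the
    keep_frac fraction of samples with smallest squared residual."""
    n = len(y)
    k = int(keep_frac * n)
    keep = np.arange(n)  # start by keeping everything
    for _ in range(iters):
        w, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        resid = (X @ w - y) ** 2
        keep = np.argsort(resid)[:k]  # trim the largest residuals
    return w, keep

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=100)
y[:10] += 20.0  # corrupt 10% of the labels

w, keep = iterative_trimmed_least_squares(X, y)
```

With 10% corruption and an 80% keep fraction, the corrupted samples acquire large residuals after the first fit and are trimmed on subsequent rounds, so the final estimate is close to the clean solution.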
Byzantine stochastic gradient descent
This paper studies the problem of distributed stochastic optimization in an adversarial setting
where, out of $ m $ machines which allegedly compute stochastic gradients every iteration …
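In this Byzantine setting, a fraction of the $m$ machines may report arbitrary gradients. The paper's own aggregation scheme is more involved, but a standard Byzantine-robust aggregation rule used in this literature is the coordinate-wise median, sketched here with NumPy:

```python
import numpy as np

def coordinatewise_median(grads):
    """Aggregate worker gradients by the per-coordinate median, which
    bounds the influence of a minority of Byzantine workers."""
    return np.median(np.stack(grads), axis=0)

# 7 honest workers near the true gradient [1, 2]; 3 Byzantine workers.
honest = [np.array([1.0, 2.0]) + 0.01 * i for i in range(7)]
byzantine = [np.array([100.0, -100.0])] * 3
agg = coordinatewise_median(honest + byzantine)
```

Because fewer than half the workers are Byzantine, each coordinate's median falls inside the range of honest reports, so `agg` stays close to the true gradient despite the adversarial values.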
Robust estimators in high-dimensions without the computational intractability
We study high-dimensional distribution learning in an agnostic setting where an adversary is
allowed to arbitrarily corrupt an ε-fraction of the samples. Such questions have a rich history …
Mean estimation and regression under heavy-tailed distributions: A survey
G Lugosi, S Mendelson - Foundations of Computational Mathematics, 2019 - Springer
We survey some of the recent advances in mean estimation and regression function
estimation. In particular, we describe sub-Gaussian mean estimators for possibly heavy …
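The classic sub-Gaussian mean estimator for heavy-tailed data discussed in this survey is median-of-means: split the sample into blocks, average each block, and take the median of the block means. A minimal sketch, assuming NumPy and Student-t data with finite variance:

```python
import numpy as np

def median_of_means(x, num_blocks=10):
    """Split samples into blocks, average each block, and return the
    median of the block means; concentrates sub-Gaussianly whenever
    the distribution has finite variance."""
    blocks = np.array_split(np.asarray(x), num_blocks)
    return np.median([b.mean() for b in blocks])

rng = np.random.default_rng(2)
heavy = rng.standard_t(df=2.5, size=10000)  # heavy-tailed, mean 0
est = median_of_means(heavy)
```

Unlike the empirical mean, whose deviation is dragged out by rare extreme samples, the median step discards blocks ruined by outliers, which is what yields the sub-Gaussian deviation bounds the survey describes.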
Object pose estimation with statistical guarantees: Conformal keypoint detection and geometric uncertainty propagation
The two-stage object pose estimation paradigm first detects semantic keypoints on the
image and then estimates the 6D pose by minimizing reprojection errors. Despite performing …
Robustness implies privacy in statistical estimation
We study the relationship between adversarial robustness and differential privacy in high-
dimensional algorithmic statistics. We give the first black-box reduction from privacy to …