Robust estimators in high dimensions without the computational intractability
We study high-dimensional distribution learning in an agnostic setting where an adversary is
allowed to arbitrarily corrupt an ε-fraction of the samples. Such questions have a rich history …
Statistical query lower bounds for robust estimation of high-dimensional Gaussians and Gaussian mixtures
I Diakonikolas, DM Kane… - 2017 IEEE 58th Annual …, 2017 - ieeexplore.ieee.org
We describe a general technique that yields the first Statistical Query lower bounds for a
range of fundamental high-dimensional learning problems involving Gaussian distributions …
Robustly learning mixtures of k arbitrary Gaussians
We give a polynomial-time algorithm for the problem of robustly estimating a mixture of k
arbitrary Gaussians in ℝ^d, for any fixed k, in the presence of a constant fraction of arbitrary …
Robustly learning a Gaussian: Getting optimal error, efficiently
We study the fundamental problem of learning the parameters of a high-dimensional
Gaussian in the presence of noise—where an ε-fraction of our samples were chosen by an …
One Gate Makes Distribution Learning Hard
The task of learning a probability distribution from samples is ubiquitous across the natural
sciences. The output distributions of local quantum circuits are of central importance in both …
Robustly learning any clusterable mixture of Gaussians
We study the efficient learnability of high-dimensional Gaussian mixtures in the outlier-
robust setting, where a small constant fraction of the data is adversarially corrupted. We …
Learning geometric concepts with nasty noise
We study the efficient learnability of geometric concept classes—specifically, low-degree
polynomial threshold functions (PTFs) and intersections of halfspaces—when a fraction of …
[BOOK][B] Handbook of big data
This handbook provides a state-of-the-art overview of the analysis of large-scale datasets.
Featuring contributions from statistics and computer science experts in industry and …
A single T-gate makes distribution learning hard
The task of learning a probability distribution from samples is ubiquitous across the natural
sciences. The output distributions of local quantum circuits form a particularly interesting …
Principled approaches to robust machine learning and beyond
JZ Li - 2018 - dspace.mit.edu
As we apply machine learning to more and more important tasks, it becomes increasingly
important that these algorithms are robust to systematic, or worse, malicious, noise. Despite …