Hanson-Wright inequality and sub-Gaussian concentration

M Rudelson, R Vershynin - 2013 - projecteuclid.org
In this expository note, we give a modern proof of the Hanson-Wright inequality for quadratic
forms in sub-Gaussian random variables. We deduce a useful concentration inequality for …
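
For reference, a standard form of the inequality proved in this note (up to the choice of absolute constants): if $X = (X_1, \dots, X_n)$ has independent, mean-zero, sub-Gaussian coordinates with $\|X_i\|_{\psi_2} \le K$ and $A$ is an $n \times n$ matrix, then for every $t \ge 0$,

$\Pr\left(|X^\top A X - \mathbb{E}\,X^\top A X| > t\right) \le 2\exp\left(-c\min\left(\frac{t^2}{K^4\|A\|_F^2}, \frac{t}{K^2\|A\|}\right)\right),$

where $\|A\|_F$ is the Frobenius norm, $\|A\|$ the operator norm, and $c > 0$ an absolute constant.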

Discriminative K-SVD for dictionary learning in face recognition

Q Zhang, B Li - 2010 IEEE computer society conference on …, 2010 - ieeexplore.ieee.org
In a sparse-representation-based face recognition scheme, the desired dictionary should
have good representational power (i.e., being able to span the subspace of all faces) while …
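
As an illustration of the classification step that sparse-representation schemes like this build on (a generic sketch, not the D-KSVD training procedure itself; the dictionary D and the class label of each atom are assumed given):

    import numpy as np
    from sklearn.linear_model import OrthogonalMatchingPursuit

    def classify(x, D, atom_labels, sparsity=5):
        """Sparse-code x over dictionary D (atoms as columns), then pick the
        class whose atoms yield the smallest reconstruction residual."""
        omp = OrthogonalMatchingPursuit(n_nonzero_coefs=sparsity, fit_intercept=False)
        omp.fit(D, x)
        alpha = omp.coef_  # sparse code of x
        classes = np.unique(atom_labels)
        residuals = [np.linalg.norm(x - D[:, atom_labels == c] @ alpha[atom_labels == c])
                     for c in classes]
        return classes[int(np.argmin(residuals))]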

Sparser Johnson-Lindenstrauss transforms

DM Kane, J Nelson - Journal of the ACM (JACM), 2014 - dl.acm.org
We give two different and simple constructions for dimensionality reduction in $\ell_2$ via linear
mappings that are sparse: only an $O(\varepsilon)$-fraction of entries in each column of our embedding …
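
A simplified sketch of such a sparse embedding (assumption: this uniform variant, with exactly s random signed nonzeros per column scaled by $1/\sqrt{s}$, stands in for the paper's block construction, which places one nonzero per block of m/s rows):

    import numpy as np

    def sparse_jl_matrix(m, n, s, rng=None):
        """m x n embedding matrix with s nonzeros per column."""
        rng = rng or np.random.default_rng()
        S = np.zeros((m, n))
        for j in range(n):
            rows = rng.choice(m, size=s, replace=False)                # nonzero positions
            S[rows, j] = rng.choice([-1.0, 1.0], size=s) / np.sqrt(s)  # random signs
        return S

    # Embed x in R^n down to R^m while approximately preserving ||x||_2:
    # y = sparse_jl_matrix(256, 10000, s=8) @ x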

Near-optimal cryptographic hardness of agnostically learning halfspaces and ReLU regression under Gaussian marginals

I Diakonikolas, D Kane, L Ren - International Conference on …, 2023 - proceedings.mlr.press
We study the task of agnostically learning halfspaces under the Gaussian distribution.
Specifically, given labeled examples $(\mathbf{x}, y)$ from an unknown distribution on …
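
For context, the agnostic guarantee at stake here and in the next entries: given samples $(\mathbf{x}, y)$ with $\mathbf{x} \sim \mathcal{N}(0, I_d)$ and arbitrary labels, the learner must output a hypothesis $h$ with $\Pr[h(\mathbf{x}) \neq y] \le \mathrm{opt} + \varepsilon$, where $\mathrm{opt}$ is the error of the best function in the target class (halfspaces, or ReLUs under square loss in the regression variant). The results below show this guarantee is hard to achieve efficiently, under cryptographic and SQ assumptions respectively.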

Reliably learning the ReLU in polynomial time

S Goel, V Kanade, A Klivans… - Conference on Learning …, 2017 - proceedings.mlr.press
We give the first dimension-efficient algorithms for learning Rectified Linear Units (ReLUs),
which are functions of the form $\mathbf{x} \mapsto \max(0, \mathbf{w} \cdot \mathbf{x})$ …
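
To make the learning problem concrete, a plain squared-loss gradient-descent baseline for ReLU regression (assumption: this only illustrates the objective and is not the paper's algorithm, which achieves its guarantees via a convex formulation):

    import numpy as np

    def fit_relu(X, y, lr=0.1, steps=1000):
        """Fit w to minimize mean (max(0, w.x) - y)^2 by gradient descent."""
        n, d = X.shape
        w = np.zeros(d)
        for _ in range(steps):
            z = X @ w
            pred = np.maximum(0.0, z)                 # ReLU output
            grad = X.T @ ((pred - y) * (z > 0)) / n   # gradient through the max
            w -= lr * grad
        return w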

Near-optimal SQ lower bounds for agnostically learning halfspaces and ReLUs under Gaussian marginals

I Diakonikolas, D Kane, N Zarifis - Advances in Neural …, 2020 - proceedings.neurips.cc
We study the fundamental problems of agnostically learning halfspaces and ReLUs under
Gaussian marginals. In the former problem, given labeled examples $(\mathbf{x}, y)$ from an …

The optimality of polynomial regression for agnostic learning under Gaussian marginals in the SQ model

I Diakonikolas, DM Kane, T Pittas… - … on Learning Theory, 2021 - proceedings.mlr.press
We study the problem of agnostic learning under the Gaussian distribution in the Statistical
Query (SQ) model. We develop a method for finding hard families of examples for a wide …
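
A sketch of the algorithm in question, degree-d $L_1$ polynomial regression followed by thresholding (assumption: this simplified version thresholds at zero, solves the $L_1$ fit as a linear program, and takes labels in $\{-1, +1\}$):

    import numpy as np
    from scipy.optimize import linprog
    from sklearn.preprocessing import PolynomialFeatures

    def l1_poly_regression(X, y, degree=2):
        """Fit a degree-d polynomial minimizing empirical L1 loss, predict its sign."""
        feats = PolynomialFeatures(degree)
        Phi = feats.fit_transform(X)      # all monomials up to degree d
        n, p = Phi.shape
        # LP over (c, t): minimize sum(t) subject to |Phi c - y| <= t elementwise.
        obj = np.concatenate([np.zeros(p), np.ones(n)])
        A_ub = np.block([[Phi, -np.eye(n)], [-Phi, -np.eye(n)]])
        b_ub = np.concatenate([y, -y])
        bounds = [(None, None)] * p + [(0, None)] * n
        coef = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds).x[:p]
        return lambda Xnew: np.sign(feats.transform(Xnew) @ coef)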

Statistical-query lower bounds via functional gradients

S Goel, A Gollakota, A Klivans - Advances in Neural …, 2020 - proceedings.neurips.cc
We give the first statistical-query lower bounds for agnostically learning any non-polynomial
activation with respect to Gaussian marginals (e.g., ReLU, sigmoid, sign). For the specific …

Efficient testable learning of halfspaces with adversarial label noise

I Diakonikolas, D Kane, V Kontonis… - Advances in Neural …, 2024 - proceedings.neurips.cc
We give the first polynomial-time algorithm for the testable learning of halfspaces in the
presence of adversarial label noise under the Gaussian distribution. In the recently …
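
In that framework (Rubinfeld and Vasilyan, 2022), one must, roughly, give a tester-learner pair: the tester accepts with high probability whenever the marginal truly is $\mathcal{N}(0, I_d)$, and whenever the tester accepts, even on an arbitrary input distribution, the learner's hypothesis $h$ must satisfy $\mathrm{err}(h) \le \mathrm{opt} + \varepsilon$.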

A moment-matching approach to testable learning and a new characterization of Rademacher complexity

A Gollakota, AR Klivans, PK Kothari - Proceedings of the 55th Annual …, 2023 - dl.acm.org
A remarkable recent paper by Rubinfeld and Vasilyan (2022) initiated the study of testable
learning, where the goal is to replace hard-to-verify distributional assumptions (such as …
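
The moment-matching idea lends itself to a very small sketch (assumption: this degree-2 check of mean and covariance is only an illustration; the actual tester matches moments up to a degree that grows with the target accuracy):

    import numpy as np

    def passes_degree2_moment_test(X, tol=0.05):
        """Accept if empirical mean and covariance are close to those of N(0, I)."""
        n, d = X.shape
        mean_ok = np.linalg.norm(X.mean(axis=0)) <= tol                      # E[x] ~ 0
        cov_ok = np.linalg.norm(np.cov(X.T, bias=True) - np.eye(d)) <= tol   # E[xx^T] ~ I
        return bool(mean_ok and cov_ok)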