Hanson-Wright inequality and sub-gaussian concentration
M Rudelson, R Vershynin - 2013 - projecteuclid.org
In this expository note, we give a modern proof of Hanson-Wright inequality for quadratic
forms in sub-gaussian random variables. We deduce a useful concentration inequality for …
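For orientation, the inequality in question has a standard formulation (stated here from the usual textbook form, not from the truncated snippet): for $X = (X_1, \dots, X_n)$ with independent mean-zero sub-gaussian coordinates satisfying $\|X_i\|_{\psi_2} \le K$, and any fixed $n \times n$ matrix $A$,

```latex
\mathbb{P}\left( \left| X^{\mathsf{T}} A X - \mathbb{E}\, X^{\mathsf{T}} A X \right| > t \right)
\le 2 \exp\left( -c \min\left( \frac{t^2}{K^4 \|A\|_{\mathrm{HS}}^2},\; \frac{t}{K^2 \|A\|} \right) \right)
\quad \text{for all } t \ge 0,
```

where $\|A\|_{\mathrm{HS}}$ is the Hilbert–Schmidt (Frobenius) norm, $\|A\|$ the operator norm, and $c > 0$ an absolute constant.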
Discriminative K-SVD for dictionary learning in face recognition
In a sparse-representation-based face recognition scheme, the desired dictionary should
have good representational power (i.e., being able to span the subspace of all faces) while …
Sparser Johnson-Lindenstrauss transforms
We give two different and simple constructions for dimensionality reduction in ℓ2 via linear
mappings that are sparse: only an O(ε)-fraction of entries in each column of our embedding …
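The idea in the snippet — an embedding matrix with only a small fraction of nonzero entries per column — can be illustrated with a minimal sketch. This is not the paper's exact construction (it uses hashing-based column distributions), just a generic sparse sign embedding with `s` nonzeros per column; the names `sparse_jl`, `d`, `m`, and `s` are illustrative, not from the paper:

```python
import numpy as np

def sparse_jl(d, m, s, rng):
    """m x d embedding matrix: each column has exactly s nonzero
    entries, each an independent +-1/sqrt(s) sign, so every column
    has unit Euclidean norm."""
    A = np.zeros((m, d))
    for j in range(d):
        rows = rng.choice(m, size=s, replace=False)  # s random rows per column
        A[rows, j] = rng.choice([-1.0, 1.0], size=s) / np.sqrt(s)
    return A

rng = np.random.default_rng(0)
d, m, s = 1000, 200, 8   # ambient dim, target dim, nonzeros per column
A = sparse_jl(d, m, s, rng)
x = rng.standard_normal(d)
ratio = np.linalg.norm(A @ x) / np.linalg.norm(x)  # should concentrate near 1
```

Multiplying `A @ x` costs O(s·d) rather than O(m·d), which is the point of sparsity: the embedding time scales with the number of nonzeros.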
Near-optimal cryptographic hardness of agnostically learning halfspaces and ReLU regression under Gaussian marginals
I Diakonikolas, D Kane, L Ren - International Conference on …, 2023 - proceedings.mlr.press
We study the task of agnostically learning halfspaces under the Gaussian distribution.
Specifically, given labeled examples $(\mathbf{x}, y)$ from an unknown distribution on …
Reliably learning the ReLU in polynomial time
We give the first dimension-efficient algorithms for learning Rectified Linear Units (ReLUs),
which are functions of the form $\mathbf{x} \mapsto \max(0, \mathbf{w} \cdot \mathbf{x})$ …
Near-optimal SQ lower bounds for agnostically learning halfspaces and ReLUs under Gaussian marginals
We study the fundamental problems of agnostically learning halfspaces and ReLUs under
Gaussian marginals. In the former problem, given labeled examples $(\mathbf{x}, y)$ from an …
The optimality of polynomial regression for agnostic learning under Gaussian marginals in the SQ model
We study the problem of agnostic learning under the Gaussian distribution in the Statistical
Query (SQ) model. We develop a method for finding hard families of examples for a wide …
Statistical-query lower bounds via functional gradients
We give the first statistical-query lower bounds for agnostically learning any non-polynomial
activation with respect to Gaussian marginals (e.g., ReLU, sigmoid, sign). For the specific …
Efficient testable learning of halfspaces with adversarial label noise
We give the first polynomial-time algorithm for the testable learning of halfspaces in the
presence of adversarial label noise under the Gaussian distribution. In the recently …
A moment-matching approach to testable learning and a new characterization of rademacher complexity
A remarkable recent paper by Rubinfeld and Vasilyan (2022) initiated the study of testable
learning, where the goal is to replace hard-to-verify distributional assumptions (such as …