Universal Rates for Regression: Separations between Cut-Off and Absolute Loss
In this work we initiate the study of regression in the universal rates framework of Bousquet
et al. Unlike the traditional uniform learning setting, we are interested in obtaining learning …
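As a point of reference (the abstract is truncated here, so the paper's exact definitions may differ), the absolute loss and a cut-off loss at scale $\gamma$ can be formalized as
$$ \ell_{\mathrm{abs}}(\hat{y}, y) = |\hat{y} - y|, \qquad \ell_{\gamma}(\hat{y}, y) = \mathbf{1}\{|\hat{y} - y| > \gamma\}, $$
and a "separation" refers to a hypothesis class whose best achievable universal rate differs between the two losses.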
Regularization and optimal multiclass learning
The quintessential learning algorithm of empirical risk minimization (ERM) is known to fail in
various settings for which uniform convergence does not characterize learning. Relatedly …
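For reference, the ERM rule mentioned above is the standard one: given a sample $S = \{(x_i, y_i)\}_{i=1}^{n}$, a hypothesis class $\mathcal{H}$, and a loss $\ell$, it outputs
$$ \hat{h}_{\mathrm{ERM}} \in \operatorname*{arg\,min}_{h \in \mathcal{H}} \; \frac{1}{n} \sum_{i=1}^{n} \ell(h(x_i), y_i). $$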
Transfer learning beyond bounded density ratios
We study the fundamental problem of transfer learning where a learning algorithm collects
data from some source distribution $ P $ but needs to perform well with respect to a different …
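The "bounded density ratio" condition in the title is the standard change-of-measure assumption: writing $Q$ for the target distribution, if $\sup_{z} \frac{dQ}{dP}(z) \le C$, then for any hypothesis $h$ and nonnegative loss $\ell$,
$$ \mathbb{E}_{Q}\big[\ell(h(X), Y)\big] \;\le\; C \, \mathbb{E}_{P}\big[\ell(h(X), Y)\big], $$
so guarantees under $P$ transfer to $Q$ at the cost of a factor $C$; the paper asks what is possible when no such bound holds.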
Online estimation via offline estimation: An information-theoretic framework
The classical theory of statistical estimation aims to estimate a parameter of interest
under data generated from a fixed design ("offline estimation"), while the contemporary …
Open Problem: Can Local Regularization Learn All Multiclass Problems?
Multiclass classification is the simple generalization of binary classification to arbitrary label
sets. Despite its simplicity, it has been remarkably resistant to study: a characterization of …
Sequential Probability Assignment with Contexts: Minimax Regret, Contextual Shtarkov Sums, and Contextual Normalized Maximum Likelihood
We study the fundamental problem of sequential probability assignment, also known as
online learning with logarithmic loss, with respect to an arbitrary, possibly nonparametric …
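For context, in the classical setting without contexts, with a class $\mathcal{H}$ of distributions over outcome sequences $y^n = (y_1, \dots, y_n)$ from a finite alphabet, the minimax log-loss regret equals the log Shtarkov sum and is attained by the normalized maximum likelihood (NML) distribution:
$$ \mathcal{R}_n^{\star} = \log \sum_{y^n} \sup_{h \in \mathcal{H}} p_h(y^n), \qquad p_{\mathrm{NML}}(y^n) = \frac{\sup_{h \in \mathcal{H}} p_h(y^n)}{\sum_{z^n} \sup_{h \in \mathcal{H}} p_h(z^n)}. $$
The contextual Shtarkov sums of the paper generalize this quantity to settings with side information; the display above is only the classical, context-free version.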
Learnability is a Compact Property
Recent work on learning has yielded a striking result: the learnability of various problems
can be undecidable, or independent of the standard ZFC axioms of set theory. Furthermore …
Is Transductive Learning Equivalent to PAC Learning?
Most work in the area of learning theory has focused on designing effective Probably
Approximately Correct (PAC) learners. Recently, other models of learning such as …
Multiclass Transductive Online Learning
We consider the problem of multiclass transductive online learning when the number of
labels can be unbounded. Previous works by Ben-David et al. [1997] and Hanneke et …
Sample Compression Scheme Reductions
We present novel reductions from sample compression schemes in multiclass classification,
regression, and adversarially robust learning settings to binary sample compression …
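For reference, a sample compression scheme of size $k$ in the classical binary, realizable sense of Littlestone and Warmuth consists of a compression map $\kappa$ that selects at most $k$ labeled examples from a sample $S$ and a reconstruction map $\rho$ such that
$$ \rho\big(\kappa(S)\big)(x_i) = y_i \quad \text{for all } (x_i, y_i) \in S, \qquad |\kappa(S)| \le k; $$
the multiclass, regression, and adversarially robust variants treated in the paper adapt this correctness requirement to their respective losses.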