Implicit regularization in matrix factorization
S Gunasekar, BE Woodworth… - Advances in neural …, 2017 - proceedings.neurips.cc
We study implicit regularization when optimizing an underdetermined quadratic objective
over a matrix $X$ with gradient descent on a factorization of $X$. We conjecture and provide …
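The phenomenon this snippet describes is easy to reproduce: run gradient descent on a full-dimensional factorization $X = UU^T$ of an underdetermined entry-fitting objective, starting from a small initialization. The following NumPy sketch is illustrative only; the dimensions, sampling rate, and step size are arbitrary choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 15, 2

# Rank-r ground-truth PSD matrix, scaled so its eigenvalues are O(1)
G = rng.standard_normal((n, r)) / np.sqrt(n)
X_star = G @ G.T

# Underdetermined objective: fit only a random symmetric subset of entries
mask = rng.random((n, n)) < 0.4
mask = mask | mask.T

# Gradient descent on the full (n x n) factorization X = U U^T, small init
U = 0.01 * rng.standard_normal((n, n))
lr = 0.05
for _ in range(5000):
    resid = np.where(mask, U @ U.T - X_star, 0.0)
    U -= lr * 2.0 * resid @ U  # gradient of ||P(U U^T - X_star)||_F^2, up to scale

X_hat = U @ U.T
# Although U is full-dimensional, the fitted X tends to have low effective rank,
# which is the implicit-regularization effect the paper studies.
svals = np.linalg.svd(X_hat, compute_uv=False)
```

The small initialization is the key ingredient: starting $U$ near zero biases gradient descent toward low-norm (and empirically low-rank) solutions among the many matrices that fit the observed entries.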
Matrix factorization techniques in machine learning, signal processing, and statistics
Compressed sensing is an alternative to Shannon/Nyquist sampling for acquiring sparse or
compressible signals. Sparse coding represents a signal as a sparse linear combination of …
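Both ideas in this snippet reduce to one template: recover a $k$-sparse coefficient vector from $m \ll n$ linear measurements. A standard greedy solver for this problem is orthogonal matching pursuit; the sketch below is illustrative (the Gaussian sensing matrix and all dimensions are arbitrary choices, not from the paper).

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column of A most
    correlated with the residual, then re-fit on the support by least squares."""
    n = A.shape[1]
    support = []
    resid = y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ resid)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        resid = y - A[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x

rng = np.random.default_rng(1)
m, n, k = 40, 100, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)  # random sensing matrix
x_true = np.zeros(n)
idx = rng.choice(n, k, replace=False)
x_true[idx] = rng.uniform(1.0, 2.0, k) * rng.choice([-1.0, 1.0], k)
y = A @ x_true  # m = 40 noiseless measurements of a 3-sparse signal

x_hat = omp(A, y, k)  # typically recovers x_true exactly at these sizes
```

For well-conditioned random measurement matrices and sufficiently sparse signals, greedy recovery of this kind typically succeeds with far fewer measurements than Shannon/Nyquist sampling would require.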
Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm
We improve a recent guarantee of Bach and Moulines on the linear convergence of SGD for
smooth and strongly convex objectives, reducing a quadratic dependence on the strong …
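The randomized Kaczmarz method in the title is simple to state: at each step, sample one row of $A$ with probability proportional to its squared norm (the Strohmer–Vershynin weighting) and project the current iterate onto that row's hyperplane. An illustrative NumPy sketch, with arbitrary problem sizes and iteration count:

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=5000, seed=0):
    """Randomized Kaczmarz: sample row i with probability ||a_i||^2 / ||A||_F^2,
    then project the iterate onto the hyperplane {x : a_i . x = b_i}."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.einsum('ij,ij->i', A, A)  # squared row norms
    probs = row_norms / row_norms.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

rng = np.random.default_rng(2)
m, n = 200, 20
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = A @ x_true  # consistent overdetermined system

x_hat = randomized_kaczmarz(A, b)
```

For a consistent system, the expected squared error contracts by a factor of roughly $1 - \sigma_{\min}^2(A)/\|A\|_F^2$ per step, which is the linear convergence the snippet refers to; each Kaczmarz projection is exactly an SGD step on a weighted least-squares objective.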
Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
PL Loh, MJ Wainwright - Advances in Neural Information …, 2013 - proceedings.neurips.cc
We establish theoretical results concerning all local optima of various regularized M-
estimators, where both loss and penalty functions are allowed to be nonconvex. Our results …
partykit: A modular toolkit for recursive partytioning in R
The R package partykit provides a flexible toolkit for learning, representing, summarizing,
and visualizing a wide range of tree-structured regression and classification models. The …
Inference and uncertainty quantification for noisy matrix completion
Noisy matrix completion aims at estimating a low-rank matrix given only partial and
corrupted entries. Despite remarkable progress in designing efficient estimation algorithms …
Local low-rank matrix approximation
Matrix approximation is a common tool in recommendation systems, text mining, and
computer vision. A prevalent assumption in constructing matrix approximations is that the …
Convergence analysis for rectangular matrix completion using Burer-Monteiro factorization and gradient descent
Q Zheng, J Lafferty - arXiv preprint arXiv:1605.07051, 2016 - arxiv.org
We address the rectangular matrix completion problem by lifting the unknown matrix to a
positive semidefinite matrix in higher dimension, and optimizing a nonconvex objective over …
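The paper analyzes a lifting of the rectangular unknown to a larger positive semidefinite matrix; the sketch below instead runs plain gradient descent on the unlifted Burer–Monteiro factorization $UV^T$ of the observed-entry loss, which conveys the same idea of optimizing a nonconvex factored objective. All sizes, the sampling rate, and the step size are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(3)
n1, n2, r = 30, 25, 2

# Rank-r rectangular ground truth, scaled so singular values are O(1)
M = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2)) / np.sqrt(n1 * r)
mask = rng.random((n1, n2)) < 0.5  # observe about half the entries

# Gradient descent on the factored objective ||P(U V^T - M)||_F^2, small init
U = 0.1 * rng.standard_normal((n1, r))
V = 0.1 * rng.standard_normal((n2, r))
lr = 0.05
for _ in range(5000):
    R = np.where(mask, U @ V.T - M, 0.0)  # residual on observed entries only
    U, V = U - lr * R @ V, V - lr * R.T @ U  # simultaneous factor updates

M_hat = U @ V.T  # at these sizes, the fit typically recovers M on all entries
```

Despite the nonconvexity, gradient descent from a small random initialization reliably drives the observed-entry residual to zero here, matching the kind of convergence behavior the paper establishes for the lifted formulation.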
Noisy low-rank matrix completion with general sampling distribution
O Klopp - 2014 - projecteuclid.org
In the present paper, we consider the problem of matrix completion with noise. Unlike
previous works, we consider quite general sampling distribution and we do not need to …
Optimistic rates: A unifying theory for interpolation learning and regularization in linear regression
We study a localized notion of uniform convergence known as an “optimistic rate” for linear
regression with Gaussian data. Our refined analysis avoids the hidden constant and …