Introduction to online convex optimization
E Hazan - Foundations and Trends® in Optimization, 2016 - nowpublishers.com
This monograph portrays optimization as a process. In many practical applications the
environment is so complex that it is infeasible to lay out a comprehensive theoretical model …
Low-rank compression of neural nets: Learning the rank of each layer
Y Idelbayev… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com
Neural net compression can be achieved by approximating each layer's weight matrix by a
low-rank matrix. The real difficulty in doing this is not in training the resulting neural net …
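As a hedged illustration of the compression idea named in this snippet (not the paper's rank-learning method): replacing a layer's weight matrix by the factors of its truncated SVD, which by the Eckart–Young theorem is the best rank-r approximation in Frobenius norm. Function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def low_rank_factors(W, r):
    # Best rank-r approximation of W in Frobenius norm (Eckart-Young):
    # keep the top r singular triplets and return factors U_r, V_r
    # with W ~= U_r @ V_r.
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U[:, :r] * s[:r], Vt[:r]

# A rank-8 factorization of a 64 x 32 "weight matrix" stores
# 8 * (64 + 32) = 768 numbers instead of 64 * 32 = 2048.
W = np.random.default_rng(1).standard_normal((64, 32))
U_r, V_r = low_rank_factors(W, 8)
```

At inference time the dense layer `x @ W` becomes two thinner multiplications `(x @ U_r) @ V_r`, which is where the speed and memory savings come from.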
A statistical perspective on algorithmic leveraging
One popular method for dealing with large-scale data sets is sampling. Using the empirical
statistical leverage scores as an importance sampling distribution, the method of algorithmic …
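A minimal sketch of the idea named in this snippet — importance sampling of rows by statistical leverage scores for least squares. The function names and the rescaling convention are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def leverage_scores(X):
    # The leverage score of row i is the squared Euclidean norm of the
    # i-th row of U, where X = U S V^T is the thin SVD of X.
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return np.sum(U**2, axis=1)

def leverage_sample(X, y, s, rng):
    # Draw s rows with probability proportional to their leverage scores,
    # rescaling by 1/sqrt(s * p_i) so the subsampled problem is unbiased.
    p = leverage_scores(X)
    p /= p.sum()
    idx = rng.choice(len(p), size=s, replace=True, p=p)
    w = 1.0 / np.sqrt(s * p[idx])
    return X[idx] * w[:, None], y[idx] * w

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 5))
beta = np.arange(1.0, 6.0)
y = X @ beta                      # noiseless, so the subproblem is consistent
Xs, ys = leverage_sample(X, y, 200, rng)
beta_hat, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
```

Solving the 200-row subproblem in place of the 1000-row one is the computational payoff; the paper's statistical perspective concerns the bias and variance this sampling scheme induces.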
Matrix completion for multi-label image classification
Recently, image categorization has been an active research topic due to the urgent need to
retrieve and browse digital images via semantic keywords. This paper formulates image …
Structured sparsity through convex optimization
Sparse estimation methods are aimed at using or obtaining parsimonious representations of
data or models. While naturally cast as a combinatorial optimization problem, variable or …
Learning coupled feature spaces for cross-modal matching
Cross-modal matching has recently drawn much attention due to the widespread existence
of multimodal data. It aims to match data from different modalities, and generally involves …
Faster rates for the Frank-Wolfe method over strongly-convex sets
The Frank-Wolfe method (aka conditional gradient algorithm) for smooth
optimization has regained much interest in recent years in the context of large scale …
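A minimal sketch of the plain Frank-Wolfe method over an ℓ1 ball, a set with a cheap linear minimization oracle. This shows the basic projection-free scheme only, not the faster rates over strongly-convex sets that the paper establishes; the problem instance is an assumption for illustration.

```python
import numpy as np

def frank_wolfe_l1(A, b, tau, iters=500):
    # Minimize 0.5 * ||A x - b||^2 over the l1 ball {x : ||x||_1 <= tau}.
    # The linear minimization oracle over the l1 ball returns a signed,
    # scaled coordinate vector, so each iterate stays sparse.
    n = A.shape[1]
    x = np.zeros(n)
    for t in range(iters):
        grad = A.T @ (A @ x - b)
        i = np.argmax(np.abs(grad))           # LMO: best vertex of the ball
        s_t = np.zeros(n)
        s_t[i] = -tau * np.sign(grad[i])
        gamma = 2.0 / (t + 2.0)               # standard step-size schedule
        x = (1.0 - gamma) * x + gamma * s_t   # convex combination: stays feasible
    return x
```

Because every update is a convex combination of feasible points, no projection step is ever needed — the main appeal of conditional gradient methods at large scale.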
Global optimality in low-rank matrix optimization
This paper considers the minimization of a general objective function f (X) over the set of
rectangular n × m matrices that have rank at most r. To reduce the computational burden, we …
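A common way to reduce the burden of the rank constraint named in this snippet is to optimize over factors X = U Vᵀ directly, so rank(X) ≤ r holds by construction. A minimal sketch under that assumption (not the paper's specific algorithm), shown on the simple objective f(X) = ½‖X − M‖²_F:

```python
import numpy as np

def factored_gd(grad_f, n, m, r, lr=0.02, iters=5000, rng=None):
    # Parameterize X = U @ V.T with U (n x r) and V (m x r), which enforces
    # rank(X) <= r by construction, and run gradient descent on the factors.
    if rng is None:
        rng = np.random.default_rng(0)
    U = 0.1 * rng.standard_normal((n, r))
    V = 0.1 * rng.standard_normal((m, r))
    for _ in range(iters):
        G = grad_f(U @ V.T)                     # gradient of f at X = U V^T
        U, V = U - lr * (G @ V), V - lr * (G.T @ U)
    return U @ V.T

# Example: recover a rank-2 matrix M by minimizing 0.5 * ||X - M||_F^2,
# whose gradient at X is simply X - M.
rng = np.random.default_rng(0)
M = rng.standard_normal((8, 2)) @ rng.standard_normal((2, 6))
X = factored_gd(lambda X: X - M, 8, 6, 2, rng=rng)
```

The factored problem is nonconvex even when f is convex; results like this paper's explain when its local structure nonetheless permits global optimality.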
Conditional gradient algorithms for norm-regularized smooth convex optimization
Motivated by some applications in signal processing and machine learning, we consider two
convex optimization problems where, given a cone K, a norm ‖·‖ and a smooth …
A new convex relaxation for tensor completion
B Romera-Paredes, M Pontil - Advances in neural …, 2013 - proceedings.neurips.cc
We study the problem of learning a tensor from a set of linear measurements. A prominent
methodology for this problem is based on the extension of trace norm regularization, which …