Introduction to online convex optimization

E Hazan - Foundations and Trends® in Optimization, 2016 - nowpublishers.com
This monograph portrays optimization as a process. In many practical applications the
environment is so complex that it is infeasible to lay out a comprehensive theoretical model …

Low-rank compression of neural nets: Learning the rank of each layer

Y Idelbayev… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com
Neural net compression can be achieved by approximating each layer's weight matrix by a
low-rank matrix. The real difficulty in doing this is not in training the resulting neural net …
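
The building block this snippet refers to, approximating a weight matrix by a low-rank matrix, can be sketched with a truncated SVD; the names and shapes below are illustrative, and the paper's per-layer rank learning is not reproduced.

```python
# Illustrative sketch only: replace a dense layer's weight matrix W with a
# rank-r factorization U @ V obtained from a truncated SVD. The paper's
# contribution (learning each layer's rank jointly with the weights) is not
# reproduced here.
import numpy as np

def low_rank_factors(W: np.ndarray, r: int):
    """Return U (m x r) and V (r x n) such that U @ V approximates W."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U[:, :r] * s[:r], Vt[:r, :]

# A 512 x 256 layer compressed to rank 32 stores (512 + 256) * 32 = 24,576
# parameters instead of 131,072.
W = np.random.randn(512, 256)
U, V = low_rank_factors(W, r=32)
print(np.linalg.norm(W - U @ V) / np.linalg.norm(W))  # relative approximation error
```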

A statistical perspective on algorithmic leveraging

P Ma, M Mahoney, B Yu - International conference on …, 2014 - proceedings.mlr.press
One popular method for dealing with large-scale data sets is sampling. Using the empirical
statistical leverage scores as an importance sampling distribution, the method of algorithmic …
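
A minimal sketch of the sampling scheme the snippet names, assuming the standard definition of row leverage scores as squared row norms of an orthonormal basis for the column space of X; function names and problem sizes are illustrative.

```python
# Hedged sketch of leverage-score sampling for least squares: sample rows
# proportionally to their leverage scores, reweight, and solve the small problem.
import numpy as np

def leverage_score_lstsq(X, y, c, rng):
    """Solve least squares on c rows sampled proportionally to leverage scores."""
    Q, _ = np.linalg.qr(X)                 # thin QR: orthonormal basis for col(X)
    scores = np.sum(Q ** 2, axis=1)        # leverage score of each row
    probs = scores / scores.sum()
    idx = rng.choice(X.shape[0], size=c, replace=True, p=probs)
    w = 1.0 / np.sqrt(c * probs[idx])      # reweighting keeps the estimator unbiased
    return np.linalg.lstsq(X[idx] * w[:, None], y[idx] * w, rcond=None)[0]

rng = np.random.default_rng(0)
X = rng.standard_normal((10_000, 20))
y = X @ rng.standard_normal(20) + 0.1 * rng.standard_normal(10_000)
beta = leverage_score_lstsq(X, y, c=500, rng=rng)
```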

Matrix completion for multi-label image classification

R Cabral, F Torre, JP Costeira… - Advances in neural …, 2011 - proceedings.neurips.cc
Recently, image categorization has been an active research topic due to the urgent need to
retrieve and browse digital images via semantic keywords. This paper formulates image …

Structured sparsity through convex optimization

F Bach, R Jenatton, J Mairal, G Obozinski - Statistical Science, 2012 - projecteuclid.org
Sparse estimation methods are aimed at using or obtaining parsimonious representations of
data or models. While naturally cast as a combinatorial optimization problem, variable or …
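
One standard instance of a structured-sparsity-inducing convex penalty is the group lasso; as a hedged illustration (a single example, not the paper's general framework), its proximal operator is block soft-thresholding.

```python
# Hedged illustration: proximal operator of the group-lasso penalty
# lam * sum_g ||w_g||_2, a common convex surrogate for group-level sparsity.
import numpy as np

def prox_group_lasso(w, groups, lam):
    """Block soft-thresholding, applied group by group."""
    out = w.copy()
    for g in groups:                        # g is an index array for one group
        norm = np.linalg.norm(w[g])
        out[g] = 0.0 if norm <= lam else (1.0 - lam / norm) * w[g]
    return out

w = np.array([0.1, -0.2, 2.0, 1.5, 0.05])
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4])]
print(prox_group_lasso(w, groups, lam=0.5))  # the small groups are zeroed out
```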

Learning coupled feature spaces for cross-modal matching

K Wang, R He, W Wang, L Wang… - Proceedings of the …, 2013 - openaccess.thecvf.com
Cross-modal matching has recently drawn much attention due to the widespread existence
of multimodal data. It aims to match data from different modalities, and generally involves …

Faster rates for the Frank-Wolfe method over strongly-convex sets

D Garber, E Hazan - International Conference on Machine …, 2015 - proceedings.mlr.press
The Frank-Wolfe method (aka conditional gradient algorithm) for smooth
optimization has regained much interest in recent years in the context of large scale …
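
A minimal sketch of the basic Frank-Wolfe iteration over an l2 ball (one example of a strongly convex set), using the classical 2/(t+2) step size; the paper's faster rates rely on a different step-size choice and analysis, and the names below are illustrative.

```python
# Hedged sketch of the Frank-Wolfe (conditional gradient) iteration for
# minimizing a smooth f over an l2 ball of radius R, a strongly convex set.
import numpy as np

def frank_wolfe_l2_ball(grad, x0, R, iters=200):
    x = x0.copy()
    for t in range(iters):
        g = grad(x)
        s = -R * g / np.linalg.norm(g)      # linear minimization oracle over the ball
        gamma = 2.0 / (t + 2.0)
        x = (1.0 - gamma) * x + gamma * s   # convex combination keeps x feasible
    return x

# Example: least squares restricted to the unit l2 ball.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 10)), rng.standard_normal(50)
x_hat = frank_wolfe_l2_ball(lambda x: A.T @ (A @ x - b), np.zeros(10), R=1.0)
```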

Global optimality in low-rank matrix optimization

Z Zhu, Q Li, G Tang, MB Wakin - IEEE Transactions on Signal …, 2018 - ieeexplore.ieee.org
This paper considers the minimization of a general objective function f(X) over the set of
rectangular n × m matrices that have rank at most r. To reduce the computational burden, we …
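
The computational device the truncated sentence appears to allude to is the usual factored parameterization X = U V^T with gradient descent on the factors; the sketch below assumes that reading and a simple quadratic objective, and does not reproduce the paper's global-optimality analysis.

```python
# Hedged sketch of the factored approach: optimize over U (n x r), V (m x r)
# instead of the rank-constrained matrix X, shown for f(X) = 0.5*||X - M||_F^2.
import numpy as np

def factored_gd(M, r, steps=2000, lr=0.01, rng=np.random.default_rng(0)):
    n, m = M.shape
    U = 0.5 * rng.standard_normal((n, r))
    V = 0.5 * rng.standard_normal((m, r))
    for _ in range(steps):
        G = U @ V.T - M                     # gradient of f at X = U V^T
        U, V = U - lr * (G @ V), V - lr * (G.T @ U)
    return U, V

M = np.random.randn(40, 5) @ np.random.randn(5, 30)     # a rank-5 target
U, V = factored_gd(M, r=5)
print(np.linalg.norm(U @ V.T - M) / np.linalg.norm(M))  # relative error
```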

Conditional gradient algorithms for norm-regularized smooth convex optimization

Z Harchaoui, A Juditsky, A Nemirovski - Mathematical Programming, 2015 - Springer
Motivated by some applications in signal processing and machine learning, we consider two
convex optimization problems where, given a cone K, a norm ‖·‖ and a smooth …
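
The workhorse of conditional-gradient methods in this norm-regularized setting is a linear minimization oracle over a norm ball; as a hedged example for one concrete norm (not the paper's two general problems), here is the oracle for the nuclear-norm ball.

```python
# Hedged example: linear minimization oracle (LMO) over the nuclear (trace)
# norm ball of radius tau. The minimizer of <G, S> is a rank-one matrix built
# from the top singular pair of G.
import numpy as np

def lmo_nuclear_ball(G, tau):
    """argmin of <G, S> over {S : ||S||_* <= tau}."""
    # Full SVD for clarity; in practice only the leading singular pair is
    # needed (e.g. a few power iterations suffice).
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    return -tau * np.outer(U[:, 0], Vt[0])

G = np.random.randn(100, 80)
S = lmo_nuclear_ball(G, tau=10.0)    # rank-one step direction
```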

A new convex relaxation for tensor completion

B Romera-Paredes, M Pontil - Advances in neural …, 2013 - proceedings.neurips.cc
We study the problem of learning a tensor from a set of linear measurements. A prominent
methodology for this problem is based on the extension of trace norm regularization, which …
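
The trace-norm extension the snippet mentions is commonly taken to be the sum of nuclear norms of the tensor's mode unfoldings; the sketch below computes only that baseline quantity, under that assumption, not the paper's new relaxation.

```python
# Hedged sketch of the baseline: extend the matrix trace (nuclear) norm to
# tensors by summing the nuclear norms of the mode unfoldings
# (the "overlapped" trace norm).
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: matricize tensor T along the given mode."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def overlapped_trace_norm(T):
    """Sum of nuclear norms of all mode unfoldings of T."""
    return sum(np.linalg.norm(unfold(T, m), ord='nuc') for m in range(T.ndim))

T = np.random.randn(5, 6, 7)
print(overlapped_trace_norm(T))
```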