A Survey of L1 Regression

D Vidaurre, C Bielza… - International Statistical …, 2013 - Wiley Online Library
L1 regularization, or regularization with an L1 penalty, is a popular idea in statistics and
machine learning. This paper reviews the concept and application of L1 regularization for …
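As a concrete illustration of the idea the survey covers (a sketch of my own, not code from the paper), the ℓ1-penalized least-squares problem can be solved by proximal gradient descent (ISTA); the soft-thresholding step is what produces exactly zero coefficients:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t*||.||_1: shrinks entries toward zero and
    # sets anything with magnitude below t exactly to zero.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    # Minimize 0.5*||y - Xw||^2 + lam*||w||_1 by proximal gradient (ISTA).
    L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the smooth part
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        w = soft_threshold(w - X.T @ (X @ w - y) / L, lam / L)
    return w

# Synthetic example: 3 relevant features out of 10.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -3.0, 1.5]
y = X @ w_true + 0.01 * rng.standard_normal(50)
w_hat = lasso_ista(X, y, lam=1.0)
```

The irrelevant features come out exactly zero rather than merely small, which is the variable-selection behavior that distinguishes the ℓ1 penalty from ridge regression.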

Optimization with sparsity-inducing penalties

F Bach, R Jenatton, J Mairal… - … and Trends® in …, 2012 - nowpublishers.com
Sparse estimation methods are aimed at using or obtaining parsimonious representations of
data or models. They were first dedicated to linear variable selection but numerous …
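One workhorse beyond plain variable selection is the group-lasso penalty; a minimal sketch (function and variable names are mine) of its proximal operator, which zeroes out entire groups of variables at once:

```python
import numpy as np

def prox_group_l2(w, groups, t):
    # Proximal operator of t * sum_g ||w_g||_2 over non-overlapping groups:
    # block soft-thresholding, which keeps or kills each group as a whole.
    out = w.astype(float).copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        out[g] = 0.0 if norm <= t else (1 - t / norm) * w[g]
    return out

w = np.array([3.0, 4.0, 0.1, -0.1])
# First group survives (norm 5 > 1) and is shrunk; second is zeroed (norm ~0.14).
shrunk = prox_group_l2(w, [[0, 1], [2, 3]], t=1.0)
```

Because the prox is cheap and exact, penalties of this form plug directly into proximal gradient methods such as ISTA/FISTA.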

Classification and clustering via dictionary learning with structured incoherence and shared features

I Ramirez, P Sprechmann… - 2010 IEEE Computer …, 2010 - ieeexplore.ieee.org
A clustering framework within the sparse modeling and dictionary learning setting is
introduced in this work. Instead of searching for the set of centroids that best fits the data, as in …


Learning with submodular functions: A convex optimization perspective

F Bach - Foundations and Trends® in machine learning, 2013 - nowpublishers.com
Submodular functions are relevant to machine learning for at least two reasons: (1) some
problems may be expressed directly as the optimization of submodular functions and (2) the …
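The defining property of a submodular function is diminishing returns: adding an element to a larger set never helps more than adding it to a smaller one. A small self-contained check (the coverage function and names are my illustration, not from the monograph):

```python
from itertools import combinations

# A coverage function f(S) = |union of the sets indexed by S| is a classic
# submodular function.
sets = {0: {1, 2}, 1: {2, 3}, 2: {3, 4, 5}}

def f(S):
    covered = set()
    for i in S:
        covered |= sets[i]
    return len(covered)

def is_submodular(ground):
    # Exhaustively verify f(S + e) - f(S) >= f(T + e) - f(T) for all S <= T,
    # e outside T (feasible only for tiny ground sets like this one).
    subsets = [set(c) for r in range(len(ground) + 1)
               for c in combinations(ground, r)]
    for S in subsets:
        for T in subsets:
            if S <= T:
                for e in ground - T:
                    if f(S | {e}) - f(S) < f(T | {e}) - f(T):
                        return False
    return True

print(is_submodular({0, 1, 2}))  # → True
```

This diminishing-returns structure is exactly what convex analysis exploits, e.g. via the Lovász extension discussed in the text.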

Structured sparsity through convex optimization

F Bach, R Jenatton, J Mairal, G Obozinski - Statistical Science, 2012 - projecteuclid.org
Sparse estimation methods are aimed at using or obtaining parsimonious representations of
data or models. While naturally cast as a combinatorial optimization problem, variable or …

Proximal methods for hierarchical sparse coding

R Jenatton, J Mairal, G Obozinski, F Bach - The Journal of Machine …, 2011 - jmlr.org
Sparse coding consists of representing signals as sparse linear combinations of atoms
selected from a dictionary. We consider an extension of this framework where the atoms are …
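A key computational result exploited in this line of work is that, for tree-structured ℓ2 groups, the prox of the full hierarchical penalty can be computed by applying block soft-thresholding group by group, children before parents. A sketch under that assumption (names mine):

```python
import numpy as np

def group_soft_threshold(w, g, t):
    # In-place block soft-thresholding of the subvector w[g].
    norm = np.linalg.norm(w[g])
    w[g] = 0.0 if norm <= t else (1 - t / norm) * w[g]

def tree_prox(w, ordered_groups, weights):
    # For a tree-structured hierarchy of groups processed leaves-to-root,
    # composing the per-group proximal operators yields the prox of
    # sum_g weights[g] * ||w_g||_2 exactly.
    w = w.astype(float).copy()
    for g, t in zip(ordered_groups, weights):
        group_soft_threshold(w, g, t)
    return w

# Tiny tree: leaf groups {1} and {2} nested inside the root group {0,1,2}.
w = np.array([1.0, 2.0, 0.3])
out = tree_prox(w, [[1], [2], [0, 1, 2]], [0.5, 0.5, 1.0])
```

The ordering matters: a variable zeroed by a leaf group stays zero under every ancestor group, which is what makes the selected support itself hierarchical.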

Convex optimization with sparsity-inducing norms

F Bach, R Jenatton, J Mairal, G Obozinski - 2011 - direct.mit.edu
The principle of parsimony is central to many areas of science: the simplest explanation of a
given phenomenon should be preferred over more complicated ones. In the context of …

Gap safe screening rules for sparsity enforcing penalties

E Ndiaye, O Fercoq, J Salmon - Journal of Machine Learning Research, 2017 - jmlr.org
In high dimensional regression settings, sparsity enforcing penalties have proved useful to
regularize the data-fitting term. A recently introduced technique called screening rules …

Network flow algorithms for structured sparsity

J Mairal, R Jenatton, F Bach… - Advances in Neural …, 2010 - proceedings.neurips.cc
We consider a class of learning problems that involve a structured sparsity-inducing norm
defined as the sum of $\ell_\infty$-norms over groups of variables. Whereas a lot of effort …
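For a single group, the prox of the $\ell_\infty$-norm can be obtained by Moreau decomposition from Euclidean projection onto the ℓ1-ball (a standard identity; this sketch and its names are mine, not the paper's network-flow algorithm):

```python
import numpy as np

def project_l1_ball(v, z):
    # Euclidean projection of v onto the l1-ball of radius z
    # (sort-and-threshold construction).
    if np.abs(v).sum() <= z:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u - (css - z) / (np.arange(len(u)) + 1) > 0)[0][-1]
    tau = (css[rho] - z) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_linf(w, t):
    # Moreau decomposition: prox of t*||.||_inf equals w minus the
    # projection of w onto the l1-ball of radius t.
    return w - project_l1_ball(w, t)

out = prox_linf(np.array([3.0, -1.0, 0.5]), 1.0)
```

Note how the prox clips only the largest-magnitude entries, the dual behavior of soft-thresholding; handling sums of such norms over overlapping groups is where the paper's flow formulation comes in.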

C-HiLasso: A collaborative hierarchical sparse modeling framework

P Sprechmann, I Ramirez, G Sapiro… - IEEE Transactions on …, 2011 - ieeexplore.ieee.org
Sparse modeling is a powerful framework for data analysis and processing. Traditionally,
encoding in this framework is performed by solving an $\ell_1$-regularized linear regression …
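The hierarchical "sparse within sparse groups" pattern this line of work builds on combines an ℓ1 term with a group term, and its prox composes cleanly: elementwise soft-thresholding followed by block soft-thresholding (a sketch of that combined penalty, with names of my choosing):

```python
import numpy as np

def soft_threshold(w, t):
    # Elementwise prox of t*||.||_1.
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def hier_prox(w, groups, lam1, lam2):
    # Prox of lam1*||w||_1 + lam2*sum_g ||w_g||_2 over non-overlapping
    # groups: soft-threshold each entry, then block-threshold each group,
    # giving sparsity both across and within groups.
    v = soft_threshold(np.asarray(w, dtype=float), lam1)
    out = v.copy()
    for g in groups:
        norm = np.linalg.norm(v[g])
        out[g] = 0.0 if norm <= lam2 else (1 - lam2 / norm) * v[g]
    return out

w = np.array([4.0, -0.5, 0.2, 0.1])
# First group survives but is sparsified within; second group is zeroed.
out = hier_prox(w, [[0, 1], [2, 3]], lam1=0.25, lam2=0.5)
```

The result is the two-level sparsity pattern the abstract alludes to: a few active groups, and only a few active coefficients inside each.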