Sparse regularization via convex analysis

I Selesnick - IEEE Transactions on Signal Processing, 2017 - ieeexplore.ieee.org
Sparse approximate solutions to linear equations are classically obtained via L1 norm
regularized least squares, but this method often underestimates the true solution. As an …
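
For context (a standard formulation, not quoted from the paper), the "L1 norm regularized least squares" the snippet refers to is the lasso-type problem

    \min_{x \in \mathbb{R}^n} \ \tfrac{1}{2}\,\| y - A x \|_2^2 + \lambda \| x \|_1 ,

and the underestimation it mentions is the shrinkage bias the L1 penalty imposes on large coefficients; the paper's angle is to design penalties that reduce this bias while keeping the overall objective convex.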

Stein Unbiased GrAdient estimator of the Risk (SUGAR) for multiple parameter selection

CA Deledalle, S Vaiter, J Fadili, G Peyré - SIAM Journal on Imaging Sciences, 2014 - SIAM
Algorithms for solving variational regularization of ill-posed inverse problems usually involve
operators that depend on a collection of continuous parameters. When the operators enjoy …
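
As background (the classical denoising-case formula, assumed here rather than taken from the paper), the Stein unbiased risk estimate of an estimator x_\lambda(y) under y = x_0 + w, w ~ N(0, \sigma^2 I), is

    \mathrm{SURE}(\lambda) = \| y - x_\lambda(y) \|_2^2 - n \sigma^2 + 2 \sigma^2 \, \mathrm{div}_y\, x_\lambda(y) ,

and SUGAR is, roughly, an estimator of its gradient with respect to the parameter vector \lambda, enabling gradient-based selection of several parameters at once.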

Activity identification and local linear convergence of forward–backward-type methods

J Liang, J Fadili, G Peyré - SIAM Journal on Optimization, 2017 - SIAM
In this paper, we consider a class of Forward–Backward (FB) splitting methods that includes
several variants (e.g., inertial schemes, FISTA) for minimizing the sum of two proper convex …
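
A minimal sketch of the plain forward–backward iteration the abstract refers to, instantiated on the lasso (the lasso instance and all names below are illustrative, not taken from the paper):

    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def forward_backward(A, y, lam, n_iter=500):
        # Minimize 0.5 * ||A x - y||^2 + lam * ||x||_1 by forward-backward splitting:
        # an explicit gradient step on the smooth term (forward), followed by the
        # proximal step on the nonsmooth term (backward).
        x = np.zeros(A.shape[1])
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, with L the Lipschitz constant of the gradient
        for _ in range(n_iter):
            grad = A.T @ (A @ x - y)
            x = soft_threshold(x - step * grad, step * lam)
        return x

After finitely many iterations the support of x typically stops changing; the "activity identification" studied in the paper refers to this finite identification, after which the local linear convergence analysis applies.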

Implicit differentiation of lasso-type models for hyperparameter optimization

Q Bertrand, Q Klopfenstein, M Blondel… - International …, 2020 - proceedings.mlr.press
Setting regularization parameters for Lasso-type estimators is notoriously difficult,
though crucial for obtaining the best accuracy. The most popular hyperparameter …
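
To illustrate the mechanism (a sketch under standard lasso assumptions, not code from the paper): once the support S of the lasso solution is fixed, the restricted solution solves a smooth linear system, so its derivative with respect to the regularization parameter has a closed form that can drive gradient-based hyperparameter search.

    import numpy as np

    def lasso_jacobian_wrt_lambda(X, beta):
        # Hypothetical helper: implicit differentiation of the lasso solution beta(lam).
        # On the active set S (assumed nonempty, with X_S^T X_S invertible), the
        # optimality conditions give beta_S = (X_S^T X_S)^{-1} (X_S^T y - lam * sign(beta_S)),
        # hence d beta_S / d lam = -(X_S^T X_S)^{-1} sign(beta_S), and zero off-support.
        S = np.flatnonzero(beta)
        XS = X[:, S]
        dbeta = np.zeros_like(beta)
        dbeta[S] = -np.linalg.solve(XS.T @ XS, np.sign(beta[S]))
        return dbeta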

Statistical properties of convex clustering

KM Tan, D Witten - Electronic journal of statistics, 2015 - ncbi.nlm.nih.gov
In this manuscript, we study the statistical properties of convex clustering. We establish that
convex clustering is closely related to single linkage hierarchical clustering and k-means …
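
For reference (the standard formulation, not quoted from this paper), convex clustering assigns one centroid u_i to each observation x_i and solves

    \min_{u_1, \dots, u_n} \ \tfrac{1}{2} \sum_{i=1}^{n} \| x_i - u_i \|_2^2 + \lambda \sum_{i < j} w_{ij} \, \| u_i - u_j \|_q ,

with observations clustered together whenever their centroids coincide; increasing \lambda fuses centroids and yields a clustering path, which is the structure the manuscript relates to single linkage and k-means.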

Learning the sampling pattern for MRI

F Sherry, M Benning, JC De los Reyes… - … on Medical Imaging, 2020 - ieeexplore.ieee.org
The discovery of the theory of compressed sensing brought the realisation that many inverse
problems can be solved even when measurements are “incomplete”. This is particularly …
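
Roughly (an assumed formulation for orientation, not taken from the paper), learning the sampling pattern can be posed as a bilevel problem: the upper level picks sampling weights s to minimize reconstruction error over training images, while the lower level is the variational reconstruction itself,

    \min_{s \in [0,1]^m} \ \sum_{k} \| x_k^\star(s) - x_k^{\mathrm{true}} \|_2^2
    \quad \text{s.t.} \quad
    x_k^\star(s) \in \arg\min_{x} \ \tfrac{1}{2} \| \operatorname{diag}(s) (F x - y_k) \|_2^2 + \lambda R(x) ,

where F is the Fourier sampling operator and R a sparsity-promoting regularizer.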

Differentiating nonsmooth solutions to parametric monotone inclusion problems

J Bolte, E Pauwels, A Silveti-Falls - SIAM Journal on Optimization, 2024 - SIAM
We leverage path differentiability and a recent result on nonsmooth implicit differentiation
calculus to give sufficient conditions ensuring that the solution to a monotone inclusion …
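
For orientation (standard notation, assumed rather than quoted), a parametric monotone inclusion asks for x(\theta) such that

    0 \in A(x(\theta), \theta), \qquad A(\cdot, \theta) \ \text{maximally monotone},

and the question addressed is when \theta \mapsto x(\theta) is path differentiable, so that a generalized (conservative) Jacobian can be propagated through, for instance, the resolvent fixed-point characterization x = (\mathrm{Id} + \gamma A(\cdot, \theta))^{-1}(x).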

Square Root LASSO: Well-Posedness, Lipschitz Stability, and the Tuning Trade-Off

A Berk, S Brugiapaglia, T Hoheisel - SIAM Journal on Optimization, 2024 - SIAM
This paper studies well-posedness and parameter sensitivity of the square root LASSO (SR-
LASSO), an optimization model for recovering sparse solutions to linear inverse problems in …
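
For reference (the standard model, not quoted from the abstract), the square root LASSO replaces the squared data fit of the lasso with an unsquared one,

    \min_{x} \ \| A x - b \|_2 + \lambda \| x \|_1 ,

which allows the regularization parameter to be tuned independently of the noise level; the "tuning trade-off" of the title concerns how this robustness in tuning interacts with the well-posedness and Lipschitz stability of the solution map.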

Corrected generalized cross-validation for finite ensembles of penalized estimators

PC Bellec, JH Du, T Koriyama, P Patil… - Journal of the Royal …, 2024 - academic.oup.com
Generalized cross-validation (GCV) is a widely used method for estimating the squared out-
of-sample prediction risk that employs scalar degrees of freedom adjustment (in a …
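
As background (the classical single-estimator formula, not the ensemble correction introduced here), for a linear smoother \hat{y} = S y GCV reads

    \mathrm{GCV} = \frac{\tfrac{1}{n} \| y - \hat{y} \|_2^2}{\left( 1 - \operatorname{tr}(S)/n \right)^2} ,

where \operatorname{tr}(S) is the scalar degrees of freedom appearing in the abstract's "degrees of freedom adjustment".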

Local convergence properties of Douglas–Rachford and alternating direction method of multipliers

J Liang, J Fadili, G Peyré - Journal of Optimization Theory and Applications, 2017 - Springer
The Douglas–Rachford and alternating direction method of multipliers are two
proximal splitting algorithms designed to minimize the sum of two proper lower semi …
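
A minimal sketch of the Douglas–Rachford iteration for minimizing f(x) + g(x), written generically in terms of the two proximal operators (illustrative; the step size is folded into prox_f and prox_g, which stand for the proximal maps of \gamma f and \gamma g):

    def douglas_rachford(prox_f, prox_g, z0, n_iter=500):
        # Douglas-Rachford splitting for min f(x) + g(x):
        # evaluate prox_f, reflect, evaluate prox_g, then average back into z.
        z = z0
        for _ in range(n_iter):
            x = prox_f(z)
            y = prox_g(2 * x - z)
            z = z + y - x
        return prox_f(z)  # the x-iterates converge to a minimizer

The local linear rates analyzed in the paper kick in once the iterates identify the active (partly smooth) structure of f and g.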