Sparse regularization via convex analysis
I Selesnick - IEEE Transactions on Signal Processing, 2017 - ieeexplore.ieee.org
Sparse approximate solutions to linear equations are classically obtained via L1 norm
regularized least squares, but this method often underestimates the true solution. As an …
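The underestimation this entry refers to is the shrinkage bias of L1 regularization, visible in a minimal ISTA sketch (the function names and toy data below are illustrative, not from the paper):

```python
import numpy as np

def soft_threshold(x, t):
    """Soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_ista(A, b, lam, n_iter=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by iterative soft-thresholding."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz constant of the gradient
    for _ in range(n_iter):
        x = soft_threshold(x - step * A.T @ (A @ x - b), step * lam)
    return x

# Toy problem: with A = I the Lasso solution is just soft_threshold(b, lam),
# so the true amplitude 3.0 is shrunk to 2.0 and the small entry is zeroed.
A = np.eye(2)
b = np.array([3.0, 0.1])
x_hat = lasso_ista(A, b, lam=1.0)
```

The systematic gap between 3.0 and the recovered 2.0 is exactly the bias that the paper's non-convex-penalty-with-convex-cost construction is designed to reduce.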
Stein Unbiased GrAdient estimator of the Risk (SUGAR) for multiple parameter selection
Algorithms for solving variational regularization of ill-posed inverse problems usually involve
operators that depend on a collection of continuous parameters. When the operators enjoy …
Activity identification and local linear convergence of forward–backward-type methods
In this paper, we consider a class of Forward–Backward (FB) splitting methods that includes
several variants (e.g., inertial schemes, FISTA) for minimizing the sum of two proper convex …
Implicit differentiation of lasso-type models for hyperparameter optimization
Abstract Setting regularization parameters for Lasso-type estimators is notoriously difficult,
though crucial for obtaining the best accuracy. The most popular hyperparameter …
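For the quadratic-loss Lasso $\min_\beta \tfrac12\|y - X\beta\|_2^2 + \lambda\|\beta\|_1$, the implicit-differentiation hypergradient follows from the stationarity condition on the active set (a standard derivation, not quoted from the paper): with $\hat S = \operatorname{supp}(\hat\beta)$ and $s = \operatorname{sign}(\hat\beta_{\hat S})$, locally

```latex
\hat\beta_{\hat S}(\lambda) = (X_{\hat S}^\top X_{\hat S})^{-1}\bigl(X_{\hat S}^\top y - \lambda s\bigr),
\qquad
\frac{\partial \hat\beta_{\hat S}}{\partial \lambda} = -(X_{\hat S}^\top X_{\hat S})^{-1} s .
```

These expressions hold wherever the support $\hat S$ and sign vector $s$ are locally constant in $\lambda$, which is what makes the solution path piecewise smooth and the hypergradient well defined almost everywhere.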
Statistical properties of convex clustering
In this manuscript, we study the statistical properties of convex clustering. We establish that
convex clustering is closely related to single linkage hierarchical clustering and k-means …
Learning the sampling pattern for MRI
The discovery of the theory of compressed sensing brought the realisation that many inverse
problems can be solved even when measurements are “incomplete”. This is particularly …
Differentiating nonsmooth solutions to parametric monotone inclusion problems
We leverage path differentiability and a recent result on nonsmooth implicit differentiation
calculus to give sufficient conditions ensuring that the solution to a monotone inclusion …
Square Root LASSO: Well-Posedness, Lipschitz Stability, and the Tuning Trade-Off
This paper studies well-posedness and parameter sensitivity of the square root LASSO (SR-
LASSO), an optimization model for recovering sparse solutions to linear inverse problems in …
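The SR-LASSO differs from the ordinary Lasso only in leaving the residual norm unsquared (standard formulation, stated here for context):

```latex
\min_{x} \; \|Ax - b\|_2 + \lambda \|x\|_1 .
```

This seemingly small change is what makes a good choice of $\lambda$ independent of the noise level, and it is the source of the well-posedness and tuning trade-offs the paper analyzes.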
Corrected generalized cross-validation for finite ensembles of penalized estimators
Generalized cross-validation (GCV) is a widely used method for estimating the squared out-
of-sample prediction risk that employs scalar degrees of freedom adjustment (in a …
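For a single linear smoother $\hat y_\lambda = L_\lambda y$ with degrees of freedom $\operatorname{tr}(L_\lambda)$, the scalar-adjusted GCV criterion that this paper corrects for ensembles reads (textbook form, not the paper's ensemble correction):

```latex
\mathrm{GCV}(\lambda)
= \frac{\tfrac{1}{n}\,\|y - \hat y_\lambda\|_2^2}{\bigl(1 - \operatorname{tr}(L_\lambda)/n\bigr)^2} .
```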
Local convergence properties of Douglas–Rachford and alternating direction method of multipliers
Abstract The Douglas–Rachford and alternating direction method of multipliers are two
proximal splitting algorithms designed to minimize the sum of two proper lower semi …