DC programming and DCA: thirty years of developments

HA Le Thi, T Pham Dinh - Mathematical Programming, 2018 - Springer
The year 2015 marks the 30th birthday of DC (Difference of Convex functions) programming
and DCA (DC Algorithms) which constitute the backbone of nonconvex programming and …

Minimization of ℓ1−2 for Compressed Sensing

P Yin, Y Lou, Q He, J Xin - SIAM Journal on Scientific Computing, 2015 - SIAM
We study minimization of the difference of ℓ1 and ℓ2 norms as a nonconvex and
Lipschitz continuous metric for solving constrained and unconstrained compressed sensing …
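The ℓ1−ℓ2 metric in this entry is easy to state concretely: for nonzero x, ‖x‖₁ ≥ ‖x‖₂ with equality exactly when x has a single nonzero entry, so the difference is small on sparse vectors. A minimal sketch (function name is illustrative, not from the paper):

```python
import numpy as np

def l1_minus_l2(x):
    """Nonconvex sparsity metric ||x||_1 - ||x||_2."""
    return np.sum(np.abs(x)) - np.sqrt(np.sum(x ** 2))

# A 1-sparse vector attains the minimum value 0,
# while a flat (dense) vector is penalized.
sparse = np.array([0.0, 3.0, 0.0, 0.0])
dense = np.array([1.0, 1.0, 1.0, 1.0])
print(l1_minus_l2(sparse))  # 0.0  (||x||_1 = ||x||_2 = 3)
print(l1_minus_l2(dense))   # 2.0  (4 - sqrt(4))
```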

Open issues and recent advances in DC programming and DCA

HA Le Thi, T Pham Dinh - Journal of Global Optimization, 2024 - Springer
DC (difference of convex functions) programming and DC algorithm (DCA) are powerful
tools for nonsmooth nonconvex optimization. This field was created in 1985 by Pham Dinh …

Fast L1–L2 minimization via a proximal operator

Y Lou, M Yan - Journal of Scientific Computing, 2018 - Springer
This paper aims to develop new and fast algorithms for recovering a sparse vector from a
small number of measurements, which is a fundamental problem in the field of compressive …
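The fast algorithm in this entry hinges on a closed-form proximal operator for the ℓ1−ℓ2 penalty. As a simpler reference point (not the paper's operator), the classical proximal operator of λ‖·‖₁ is componentwise soft-thresholding, which the ℓ1−ℓ2 prox modifies:

```python
import numpy as np

def prox_l1(v, lam):
    """Proximal operator of lam * ||.||_1, i.e.
    argmin_x 0.5 * ||x - v||^2 + lam * ||x||_1,
    computed by componentwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

v = np.array([2.0, -0.5, 0.1, -3.0])
print(prox_l1(v, 1.0))  # entries shrink toward 0 by 1, small ones vanish
```

This only illustrates the standard ℓ1 case; the paper's contribution is the analogous closed form for ℓ1 − αℓ2.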

A weighted difference of anisotropic and isotropic total variation model for image processing

Y Lou, T Zeng, S Osher, J Xin - SIAM Journal on Imaging Sciences, 2015 - SIAM
We propose a weighted difference of anisotropic and isotropic total variation (TV) as a
regularization for image processing tasks, based on the well-known TV model and natural …
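The regularizer in this entry is the anisotropic TV (ℓ1 norm of the discrete gradient) minus a weighted isotropic TV (ℓ2,1 norm of the gradient). A minimal discrete sketch, assuming forward differences with cropped boundaries (the paper's exact discretization may differ):

```python
import numpy as np

def weighted_tv_difference(u, alpha=0.5):
    """Weighted anisotropic-minus-isotropic TV of a 2D image u:
    ||Du||_1 - alpha * ||Du||_{2,1}, with Du the forward-difference
    gradient, cropped so both components align."""
    dx = np.diff(u, axis=1)[:-1, :]  # horizontal differences
    dy = np.diff(u, axis=0)[:, :-1]  # vertical differences
    aniso = np.abs(dx).sum() + np.abs(dy).sum()  # anisotropic TV
    iso = np.sqrt(dx ** 2 + dy ** 2).sum()       # isotropic TV
    return aniso - alpha * iso

print(weighted_tv_difference(np.zeros((4, 4))))  # 0.0 on a constant image
```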

A survey for sparse regularization based compression methods

A Tang, P Quan, L Niu, Y Shi - Annals of Data Science, 2022 - Springer
In recent years, deep neural networks (DNNs) have attracted extensive attention due to their
excellent performance in many fields of vision and speech recognition. With the increasing …

Transformed ℓ1 regularization for learning sparse deep neural networks

R Ma, J Miao, L Niu, P Zhang - Neural Networks, 2019 - Elsevier
Abstract Deep Neural Networks (DNNs) have achieved extraordinary success in numerous
areas. However, DNNs often carry a large number of weight parameters, leading to the …

A scale-invariant approach for sparse signal recovery

Y Rahimi, C Wang, H Dong, Y Lou - SIAM Journal on Scientific Computing, 2019 - SIAM
In this paper, we study the ratio of the L_1 and L_2 norms, denoted as L_1/L_2, to promote
sparsity. Due to the nonconvexity and nonlinearity, there has been little attention to this scale …
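The scale invariance named in this entry's title is direct to verify: multiplying x by any nonzero constant leaves ‖x‖₁/‖x‖₂ unchanged, unlike ‖x‖₁ alone. A minimal sketch:

```python
import numpy as np

def l1_over_l2(x):
    """Scale-invariant sparsity measure ||x||_1 / ||x||_2 (x nonzero).
    Ranges from 1 (1-sparse) to sqrt(n) (flat vector)."""
    return np.sum(np.abs(x)) / np.linalg.norm(x)

x = np.array([1.0, 0.0, -2.0, 0.0])
print(l1_over_l2(x))         # 3 / sqrt(5)
print(l1_over_l2(10.0 * x))  # same value: the scaling cancels
```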

Minimization of transformed ℓ1 penalty: theory, difference of convex function algorithm, and robust application in compressed sensing

S Zhang, J Xin - Mathematical Programming, 2018 - Springer
We study the minimization problem of a non-convex sparsity promoting penalty function, the
transformed ℓ1 (TL1), and its application in compressed sensing (CS). The TL1 penalty …
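The TL1 penalty applies ρ_a(t) = (a+1)|t| / (a+|t|) to each component and sums; as a → 0⁺ it approaches the ℓ0 norm, and as a → ∞ it approaches the ℓ1 norm. A minimal sketch:

```python
import numpy as np

def tl1(x, a=1.0):
    """Transformed l1 penalty: sum over i of (a+1)|x_i| / (a + |x_i|).
    Interpolates between l0 (a -> 0+) and l1 (a -> infinity)."""
    ax = np.abs(x)
    return np.sum((a + 1.0) * ax / (a + ax))

x = np.array([0.0, 0.0, 5.0])
print(tl1(x, a=1e-8))  # close to ||x||_0 = 1
print(tl1(x, a=1e8))   # close to ||x||_1 = 5
```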

Hyperspectral Image Restoration via Global L1-2 Spatial–Spectral Total Variation Regularized Local Low-Rank Tensor Recovery

H Zeng, X Xie, H Cui, H Yin… - IEEE transactions on …, 2020 - ieeexplore.ieee.org
Hyperspectral images (HSIs) are usually corrupted by various noises, e.g., Gaussian noise,
impulse noise, stripes, dead lines, and many others. In this article, motivated by the good …