Global convergence of ADMM in nonconvex nonsmooth optimization
In this paper, we analyze the convergence of the alternating direction method of multipliers
(ADMM) for minimizing a nonconvex and possibly nonsmooth objective function, ϕ(x_0 …
Conic optimization via operator splitting and homogeneous self-dual embedding
We introduce a first-order method for solving very large convex cone programs. The method
uses an operator splitting method, the alternating direction method of multipliers, to solve …
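For context, a minimal usage sketch of how a cone program of this kind is typically handed to the SCS solver through CVXPY; the problem data, sizes, and bound constraint are illustrative choices, not from the paper.

```python
# Solve a small cone program (an LP in conic form) with the SCS first-order
# solver via CVXPY. Assumes `cvxpy` and `scs` are installed; data are made up.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
m, n = 20, 10
A = rng.standard_normal((m, n))
b = A @ rng.random(n)                     # keeps the problem feasible
c = rng.standard_normal(n)

x = cp.Variable(n)
prob = cp.Problem(cp.Minimize(c @ x), [A @ x <= b, x >= 0, x <= 1])
prob.solve(solver=cp.SCS)                 # operator-splitting solver described above

print(prob.status, round(prob.value, 4))
```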
On the global and linear convergence of the generalized alternating direction method of multipliers
The formulation min_{x,y} f(x) + g(y), subject to Ax + By = b, where f and g are extended-value convex functions, arises in many application …
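A minimal NumPy sketch of the scaled-dual ADMM iteration for one concrete instance of this template (A = I, B = -I, b = 0, f a least-squares term, g an ℓ1 term); the instance, penalty ρ, and iteration count are my choices for illustration.

```python
# Scaled-dual ADMM for min f(x) + g(y)  s.t.  x - y = 0,
# with f(x) = 0.5*||Px - q||^2 and g(y) = lam*||y||_1 (illustrative instance).
import numpy as np

rng = np.random.default_rng(1)
P = rng.standard_normal((50, 20))
q = rng.standard_normal(50)
lam, rho = 0.1, 1.0

x = y = u = np.zeros(20)
Pinv = np.linalg.inv(P.T @ P + rho * np.eye(20))   # small problem: direct inverse

for _ in range(200):
    x = Pinv @ (P.T @ q + rho * (y - u))           # x-update: ridge-type solve
    v = x + u
    y = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # y-update: soft-threshold
    u = u + x - y                                  # scaled dual-variable update

print(np.count_nonzero(np.round(y, 6)), "nonzeros in y")
```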
ARock: an algorithmic framework for asynchronous parallel coordinate updates
Finding a fixed point of a nonexpansive operator, i.e., x* = Tx*, abstracts many problems in
numerical linear algebra, optimization, and other areas of data science. To solve fixed-point …
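A sketch of the underlying fixed-point problem with a randomized coordinate-style update, in the spirit of such frameworks but not the ARock algorithm itself; the operator T (a gradient step on a convex quadratic) and all parameters are assumed for illustration.

```python
# Fixed-point iteration for x* = T(x*) with a nonexpansive T, updating one
# randomly chosen coordinate at a time (a serial stand-in for asynchronous
# coordinate updates; not ARock itself).
import numpy as np

rng = np.random.default_rng(2)
n = 30
M = rng.standard_normal((n, n))
Q = M.T @ M / n + np.eye(n)             # well-conditioned positive definite matrix
c = rng.standard_normal(n)
L = np.linalg.norm(Q, 2)                # Lipschitz constant of the gradient

def T(x):
    return x - (1.0 / L) * (Q @ x - c)  # nonexpansive gradient-step operator

x = np.zeros(n)
eta = 0.5                               # relaxation parameter
for _ in range(5000):
    i = rng.integers(n)                 # pick one coordinate at random
    x[i] += eta * (T(x)[i] - x[i])      # update only that coordinate
                                        # (full T(x) recomputed here for simplicity)

print(np.linalg.norm(Q @ x - c))        # residual of the fixed-point condition
```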
Convergence rate analysis of several splitting schemes
Operator-splitting schemes are iterative algorithms for solving many types of numerical
problems. A lot is known about these methods: they converge, and in many cases we know …
Proximal splitting algorithms for convex optimization: A tour of recent advances, with new twists
Convex nonsmooth optimization problems, whose solutions live in very high dimensional
spaces, have become ubiquitous. To solve them, the class of first-order algorithms known as …
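As a concrete member of this class, a sketch of forward-backward (proximal gradient) splitting on a made-up lasso instance; the data and step size are illustrative.

```python
# Forward-backward splitting for min 0.5*||Ax - b||^2 + lam*||x||_1.
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((40, 15))
b = rng.standard_normal(40)
lam = 0.2
step = 1.0 / np.linalg.norm(A, 2) ** 2            # 1/L with L = ||A||_2^2

x = np.zeros(15)
for _ in range(500):
    v = x - step * (A.T @ (A @ x - b))            # forward (gradient) step
    x = np.sign(v) * np.maximum(np.abs(v) - step * lam, 0.0)  # backward (prox) step

print(np.count_nonzero(x), "nonzero coefficients")
```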
Douglas–Rachford splitting and ADMM for nonconvex optimization: Tight convergence results
A Themelis, P Patrinos - SIAM Journal on Optimization, 2020 - SIAM
Although originally designed and analyzed for convex problems, the alternating direction
method of multipliers (ADMM) and its close relatives, Douglas–Rachford splitting (DRS) and …
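For reference, a sketch of the plain DRS iteration on a simple instance with two proximable terms; the instance here is convex for concreteness, whereas the paper also covers nonconvex terms, and the step and relaxation parameters are assumptions.

```python
# Douglas-Rachford splitting for min f(x) + g(x), illustrated with
# f(x) = 0.5*||x - a||^2 and g(x) = lam*||x||_1.
import numpy as np

rng = np.random.default_rng(4)
a = rng.standard_normal(10)
lam, gamma, relax = 0.3, 1.0, 1.0

def prox_f(s):                       # prox of gamma * 0.5*||x - a||^2
    return (s + gamma * a) / (1.0 + gamma)

def prox_g(s):                       # prox of gamma * lam*||x||_1 (soft-thresholding)
    return np.sign(s) * np.maximum(np.abs(s) - gamma * lam, 0.0)

s = np.zeros(10)
for _ in range(200):
    y = prox_f(s)
    z = prox_g(2 * y - s)
    s = s + relax * (z - y)          # relax = 1 gives classical DRS

print(np.round(prox_f(s), 3))        # the "shadow" point y = prox_f(s) solves the problem
```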
FedADMM: a federated primal-dual algorithm allowing partial participation
Federated learning is a framework for distributed optimization that places emphasis on
communication efficiency. In particular, it follows a client-server broadcast model and is …
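To illustrate only the client-server broadcast pattern with partial participation that the abstract refers to, a generic FedAvg-style round is sketched below; this is not the FedADMM update, and the clients, local loss, and hyperparameters are invented.

```python
# One-server, many-clients round with partial participation (FedAvg-style
# averaging shown only to illustrate the broadcast pattern; NOT FedADMM).
import numpy as np

rng = np.random.default_rng(5)
num_clients, dim = 20, 5
client_X = [rng.standard_normal((30, dim)) for _ in range(num_clients)]
client_y = [X @ np.ones(dim) + 0.1 * rng.standard_normal(30) for X in client_X]

def local_update(w, X, y, lr=0.01, steps=10):
    # a few local gradient steps on the client's least-squares loss
    for _ in range(steps):
        w = w - lr * X.T @ (X @ w - y) / len(y)
    return w

w_global = np.zeros(dim)
for _ in range(50):
    chosen = rng.choice(num_clients, size=5, replace=False)   # partial participation
    updates = [local_update(w_global.copy(), client_X[i], client_y[i]) for i in chosen]
    w_global = np.mean(updates, axis=0)    # server aggregates only the returned models

print(np.round(w_global, 2))               # approaches the all-ones generating vector
```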
Faster convergence rates of relaxed Peaceman-Rachford and ADMM under regularity assumptions
In this paper, we provide a comprehensive convergence rate analysis of the Douglas-
Rachford splitting (DRS), Peaceman-Rachford splitting (PRS), and alternating direction …
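A sketch of the relaxed Peaceman-Rachford iteration, built from the same reflected proximal maps as DRS; the strongly convex instance and the parameters γ and λ are illustrative choices.

```python
# Relaxed Peaceman-Rachford: s <- (1 - lam)*s + lam * R_g(R_f(s)), R = 2*prox - I,
# on the instance f(x) = 0.5*||x - a||^2, g(x) = 0.5*||x - c||^2.
import numpy as np

rng = np.random.default_rng(6)
a, c = rng.standard_normal(8), rng.standard_normal(8)
gamma, lam = 0.5, 0.9            # lam = 1 is PRS, lam = 0.5 recovers DRS

def prox_quad(s, center):        # prox of gamma * 0.5*||x - center||^2
    return (s + gamma * center) / (1.0 + gamma)

def reflect(s, center):          # reflected resolvent R = 2*prox - I
    return 2 * prox_quad(s, center) - s

s = np.zeros(8)
for _ in range(100):
    s = (1 - lam) * s + lam * reflect(reflect(s, a), c)

x = prox_quad(s, a)              # recover the primal point
print(np.allclose(x, (a + c) / 2, atol=1e-6))   # minimizer of the sum is the midpoint
```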
A new primal–dual algorithm for minimizing the sum of three functions with a linear operator
M Yan - Journal of Scientific Computing, 2018 - Springer
In this paper, we propose a new primal–dual algorithm for minimizing f(x) + g(x) + h(Ax), where f, g, and h are proper lower semi-continuous convex functions, f is …
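A sketch of a Condat-Vũ-style primal-dual iteration for the same three-function template min f(x) + g(x) + h(Ax) with f smooth; this is a closely related algorithm rather than necessarily the exact update proposed in the paper, and the instance (f quadratic, g and h weighted ℓ1 norms) plus the step sizes are assumptions.

```python
# Condat-Vu-style primal-dual iteration for min f(x) + g(x) + h(Ax), with
# f(x) = 0.5*||x - d||^2, g(x) = mu*||x||_1, h(z) = lam*||z||_1.
import numpy as np

rng = np.random.default_rng(7)
n, m = 12, 8
A = rng.standard_normal((m, n))
d = rng.standard_normal(n)
mu, lam = 0.1, 0.1

L = 1.0                                            # Lipschitz constant of grad f
sigma = 1.0                                        # dual step
tau = 0.9 / (L / 2 + sigma * np.linalg.norm(A, 2) ** 2)  # primal step: tau*(L/2 + sigma*||A||^2) < 1

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x, y = np.zeros(n), np.zeros(m)
for _ in range(1000):
    x_new = soft(x - tau * ((x - d) + A.T @ y), tau * mu)    # prox of tau*g after a gradient step
    y = np.clip(y + sigma * A @ (2 * x_new - x), -lam, lam)  # prox of sigma*h*: project onto ||.||_inf <= lam
    x = x_new

print(np.round(x, 3))
```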