OSQP: An operator splitting solver for quadratic programs
We present a general-purpose solver for convex quadratic programs based on the
alternating direction method of multipliers, employing a novel operator splitting technique …
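As a point of reference, a minimal usage sketch of the OSQP Python interface on a tiny placeholder QP; the problem data below are arbitrary and the settings are left at their defaults:

    import numpy as np
    import scipy.sparse as sparse
    import osqp

    # Placeholder QP:  minimize 0.5 x'Px + q'x   subject to   l <= Ax <= u
    P = sparse.csc_matrix([[4.0, 1.0], [1.0, 2.0]])
    q = np.array([1.0, 1.0])
    A = sparse.csc_matrix([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
    l = np.array([1.0, 0.0, 0.0])
    u = np.array([1.0, 0.7, 0.7])

    solver = osqp.OSQP()
    solver.setup(P, q, A, l, u, verbose=False)
    res = solver.solve()
    print(res.x, res.info.status)   # primal solution and solver status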
Infeasibility detection in the alternating direction method of multipliers for convex optimization
The alternating direction method of multipliers is a powerful operator splitting technique for
solving structured optimization problems. For convex optimization problems, it is well known …
End-to-end learning to warm-start for real-time quadratic optimization
First-order methods are widely used to solve convex quadratic programs (QPs) in real-time
applications because of their low per-iteration cost. However, they can suffer from slow …
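A sketch of the mechanism being learned: a predicted primal-dual pair is passed to the solver as a warm start before solving. The zero prediction below merely stands in for the paper's learned mapping; the QP data are placeholders:

    import numpy as np
    import scipy.sparse as sparse
    import osqp

    P = sparse.csc_matrix([[4.0, 1.0], [1.0, 2.0]]); q = np.array([1.0, 1.0])
    A = sparse.csc_matrix([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
    l = np.array([1.0, 0.0, 0.0]); u = np.array([1.0, 0.7, 0.7])

    solver = osqp.OSQP()
    solver.setup(P, q, A, l, u, verbose=False)

    # Placeholder prediction: in the paper this primal-dual guess comes from a model
    # trained end to end; here it is simply zeros for illustration.
    x_pred, y_pred = np.zeros(2), np.zeros(3)
    solver.warm_start(x=x_pred, y=y_pred)
    res = solver.solve()   # a good prediction typically reduces the iteration count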
Time-varying convex optimization via time-varying averaged operators
A Simonetto - arXiv preprint arXiv:1704.07338, 2017 - arxiv.org
Devising efficient algorithms that track the optimizers of continuously varying convex
optimization problems is key in many applications. A possible strategy is to sample the time …
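A minimal sketch of the sampling strategy mentioned in the abstract: sample the time-varying problem at instants t_k and apply a few averaged-operator (here plain gradient) steps per sample to track the drifting optimizer. The objective, drift, and step size below are illustrative assumptions, not the paper's:

    import numpy as np

    # Track the minimizer of f_t(x) = 0.5 * ||x - c(t)||^2, whose optimizer c(t) drifts in time.
    c = lambda t: np.array([np.cos(t), np.sin(t)])   # assumed drifting optimizer
    gamma, inner_iters = 0.5, 3                      # step size, iterations per sample
    x = np.zeros(2)
    for k in range(50):
        t_k = 0.1 * k                                # sampling instant
        for _ in range(inner_iters):
            x = x - gamma * (x - c(t_k))             # averaged-operator (gradient) step on f_{t_k}
    print(np.linalg.norm(x - c(t_k)))                # tracking error at the last sample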
Scaled relative graphs: Nonexpansive operators via 2D Euclidean geometry
Many iterative methods in applied mathematics can be thought of as fixed-point iterations,
and such algorithms are usually analyzed analytically, with inequalities. In this paper, we …
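For reference, the inequality-style definitions that the scaled-relative-graph viewpoint recasts geometrically (standard notation, not taken from the paper):

    \|Tx - Ty\| \le \|x - y\| \quad \text{(nonexpansiveness of } T\text{)}, \qquad
    x^{k+1} = T x^{k} \quad \text{(fixed-point iteration)}.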
Restart FISTA with global linear convergence
The Fast Iterative Shrinkage-Thresholding Algorithm (FISTA) is a popular fast gradient
method (FGM) in the field of large-scale convex optimization problems. However, it can …
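To make the idea concrete, a sketch of FISTA on a smooth convex quadratic with a simple function-value restart heuristic. This illustrates the generic restart mechanism only; the paper's restart scheme and its global-linear-convergence guarantee differ in the details. All problem data are made up:

    import numpy as np

    Q = np.array([[3.0, 0.5], [0.5, 1.0]]); b = np.array([1.0, -2.0])
    f = lambda x: 0.5 * x @ Q @ x - b @ x
    grad = lambda x: Q @ x - b
    L = np.linalg.eigvalsh(Q).max()          # Lipschitz constant of the gradient

    x = y = np.zeros(2); t = 1.0
    for _ in range(200):
        x_new = y - grad(y) / L              # gradient step at the extrapolated point
        if f(x_new) > f(x):                  # restart: objective increased
            y, t = x, 1.0                    # drop the momentum and restart from x
            x_new = y - grad(y) / L
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    print(x, np.linalg.solve(Q, b))          # iterate vs. exact minimizer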
Tight coefficients of averaged operators via scaled relative graph
Many iterative methods in optimization are fixed-point iterations with averaged operators. As
such methods converge at an O(1/k) rate with the constant determined by the averagedness …
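As background, for a θ-averaged operator the fixed-point (Krasnosel'skii–Mann) iteration satisfies the standard O(1/k) bound on the fixed-point residual, with the constant growing with the averagedness coefficient; the notation below is generic, not the paper's:

    T = (1-\theta)\,\mathrm{Id} + \theta N, \quad N \text{ nonexpansive},\ \theta \in (0,1), \qquad x^{k+1} = T x^{k},

    \|x^{k} - T x^{k}\|^{2} \;\le\; \frac{\theta}{(1-\theta)(k+1)}\,\|x^{0} - x^{\star}\|^{2}, \qquad x^{\star} \in \operatorname{Fix} T.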
Gradient based restart FISTA
Fast gradient methods (FGM) are very popular in the field of large-scale convex optimization
problems. Recently, it has been shown that restart strategies can guarantee global linear …
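For orientation, one widely used gradient-based restart test from the adaptive-restart literature, stated for a proximal-gradient/FISTA step; the paper's exact criterion and its guarantees may differ:

    \text{restart when}\quad \langle\, y^{k-1} - x^{k},\; x^{k} - x^{k-1} \,\rangle > 0,

where x^{k} is the proximal-gradient point computed at the extrapolated point y^{k-1}, so y^{k-1} - x^{k} plays the role of a (generalized) gradient direction and the test fires when the momentum direction opposes it.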
A first-order numerical algorithm without matrix operations
This paper offers a matrix-free first-order numerical method to solve large-scale conic
optimization problems. Solving systems of linear equations poses the most computationally …
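A generic illustration of the matrix-free idea (not the paper's algorithm): a linear system is solved using only matrix-vector products, via a SciPy LinearOperator and conjugate gradient, so the matrix M is never formed or factorized. The data are random placeholders:

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, cg

    rng = np.random.default_rng(0)
    A = rng.standard_normal((300, 100))          # stands in for problem data
    b = rng.standard_normal(100)

    matvec = lambda x: x + A.T @ (A @ x)         # action of M = I + A^T A, never built explicitly
    M = LinearOperator((100, 100), matvec=matvec)

    x, info = cg(M, b)                           # conjugate gradient: matvecs only, no factorization
    print(info, np.linalg.norm(matvec(x) - b))   # info == 0 means converged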
General optimal step-size and initializations for admm: A proximal operator view
Y Ran - arXiv preprint arXiv:2309.10124, 2023 - arxiv.org
In this work, we solve a 48-year open problem by presenting the first general optimal
step-size choice for the Alternating Direction Method of Multipliers (ADMM). For a convex problem …
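For context, the standard scaled-form ADMM iteration for minimizing f(x) + g(z) subject to Ax + Bz = c, whose penalty/step-size parameter ρ is the quantity being chosen; the notation is generic textbook usage, not taken from the paper:

    \begin{aligned}
    x^{k+1} &= \operatorname*{argmin}_{x}\; f(x) + \tfrac{\rho}{2}\,\|Ax + Bz^{k} - c + u^{k}\|_2^2,\\
    z^{k+1} &= \operatorname*{argmin}_{z}\; g(z) + \tfrac{\rho}{2}\,\|Ax^{k+1} + Bz - c + u^{k}\|_2^2,\\
    u^{k+1} &= u^{k} + Ax^{k+1} + Bz^{k+1} - c.
    \end{aligned}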