A forward-backward splitting method for monotone inclusions without cocoercivity
Y Malitsky, MK Tam - SIAM Journal on Optimization, 2020 - SIAM
In this work, we propose a simple modification of the forward-backward splitting method for
finding a zero in the sum of two monotone operators. Our method converges under the same …
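The entry above concerns the forward-backward splitting scheme for zeros of a sum of two monotone operators. As a point of reference, here is a minimal sketch of the *classic* forward-backward iteration (ISTA for an ℓ1-regularized least-squares problem), where the forward step is a gradient of the smooth term and the backward step is the prox of the nonsmooth term; the cited paper's contribution is a modification that removes the cocoercivity assumption on the forward operator, which this baseline still requires.

```python
import numpy as np

def soft_threshold(v, t):
    # Prox of t * ||.||_1 (the "backward" resolvent step).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, steps=500):
    # Classic forward-backward splitting for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz const of the gradient
    for _ in range(steps):
        grad = A.T @ (A @ x - b)                          # forward (explicit) step
        x = soft_threshold(x - step * grad, step * lam)   # backward (prox) step
    return x

# Small synthetic instance: recover a sparse vector from noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = forward_backward(A, b, lam=0.1)
```

The problem instance, step size, and iteration count are illustrative choices, not taken from the paper.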
Accelerated Algorithms for Smooth Convex-Concave Minimax Problems with O(1/k^2) Rate on Squared Gradient Norm
In this work, we study the computational complexity of reducing the squared gradient
magnitude for smooth minimax optimization problems. First, we present algorithms with …
Convergence of sequences: A survey
B Franci, S Grammatico - Annual Reviews in Control, 2022 - Elsevier
Convergent sequences of real numbers play a fundamental role in many different problems
in system theory, e.g., in Lyapunov stability analysis, as well as in optimization theory and …
Operator splitting performance estimation: Tight contraction factors and optimal parameter selection
We propose a methodology for studying the performance of common splitting methods
through semidefinite programming. We prove tightness of the methodology and demonstrate …
Solving nonconvex-nonconcave min-max problems exhibiting weak minty solutions
A Böhm - arXiv preprint arXiv:2201.12247, 2022 - arxiv.org
We investigate a structured class of nonconvex-nonconcave min-max problems exhibiting
so-called weak Minty solutions, a notion which was only recently introduced, but is …
Fast Optimistic Gradient Descent Ascent (OGDA) method in continuous and discrete time
In the framework of real Hilbert spaces, we study continuous in time dynamics as well as
numerical algorithms for the problem of approaching the set of zeros of a single-valued …
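For context on the entry above, here is a minimal sketch of the basic (non-accelerated) optimistic gradient descent ascent step on a bilinear toy saddle problem min_x max_y x·y, whose saddle point is the origin. The operator F(z) = (y, -x) is monotone but not cocoercive, and plain gradient descent-ascent with a constant step diverges on it, while the optimistic correction term converges; the cited paper studies fast variants of this scheme, not reproduced here. The step size and iteration count are illustrative assumptions.

```python
import numpy as np

def F(z):
    # Saddle operator of f(x, y) = x * y: F(z) = (df/dx, -df/dy) = (y, -x).
    x, y = z
    return np.array([y, -x])

eta = 0.3                      # step size (must be small enough relative to Lip(F) = 1)
z_prev = np.array([1.0, 1.0])  # starting point
z = z_prev.copy()
for _ in range(200):
    # OGDA update: gradient step plus an "optimistic" correction using the past operator value.
    z_next = z - 2 * eta * F(z) + eta * F(z_prev)
    z_prev, z = z, z_next
```

After 200 iterations the iterate is driven close to the saddle point at the origin.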
Beyond the golden ratio for variational inequality algorithms
We improve the understanding of the golden ratio algorithm, which solves monotone
variational inequalities (VI) and convex-concave min-max problems via the distinctive …
Newton-like inertial dynamics and proximal algorithms governed by maximally monotone operators
The introduction of the Hessian damping in the continuous version of Nesterov's accelerated
gradient method provides, by temporal discretization, fast proximal gradient algorithms …
Two Steps at a Time – Taking GAN Training in Stride with Tseng's Method
Motivated by the training of generative adversarial networks (GANs), we study methods for
solving minimax problems with additional nonsmooth regularizers. We do so by employing …
Accelerated minimax algorithms flock together
Several new accelerated methods in minimax optimization and fixed-point iterations have
recently been discovered, and, interestingly, they rely on a mechanism distinct from …