A forward-backward splitting method for monotone inclusions without cocoercivity

Y Malitsky, MK Tam - SIAM Journal on Optimization, 2020 - SIAM
In this work, we propose a simple modification of the forward-backward splitting method for
finding a zero in the sum of two monotone operators. Our method converges under the same …
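The classical forward-backward iteration that this paper modifies alternates a forward (explicit) step on one operator with a backward (resolvent) step on the other. A minimal sketch on an illustrative ℓ1-regularized least-squares instance, where the resolvent is soft-thresholding; the problem data, names, and step size here are assumptions for illustration, not from the paper:

```python
import numpy as np

# Classical forward-backward splitting for 0 in A(x) + B(x), with
# A = gradient of 0.5 * ||M x - b||^2 and B = subdifferential of lam * ||x||_1,
# whose resolvent (prox) is soft-thresholding.  Illustrative instance only.
def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(M, b, lam, step, iters=500):
    x = np.zeros(M.shape[1])
    for _ in range(iters):
        grad = M.T @ (M @ x - b)                          # forward step on A
        x = soft_threshold(x - step * grad, step * lam)   # backward step on B
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
L = np.linalg.norm(M.T @ M, 2)        # Lipschitz constant of the gradient
x = forward_backward(M, b, lam=0.1, step=1.0 / L)
```

Here A is a gradient of a smooth convex function and hence cocoercive; the point of the paper is a modification that converges for general monotone Lipschitz A, where this classical scheme can fail.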

Accelerated Algorithms for Smooth Convex-Concave Minimax Problems with O(1/k^2) Rate on Squared Gradient Norm

TH Yoon, EK Ryu - International Conference on Machine …, 2021 - proceedings.mlr.press
In this work, we study the computational complexity of reducing the squared gradient
magnitude for smooth minimax optimization problems. First, we present algorithms with …

Convergence of sequences: A survey

B Franci, S Grammatico - Annual Reviews in Control, 2022 - Elsevier
Convergent sequences of real numbers play a fundamental role in many different problems
in system theory, e.g., in Lyapunov stability analysis, as well as in optimization theory and …

Operator splitting performance estimation: Tight contraction factors and optimal parameter selection

EK Ryu, AB Taylor, C Bergeling, P Giselsson - SIAM Journal on Optimization, 2020 - SIAM
We propose a methodology for studying the performance of common splitting methods
through semidefinite programming. We prove tightness of the methodology and demonstrate …

Solving nonconvex-nonconcave min-max problems exhibiting weak minty solutions

A Böhm - arXiv preprint arXiv:2201.12247, 2022 - arxiv.org
We investigate a structured class of nonconvex-nonconcave min-max problems exhibiting
so-called "weak Minty" solutions, a notion which was only recently introduced, but is …

Fast Optimistic Gradient Descent Ascent (OGDA) method in continuous and discrete time

RI Boţ, ER Csetnek, DK Nguyen - Foundations of Computational …, 2023 - Springer
In the framework of real Hilbert spaces, we study continuous in time dynamics as well as
numerical algorithms for the problem of approaching the set of zeros of a single-valued …
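The basic (non-accelerated) OGDA update underlying this line of work can be sketched on the standard bilinear toy problem min_x max_y xy, where plain gradient descent ascent spirals outward; the instance, step size, and iteration count are illustrative assumptions, not from the paper:

```python
import numpy as np

# Optimistic gradient descent ascent (OGDA) on min_x max_y x * y, i.e. the
# monotone operator F(x, y) = (y, -x).  Plain simultaneous gradient descent
# ascent diverges on this instance; the optimistic correction
# -2*eta*F(z_k) + eta*F(z_{k-1}) restores convergence.  eta = 0.1 is an
# illustrative choice (small enough relative to the Lipschitz constant L = 1).
def F(z):
    x, y = z
    return np.array([y, -x])

eta = 0.1
z_prev = np.array([1.0, 1.0])
z = z_prev - eta * F(z_prev)          # one plain gradient step to initialize
for _ in range(5000):
    z, z_prev = z - 2 * eta * F(z) + eta * F(z_prev), z
```

The correction term reuses the previous operator evaluation, so OGDA costs one evaluation of F per iteration, unlike the extragradient method's two.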

Beyond the golden ratio for variational inequality algorithms

A Alacaoglu, A Böhm, Y Malitsky - Journal of Machine Learning Research, 2023 - jmlr.org
We improve the understanding of the golden ratio algorithm, which solves monotone
variational inequalities (VI) and convex-concave min-max problems via the distinctive …
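The golden ratio algorithm averages the current iterate with a running anchor before taking a forward step. A minimal unconstrained sketch on the rotation operator F(x, y) = (y, -x), a monotone Lipschitz operator with L = 1; the instance and step size are illustrative assumptions, not from the paper:

```python
import numpy as np

# Golden ratio algorithm (GRAAL) sketch for the unconstrained monotone VI
# F(z) = 0 with F(x, y) = (y, -x).  With no constraint the projection/prox
# is the identity, so only the averaging and forward steps remain.
phi = (1 + np.sqrt(5)) / 2            # golden ratio
step = phi / 2                        # admissible: step <= phi / (2 * L), L = 1

def F(z):
    return np.array([z[1], -z[0]])

z = np.array([1.0, 1.0])
z_bar = z.copy()
for _ in range(300):
    z_bar = ((phi - 1) * z + z_bar) / phi   # averaging (anchor) step
    z = z_bar - step * F(z)                 # forward step (prox = identity)
```

Like OGDA, this uses a single evaluation of F per iteration; the golden ratio φ is what makes the fixed averaging weights admissible.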

Newton-like inertial dynamics and proximal algorithms governed by maximally monotone operators

H Attouch, SC László - SIAM Journal on Optimization, 2020 - SIAM
The introduction of the Hessian damping in the continuous version of Nesterov's accelerated
gradient method provides, by temporal discretization, fast proximal gradient algorithms …

Two Steps at a Time---Taking GAN Training in Stride with Tseng's Method

A Böhm, M Sedlmayer, ER Csetnek, RI Boţ - SIAM Journal on Mathematics of …, 2022 - SIAM
Motivated by the training of generative adversarial networks (GANs), we study methods for
solving minimax problems with additional nonsmooth regularizers. We do so by employing …
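Tseng's forward-backward-forward method adds a second forward correction step after the forward-backward step, which is what removes the cocoercivity requirement. A minimal sketch on the same rotation operator F(x, y) = (y, -x), with no regularizer so the backward (prox) step is the identity; the instance and step size are illustrative assumptions, not from the paper:

```python
import numpy as np

# Tseng's forward-backward-forward (FBF) method on the monotone,
# non-cocoercive operator F(x, y) = (y, -x), Lipschitz constant L = 1.
# Illustrative unconstrained instance: the resolvent is the identity.
def F(z):
    return np.array([z[1], -z[0]])

step = 0.5                            # requires step < 1 / L
z = np.array([1.0, 1.0])
for _ in range(300):
    y = z - step * F(z)               # forward-backward step (prox = identity)
    z = y + step * (F(z) - F(y))      # forward correction step
```

The price of dropping cocoercivity is two evaluations of F per iteration, in contrast to the single evaluation of plain forward-backward splitting.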

Accelerated minimax algorithms flock together

TH Yoon, EK Ryu - arXiv preprint arXiv:2205.11093, 2022 - arxiv.org
Several new accelerated methods in minimax optimization and fixed-point iterations have
recently been discovered, and, interestingly, they rely on a mechanism distinct from …