Federated minimax optimization: Improved convergence analyses and algorithms
In this paper, we consider nonconvex minimax optimization, which is gaining prominence in
many modern machine learning applications, such as GANs. Large-scale edge-based …
Stochastic gradient descent-ascent: Unified theory and new efficient methods
A Beznosikov, E Gorbunov… - International …, 2023 - proceedings.mlr.press
Abstract Stochastic Gradient Descent-Ascent (SGDA) is one of the most prominent
algorithms for solving min-max optimization and variational inequality problems (VIPs) …
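As an illustration of the method named in this entry, here is a minimal sketch of stochastic gradient descent-ascent on a toy strongly-convex-strongly-concave saddle-point problem; the objective, step size, and noise level are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, eta, T, sigma = 1.0, 0.05, 2000, 0.1

# Toy objective: f(x, y) = (mu/2) x^2 + x y - (mu/2) y^2, saddle point at (0, 0).
x, y = 3.0, -2.0
for _ in range(T):
    gx = mu * x + y + sigma * rng.normal()   # noisy gradient w.r.t. x
    gy = x - mu * y + sigma * rng.normal()   # noisy gradient w.r.t. y
    x, y = x - eta * gx, y + eta * gy        # simultaneous descent on x, ascent on y

print(abs(x), abs(y))  # both near 0 (the saddle point), up to stochastic noise
```

SGDA converges linearly to a noise-dominated neighborhood of the saddle point in this strongly-monotone setting; on a purely bilinear objective the same simultaneous update would spiral outward.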
Variance reduction is an antidote to byzantines: Better rates, weaker assumptions and communication compression as a cherry on the top
Byzantine-robustness has been gaining attention due to the growing interest in
collaborative and federated learning. However, many fruitful directions, such as the usage of …
Communication compression for byzantine robust learning: New efficient algorithms and improved rates
A Rammal, K Gruntkowska, N Fedin… - International …, 2024 - proceedings.mlr.press
Byzantine robustness is an essential feature of algorithms for certain distributed optimization
problems, typically encountered in collaborative/federated learning. These problems are …
Federated minimax optimization with client heterogeneity
Minimax optimization has seen a surge in interest with the advent of modern applications
such as GANs, and it is inherently more challenging than simple minimization. The difficulty …
Similarity, compression and local steps: three pillars of efficient communications for distributed variational inequalities
A Beznosikov, M Takác… - Advances in Neural …, 2024 - proceedings.neurips.cc
Variational inequalities are a broad and flexible class of problems that includes
minimization, saddle point, and fixed point problems as special cases. Therefore, variational …
Compression and data similarity: Combination of two techniques for communication-efficient solving of distributed variational inequalities
A Beznosikov, A Gasnikov - International Conference on Optimization and …, 2022 - Springer
Variational inequalities are an important tool that includes minimization, saddle point, game, and
fixed-point problems as special cases. Modern large-scale and computationally expensive practical …
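To make the compression ingredient of this line of work concrete, here is a minimal sketch of an unbiased Rand-k sparsification operator, a standard compressor in communication-efficient distributed optimization; the dimension and sample counts below are illustrative:

```python
import numpy as np

def rand_k(v, k, rng):
    """Unbiased Rand-k sparsification: keep k random coordinates, rescale by d/k."""
    d = v.size
    idx = rng.choice(d, size=k, replace=False)
    out = np.zeros_like(v)
    out[idx] = v[idx] * (d / k)   # rescaling makes the compressor unbiased: E[out] = v
    return out

rng = np.random.default_rng(0)
v = rng.normal(size=1000)

# Averaging many independent compressions recovers v, confirming unbiasedness.
est = np.mean([rand_k(v, 100, rng) for _ in range(5000)], axis=0)
print(np.mean(np.abs(est - v)))   # small average error
```

Only k of d coordinates (plus indices) are transmitted per round, trading a factor d/k in variance for a factor k/d in communication.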
Near-Optimal Distributed Minimax Optimization under the Second-Order Similarity
This paper considers distributed convex-concave minimax optimization under
second-order similarity. We propose stochastic variance-reduced optimistic gradient sliding …
Distributed algorithm for solving variational inequalities over time-varying unbalanced digraphs
In this paper, we study a distributed model to cooperatively compute variational inequalities
over time-varying directed graphs. Here, each agent has access to a part of the full mapping …
Extragradient Sliding for Composite Non-Monotone Variational Inequalities
R Emelyanov, A Tikhomirov, A Beznosikov… - arXiv preprint arXiv …, 2024 - arxiv.org
Variational inequalities offer a versatile and straightforward approach to analyzing a broad
range of equilibrium problems in both theoretical and practical fields. In this paper, we …
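For context, the classical extragradient step that sliding methods build on can be sketched on a toy bilinear problem, where plain gradient descent-ascent diverges; the operator and step size here are illustrative, not the paper's composite non-monotone setting:

```python
import numpy as np

eta, T = 0.1, 1000
z = np.array([3.0, -2.0])          # z = (x, y); the saddle of f(x, y) = x * y is (0, 0)

def F(z):
    x, y = z
    return np.array([y, -x])       # VI operator (grad_x f, -grad_y f)

for _ in range(T):
    z_half = z - eta * F(z)        # extrapolation (look-ahead) step
    z = z - eta * F(z_half)        # update using the operator at the extrapolated point

print(np.linalg.norm(z))           # close to 0; plain GDA would spiral outward here
```

The extra operator evaluation per iteration is what buys convergence on bilinear and, more generally, monotone problems.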