Acceleration methods

A d'Aspremont, D Scieur, A Taylor - Foundations and Trends® …, 2021 - nowpublishers.com
This monograph covers some recent advances in a range of acceleration techniques
frequently used in convex optimization. We first use quadratic optimization problems to …
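
For context (a textbook starting point for such quadratic analyses, not a formula quoted from the monograph), the classical heavy-ball recursion for a \(\mu\)-strongly convex, \(L\)-smooth quadratic \(f\) reads
\[
x_{k+1} = x_k - \alpha\,\nabla f(x_k) + \beta\,(x_k - x_{k-1}),
\qquad
\alpha = \frac{4}{(\sqrt{\mu}+\sqrt{L})^2},
\qquad
\beta = \left(\frac{\sqrt{L}-\sqrt{\mu}}{\sqrt{L}+\sqrt{\mu}}\right)^{2},
\]
whose spectral analysis on quadratics yields the accelerated \(1-\Theta(\sqrt{\mu/L})\) contraction rate, compared with gradient descent's \(1-\Theta(\mu/L)\).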

Provably faster gradient descent via long steps

B Grimmer - SIAM Journal on Optimization, 2024 - SIAM
This work establishes new convergence guarantees for gradient descent in smooth convex
optimization via a computer-assisted analysis technique. Our theory allows nonconstant …
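
As a hedged illustration of gradient descent with a nonconstant stepsize schedule (the cyclic pattern below is made up to convey the long-step idea; it is not the certified schedule analyzed in the paper):

    import numpy as np

    def gd_with_stepsize_pattern(grad, x0, L, pattern, n_cycles=10):
        # Gradient descent whose stepsizes cycle through a fixed pattern of
        # multiples of 1/L. Purely illustrative of interleaving long and short
        # steps; NOT the certified schedule from the paper.
        x = np.asarray(x0, dtype=float)
        for _ in range(n_cycles):
            for h in pattern:                  # h > 2 would be a "long" step
                x = x - (h / L) * grad(x)
        return x

    # Toy smooth convex example: f(x) = 0.5 * ||A x - b||^2
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    grad = lambda x: A.T @ (A @ x - b)
    L = np.linalg.norm(A.T @ A, 2)             # smoothness constant (top eigenvalue)
    x_final = gd_with_stepsize_pattern(grad, [0.0, 0.0], L, pattern=[1.0, 3.0, 1.0, 0.5])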

PEPit: computer-assisted worst-case analyses of first-order optimization methods in Python

B Goujaud, C Moucer, F Glineur, JM Hendrickx… - Mathematical …, 2024 - Springer
PEPit is a Python package aiming at simplifying access to worst-case analyses of a large
family of first-order optimization methods possibly involving gradient, projection, proximal, or …
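
A minimal usage sketch in the spirit of PEPit's documented gradient-descent examples; the class and method names below (PEP, SmoothStronglyConvexFunction, declare_function, set_initial_point, set_initial_condition, set_performance_metric, solve) are recalled from the package's documentation and may differ slightly between versions:

    from PEPit import PEP
    from PEPit.functions import SmoothStronglyConvexFunction

    L, mu, gamma, n = 1.0, 0.1, 1.0, 5           # problem class and method parameters

    problem = PEP()
    f = problem.declare_function(SmoothStronglyConvexFunction, mu=mu, L=L)
    xs = f.stationary_point()                     # the (unknown) minimizer x*
    x0 = problem.set_initial_point()
    problem.set_initial_condition((x0 - xs) ** 2 <= 1)

    x = x0
    for _ in range(n):                            # n steps of fixed-step gradient descent
        x = x - gamma * f.gradient(x)

    problem.set_performance_metric((x - xs) ** 2)
    worst_case = problem.solve()                  # worst-case bound computed via SDP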

A unified discretization framework for differential equation approach with Lyapunov arguments for convex optimization

K Ushiyama, S Sato, T Matsuo - Advances in Neural …, 2023 - proceedings.neurips.cc
The differential equation (DE) approach for convex optimization, which relates optimization
methods to specific continuous DEs with rate-revealing Lyapunov functionals, has gained …
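
A standard instance of this approach (a textbook example rather than a result specific to this paper) is gradient flow with a rate-revealing Lyapunov functional:
\[
\dot X(t) = -\nabla f(X(t)),
\qquad
\mathcal{E}(t) = t\,\bigl(f(X(t)) - f_\star\bigr) + \tfrac{1}{2}\,\|X(t) - x_\star\|^2 ,
\]
for which convexity of \(f\) gives \(\dot{\mathcal{E}}(t) \le 0\), hence \(f(X(t)) - f_\star \le \|X(0) - x_\star\|^2/(2t)\).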

Operator splitting performance estimation: Tight contraction factors and optimal parameter selection

EK Ryu, AB Taylor, C Bergeling, P Giselsson - SIAM Journal on Optimization, 2020 - SIAM
We propose a methodology for studying the performance of common splitting methods
through semidefinite programming. We prove tightness of the methodology and demonstrate …
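
Schematically (in illustrative notation, not the paper's), the quantity being estimated is the tight contraction factor of a splitting operator \(T_{A,B}\) over a class of operators:
\[
\rho \;=\; \sup_{A\in\mathcal{A},\,B\in\mathcal{B}}\;\sup_{z\ne z'}\;
\frac{\|T_{A,B}(z)-T_{A,B}(z')\|}{\|z-z'\|},
\]
with, for example, the Douglas–Rachford operator \(T_{A,B}=\tfrac12(\mathrm{Id}+R_{\gamma A}R_{\gamma B})\), where \(R_{\gamma A}=2J_{\gamma A}-\mathrm{Id}\) and \(J_{\gamma A}=(\mathrm{Id}+\gamma A)^{-1}\); the methodology evaluates such worst cases through semidefinite programming.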

From Nesterov's estimate sequence to Riemannian acceleration

K Ahn, S Sra - Conference on Learning Theory, 2020 - proceedings.mlr.press
We propose the first global accelerated gradient method for Riemannian manifolds. Toward
establishing our results, we revisit Nesterov's estimate sequence technique and develop a …
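
For reference, the Euclidean baseline around which estimate sequences are built (not the Riemannian scheme proposed in the paper) is Nesterov's accelerated gradient method for an \(L\)-smooth convex \(f\):
\[
y_k = x_k + \frac{k-1}{k+2}\,(x_k - x_{k-1}),
\qquad
x_{k+1} = y_k - \frac{1}{L}\,\nabla f(y_k),
\qquad (x_{-1} = x_0),
\]
which attains \(f(x_k) - f_\star = O(1/k^2)\).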

Branch-and-bound performance estimation programming: A unified methodology for constructing optimal optimization methods

S Das Gupta, BPG Van Parys, EK Ryu - Mathematical Programming, 2024 - Springer
We present the Branch-and-Bound Performance Estimation Programming (BnB-PEP), a
unified methodology for constructing optimal first-order methods for convex and nonconvex …
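
Schematically (our notation, not the paper's), the design problem is a min-max over method coefficients in which the inner maximization is a performance estimation problem:
\[
\min_{\{h_{k,i}\}}\;\;
\max_{f\in\mathcal{F},\;\|x_0-x_\star\|\le R}\;
f(x_N)-f_\star
\quad\text{s.t.}\quad
x_{k+1}=x_k-\sum_{i=0}^{k}h_{k,i}\,\nabla f(x_i);
\]
for fixed stepsizes \(h_{k,i}\) the inner problem is a semidefinite program, while optimizing over the stepsizes is nonconvex, which is where the branch-and-bound layer enters.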

A systematic approach to Lyapunov analyses of continuous-time models in convex optimization

C Moucer, A Taylor, F Bach - SIAM Journal on Optimization, 2023 - SIAM
First-order methods are often analyzed via their continuous-time models, where their worst-case
convergence properties are usually approached via Lyapunov functions. In this work …
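
A canonical example of such a continuous-time model and its Lyapunov certificate (the accelerated-gradient ODE of Su, Boyd, and Candès, included here for context rather than taken from this paper):
\[
\ddot X(t) + \frac{3}{t}\,\dot X(t) + \nabla f(X(t)) = 0,
\qquad
\mathcal{E}(t) = t^2\bigl(f(X(t)) - f_\star\bigr) + 2\,\Bigl\|X(t) + \tfrac{t}{2}\dot X(t) - x_\star\Bigr\|^2,
\]
with \(\dot{\mathcal{E}}(t)\le 0\) for convex \(f\), hence \(f(X(t)) - f_\star \le 2\|X(0) - x_\star\|^2/t^2\).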

Convergence analysis of ODE models for accelerated first-order methods via positive semidefinite kernels

J Kim, I Yang - Advances in Neural Information Processing …, 2024 - proceedings.neurips.cc
We propose a novel methodology that systematically analyzes ordinary differential equation
(ODE) models for first-order optimization methods by converting the task of proving …

Stochastic first-order methods: non-asymptotic and computer-aided analyses via potential functions

A Taylor, F Bach - Conference on Learning Theory, 2019 - proceedings.mlr.press
We provide a novel computer-assisted technique for systematically analyzing first-order
methods for optimization. In contrast with previous works, the approach is particularly suited …
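
Schematically (illustrative notation, not the paper's), the potential-function arguments being verified have the form
\[
\phi_k = a_k\,\mathbb{E}\bigl[f(x_k)-f_\star\bigr] + b_k\,\mathbb{E}\|x_k-x_\star\|^2,
\qquad
\phi_{k+1} \le \phi_k + c_k\,\sigma^2,
\]
where \(\sigma^2\) bounds the variance of the stochastic gradients; once the one-step inequality is certified (in the spirit of the paper, via computer-aided verification), summing over \(k\) yields a non-asymptotic rate.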