Acceleration methods
This monograph covers some recent advances in a range of acceleration techniques
frequently used in convex optimization. We first use quadratic optimization problems to …
Provably faster gradient descent via long steps
B. Grimmer, SIAM Journal on Optimization, 2024.
This work establishes new convergence guarantees for gradient descent in smooth convex
optimization via a computer-assisted analysis technique. Our theory allows nonconstant …
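The long-step idea can be illustrated loosely on a quadratic. The cyclic pattern below is a hypothetical stand-in, not one of the paper's certified schedules; it only shows that stepsizes averaging above 1/L can outperform the classical constant choice:

```python
import numpy as np

# Illustrative only: the paper's certified long-step patterns are specific
# sequences with proven guarantees for all smooth convex functions; this
# cyclic pattern is a made-up example on a single quadratic.
L_smooth = 1.0
A = np.diag(np.linspace(0.01, L_smooth, 20))  # Hessian of f(x) = 0.5 x^T A x

def f(x):
    return 0.5 * x @ A @ x

def run_gd(pattern, x0, n_iters):
    x = x0.copy()
    for k in range(n_iters):
        h = pattern[k % len(pattern)]         # cycle through the stepsizes
        x = x - h * (A @ x)                   # grad f(x) = A x
    return f(x)

x0 = np.ones(20)
f_const = run_gd([1.0 / L_smooth], x0, 200)                 # classical 1/L
f_long = run_gd([1.5 / L_smooth, 0.8 / L_smooth], x0, 200)  # average 1.15/L
print(f_const, f_long)   # the nonconstant pattern reaches a lower value
```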
PEPit: computer-assisted worst-case analyses of first-order optimization methods in Python
PEPit is a Python package that aims to simplify access to worst-case analyses of a large
family of first-order optimization methods possibly involving gradient, projection, proximal, or …
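PEPit casts worst-case analysis as a semidefinite program over an entire function class. As a much simpler stand-alone illustration of the same question (not using PEPit's machinery), one can restrict attention to one-dimensional quadratics, where the tight per-step contraction factor of gradient descent is known in closed form:

```python
import numpy as np

# Toy worst-case analysis (a numerical stand-in, not the PEPit SDP approach):
# for one gradient step on f(x) = (lam/2) x^2 with curvature lam in [mu, L],
# the contraction |x_1 - x*| / |x_0 - x*| equals |1 - h*lam|.  Maximizing
# over the class gives the tight factor max(|1 - h*mu|, |1 - h*L|).
mu, L, h = 0.1, 1.0, 1.0

lams = np.linspace(mu, L, 10001)           # sweep the function class
empirical = np.max(np.abs(1 - h * lams))   # worst case found by the sweep
analytic = max(abs(1 - h * mu), abs(1 - h * L))
print(empirical, analytic)                 # both equal 0.9 here
```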
A unified discretization framework for differential equation approach with Lyapunov arguments for convex optimization
The differential equation (DE) approach for convex optimization, which relates optimization
methods to specific continuous DEs with rate-revealing Lyapunov functionals, has gained …
Operator splitting performance estimation: Tight contraction factors and optimal parameter selection
We propose a methodology for studying the performance of common splitting methods
through semidefinite programming. We prove tightness of the methodology and demonstrate …
From Nesterov's estimate sequence to Riemannian acceleration
We propose the first global accelerated gradient method for Riemannian manifolds. Toward
establishing our results, we revisit Nesterov's estimate sequence technique and develop a …
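The Euclidean scheme that the estimate-sequence technique certifies can be sketched in a few lines (momentum (k-1)/(k+2), stepsize 1/L); the Riemannian construction in the paper is not reproduced here, this is only the flat-space baseline it generalizes:

```python
import numpy as np

# Minimal sketch of Nesterov's accelerated gradient method on an L-smooth
# convex quadratic, compared with plain gradient descent.  Illustrative only:
# the estimate-sequence machinery is the proof technique behind this scheme.
L_smooth = 1.0
A = np.diag(np.linspace(1e-3, L_smooth, 50))  # ill-conditioned Hessian

def f(x):
    return 0.5 * x @ A @ x

def grad(x):
    return A @ x

x0 = np.ones(50)
n = 200

# plain gradient descent
x = x0.copy()
for k in range(n):
    x = x - grad(x) / L_smooth
f_gd = f(x)

# Nesterov acceleration
x, x_prev = x0.copy(), x0.copy()
for k in range(1, n + 1):
    y = x + (k - 1) / (k + 2) * (x - x_prev)  # momentum extrapolation
    x_prev = x
    x = y - grad(y) / L_smooth                # gradient step at y
f_agd = f(x)

print(f_gd, f_agd)   # acceleration reaches a much smaller value
```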
Branch-and-bound performance estimation programming: A unified methodology for constructing optimal optimization methods
We present the Branch-and-Bound Performance Estimation Programming (BnB-PEP), a
unified methodology for constructing optimal first-order methods for convex and nonconvex …
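A toy analogue of "constructing an optimal method" is to tune a single parameter against its worst case. The brute-force sketch below (not BnB-PEP's branch-and-bound SDP machinery) recovers the classical optimal gradient-descent stepsize h* = 2/(mu + L):

```python
import numpy as np

# Hypothetical one-parameter version of method design: pick the stepsize h
# minimizing the worst-case contraction over curvatures lam in [mu, L].
# The known optimum is h* = 2/(mu + L), with factor (L - mu)/(L + mu).
mu, L = 0.1, 1.0

def worst_case(h):
    # worst contraction of x -> x - h*lam*x over lam in [mu, L]
    return max(abs(1 - h * mu), abs(1 - h * L))

hs = np.linspace(0.0, 2.0 / L, 20001)         # grid search over stepsizes
best_h = hs[np.argmin([worst_case(h) for h in hs])]
print(best_h, 2 / (mu + L))                   # both close to 1.8181...
print(worst_case(best_h), (L - mu) / (L + mu))
```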
A systematic approach to Lyapunov analyses of continuous-time models in convex optimization
First-order methods are often analyzed via their continuous-time models, where their worst-
case convergence properties are usually approached via Lyapunov functions. In this work …
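The classical example of such a rate-revealing Lyapunov function can be checked numerically. For gradient flow x'(t) = -grad f(x(t)) on a convex f, the functional E(t) = t (f(x(t)) - f*) + ||x(t) - x*||^2 / 2 is nonincreasing, which yields f(x(t)) - f* <= E(0)/t. This check is an illustration, not the paper's systematic procedure:

```python
import numpy as np

# Numerical check of the classical Lyapunov functional for gradient flow:
#     E(t) = t * (f(x(t)) - f*) + 0.5 * ||x(t) - x*||^2
# on a convex quadratic f(x) = 0.5 x^T A x (so f* = 0 at x* = 0),
# integrated with small forward-Euler steps.
A = np.diag([0.2, 1.0, 3.0])
x = np.array([1.0, -2.0, 0.5])
x_star = np.zeros(3)

def f(v):
    return 0.5 * v @ A @ v

dt, T = 1e-3, 5.0
E0 = 0.5 * np.dot(x - x_star, x - x_star)   # E(0): the t*(f - f*) term vanishes

E_vals = []
t = 0.0
while t < T:
    x = x - dt * (A @ x)                    # forward-Euler step of the flow
    t += dt
    E_vals.append(t * f(x) + 0.5 * np.dot(x - x_star, x - x_star))

print(f(x), E0 / T)   # rate certificate: f(x(T)) <= E(0)/T
```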
Convergence analysis of ODE models for accelerated first-order methods via positive semidefinite kernels
We propose a novel methodology that systematically analyzes ordinary differential equation
(ODE) models for first-order optimization methods by converting the task of proving …
Stochastic first-order methods: non-asymptotic and computer-aided analyses via potential functions
We provide a novel computer-assisted technique for systematically analyzing first-order
methods for optimization. In contrast with previous works, the approach is particularly suited …
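The deterministic special case of the potential-function technique (which the paper extends to stochastic methods) is easy to verify directly. For gradient descent with stepsize 1/L on an L-smooth convex f, the potential phi_k = k (f(x_k) - f*) + (L/2) ||x_k - x*||^2 is nonincreasing, which immediately gives f(x_k) - f* <= L ||x_0 - x*||^2 / (2k). This toy check is not the authors' tool:

```python
import numpy as np

# Deterministic sketch of a potential-function analysis: verify that
#     phi_k = k * (f(x_k) - f*) + (L/2) * ||x_k - x*||^2
# decreases along gradient descent with stepsize 1/L on a smooth convex
# quadratic f(x) = 0.5 x^T A x (f* = 0 at x* = 0).
L_smooth = 2.0
A = np.diag([0.1, 0.7, 2.0])                # eigenvalues bounded by L
x = np.array([3.0, -1.0, 2.0])

def f(v):
    return 0.5 * v @ A @ v

phis = [(L_smooth / 2) * np.dot(x, x)]      # phi_0 (the f term vanishes at k = 0)
for k in range(1, 101):
    x = x - (A @ x) / L_smooth              # gradient step
    phis.append(k * f(x) + (L_smooth / 2) * np.dot(x, x))

print(max(np.diff(phis)))                   # every difference is <= 0
```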