Acceleration methods

A d'Aspremont, D Scieur, A Taylor - Foundations and Trends® …, 2021 - nowpublishers.com
This monograph covers some recent advances in a range of acceleration techniques
frequently used in convex optimization. We first use quadratic optimization problems to …

An introduction to continuous optimization for imaging

A Chambolle, T Pock - Acta Numerica, 2016 - cambridge.org
A large number of imaging problems reduce to the optimization of a cost function, with
typical structural properties. The aim of this paper is to describe the state of the art in …

Fast optimization via inertial dynamics with closed-loop damping

H Attouch, RI Boţ, ER Csetnek - Journal of the European Mathematical …, 2022 - ems.press
In a real Hilbert space H, in order to develop fast optimization methods, we analyze the
asymptotic behavior, as time t tends to infinity, of a large class of autonomous dissipative …

An inertial forward-backward algorithm for monotone inclusions

DA Lorenz, T Pock - Journal of Mathematical Imaging and Vision, 2015 - Springer
In this paper, we propose an inertial forward-backward splitting algorithm to compute a zero
of the sum of two monotone operators, with one of the two operators being co-coercive. The …
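In the proximal-gradient special case (A = ∂g for a nonsmooth convex g, B = ∇f for a smooth f with co-coercive gradient), an inertial forward-backward iteration of this general shape can be sketched as follows. The fixed inertia value, step size, and toy ℓ₁-regularized least-squares problem are illustrative assumptions, not the parameter conditions derived in the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (componentwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inertial_forward_backward(grad_f, prox_g, x0, step, inertia=0.5, iters=200):
    # Inertial forward-backward iteration:
    #   y_k     = x_k + inertia * (x_k - x_{k-1})       (inertial extrapolation)
    #   x_{k+1} = prox_g(y_k - step * grad_f(y_k), step) (forward-backward step)
    x_prev = x = np.asarray(x0, float).copy()
    for _ in range(iters):
        y = x + inertia * (x - x_prev)
        x_prev, x = x, prox_g(y - step * grad_f(y), step)
    return x

# Toy problem: min_x 0.5 * ||Ax - b||^2 + lam * ||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: soft_threshold(v, lam * t)
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad_f
x_star = inertial_forward_backward(grad_f, prox_g, np.zeros(5), step=1.0 / L)
```

A point is a solution exactly when it is a fixed point of the (non-inertial) forward-backward map, which gives a simple convergence check: x ≈ prox_g(x − step·∇f(x), step).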

Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity

H Attouch, Z Chbani, J Peypouquet… - Mathematical Programming, 2018 - Springer
In a Hilbert space setting H, we study the fast convergence properties as t → +∞ of the
trajectories of the second-order differential equation ẍ(t) + (α/t) ẋ(t) + ∇Φ(x(t)) = g(t), where ∇ …
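A trajectory of this dynamic can be approximated numerically; the sketch below uses a semi-implicit Euler discretization with the perturbation g(t) ≡ 0 and the simple choice Φ(x) = ½‖x‖² (minimizer at the origin), both of which are assumptions made here for illustration. The vanishing damping coefficient α/t is the system's distinctive feature.

```python
import numpy as np

def avd_trajectory(grad_phi, x0, alpha=3.0, h=0.01, t0=1.0, steps=5000):
    # Semi-implicit Euler discretization of the inertial dynamic
    #   x''(t) + (alpha / t) x'(t) + grad_phi(x(t)) = 0
    # with asymptotically vanishing viscous damping alpha / t.
    x = np.asarray(x0, float).copy()
    v = np.zeros_like(x)               # velocity x'(t), started at rest
    t = t0
    for _ in range(steps):
        v += h * (-(alpha / t) * v - grad_phi(x))  # update velocity first
        x += h * v                                  # then position
        t += h
    return x

# Phi(x) = 0.5 * ||x||^2, so grad_phi is the identity; minimizer is 0.
x_final = avd_trajectory(lambda x: x, x0=[2.0, -1.0], alpha=3.0)
```

With α = 3 (the threshold value appearing in this line of work), the trajectory oscillates toward the minimizer with slowly decaying amplitude rather than converging monotonically.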

Convergence rates of inexact proximal-gradient methods for convex optimization

M Schmidt, N Roux, F Bach - Advances in neural …, 2011 - proceedings.neurips.cc
We consider the problem of optimizing the sum of a smooth convex function and a non-
smooth convex function using proximal-gradient methods, where an error is present in the …

The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than 1/k²

H Attouch, J Peypouquet - SIAM Journal on Optimization, 2016 - SIAM
The forward-backward algorithm is a powerful tool for solving optimization problems with an
additively separable and smooth plus nonsmooth structure. In the convex setting, a simple …
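For the smooth case (identity backward step), Nesterov's accelerated scheme with the classical t_k momentum sequence, whose O(1/k²) worst-case rate the paper sharpens, can be sketched as below. The ill-conditioned quadratic test function, step size, and iteration count are assumptions chosen for illustration.

```python
import numpy as np

def nesterov_gradient(grad, x0, step, iters=300):
    # Accelerated gradient method with the classical momentum sequence
    #   t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2,
    #   y_k     = x_k + ((t_k - 1) / t_{k+1}) (x_k - x_{k-1}).
    x_prev = x = np.asarray(x0, float).copy()
    t = 1.0
    for _ in range(iters):
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)   # extrapolation step
        x_prev, x, t = x, y - step * grad(y), t_next  # gradient step at y
    return x

# Ill-conditioned quadratic: f(x) = 0.5 * x^T diag(d) x, minimum value 0.
d = np.array([1.0, 100.0])
grad = lambda x: d * x
x_acc = nesterov_gradient(grad, [1.0, 1.0], step=1.0 / d.max())
```

The classical guarantee f(x_k) − min f ≤ 2L‖x₀ − x*‖²/(k+1)² bounds the suboptimality here by roughly 4·10⁻³ after 300 iterations, and in practice the iterates do considerably better.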

Efficiency of minimizing compositions of convex functions and smooth maps

D Drusvyatskiy, C Paquette - Mathematical Programming, 2019 - Springer
We consider global efficiency of algorithms for minimizing a sum of a convex function and a
composition of a Lipschitz convex function with a smooth map. The basic algorithm we rely …

Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping

PL Combettes, JC Pesquet - SIAM Journal on Optimization, 2015 - SIAM
This work proposes block-coordinate fixed point algorithms with applications to nonlinear
analysis and optimization in Hilbert spaces. The asymptotic analysis relies on a notion of …

Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3

H Attouch, Z Chbani, H Riahi - ESAIM: Control, Optimisation and …, 2019 - esaim-cocv.org
In a Hilbert space setting ℋ, given Φ : ℋ → ℝ a convex continuously differentiable function,
and α a positive parameter, we consider the inertial dynamic system with Asymptotic …