Acceleration methods
This monograph covers some recent advances in a range of acceleration techniques
frequently used in convex optimization. We first use quadratic optimization problems to …
An introduction to continuous optimization for imaging
A Chambolle, T Pock - Acta Numerica, 2016 - cambridge.org
A large number of imaging problems reduce to the optimization of a cost function, with
typical structural properties. The aim of this paper is to describe the state of the art in …
Fast optimization via inertial dynamics with closed-loop damping
In a real Hilbert space H, in order to develop fast optimization methods, we analyze the
asymptotic behavior, as time t tends to infinity, of a large class of autonomous dissipative …
An inertial forward-backward algorithm for monotone inclusions
In this paper, we propose an inertial forward-backward splitting algorithm to compute a zero
of the sum of two monotone operators, with one of the two operators being co-coercive. The …
Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
In a Hilbert space setting H, we study the fast convergence properties as t→+∞ of the
trajectories of the second-order differential equation ẍ(t) + (α/t) ẋ(t) + ∇Φ(x(t)) = g(t), where ∇ …
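The vanishing-damping dynamic quoted in this entry is the continuous-time counterpart of Nesterov's accelerated gradient method. A minimal sketch of the standard discretization, using the classical (k-1)/(k+α-1) momentum on an illustrative quadratic (the objective, step size, and iteration count are assumptions, not taken from the paper):

```python
import numpy as np

# Minimal sketch: Nesterov-style accelerated gradient, whose continuous-time
# limit is the inertial ODE  x''(t) + (alpha/t) x'(t) + grad Phi(x(t)) = 0.
def accelerated_gradient(grad, x0, step, alpha=3.0, iters=500):
    x = y = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        x_next = y - step * grad(y)               # gradient step at the extrapolated point
        momentum = (k - 1) / (k + alpha - 1)      # (k-1)/(k+alpha-1) mirrors the alpha/t damping
        y = x_next + momentum * (x_next - x)      # inertial extrapolation
        x = x_next
    return x

# Illustrative quadratic Phi(x) = 0.5 x^T A x with minimizer 0 (an assumption for the demo)
A = np.array([[2.0, 0.0], [0.0, 10.0]])
x_star = accelerated_gradient(lambda x: A @ x, x0=[5.0, -3.0], step=1.0 / 10.0)
```

With α = 3 this recovers the classical Nesterov scheme; the subcritical case α ≤ 3 studied below changes the attainable rate.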
Convergence rates of inexact proximal-gradient methods for convex optimization
We consider the problem of optimizing the sum of a smooth convex function and a non-
smooth convex function using proximal-gradient methods, where an error is present in the …
The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than 1/k²
H Attouch, J Peypouquet - SIAM Journal on Optimization, 2016 - SIAM
The forward-backward algorithm is a powerful tool for solving optimization problems with an
additively separable and smooth plus nonsmooth structure. In the convex setting, a simple …
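The forward-backward (proximal-gradient) scheme analyzed in several of these entries alternates an explicit gradient step on the smooth term with a proximal step on the nonsmooth term. A minimal sketch on a hypothetical ℓ1-regularized least-squares problem (the problem data, regularization weight, and iteration budget are illustrative assumptions; only the soft-thresholding prox of the ℓ1 norm is standard):

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (the "backward" step)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, step, iters=500):
    # Minimize 0.5 * ||Ax - b||^2 + lam * ||x||_1  (illustrative problem)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - b)                          # forward (gradient) step on the smooth part
        x = soft_threshold(x - step * g, step * lam)   # backward (proximal) step on the nonsmooth part
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5])   # noiseless synthetic data
L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the smooth gradient
x_hat = forward_backward(A, b, lam=0.1, step=1.0 / L)
```

Adding the inertial extrapolation from the entries above to this iteration yields the accelerated (FISTA-type) forward-backward method whose rate the Attouch-Peypouquet paper sharpens.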
Efficiency of minimizing compositions of convex functions and smooth maps
D Drusvyatskiy, C Paquette - Mathematical Programming, 2019 - Springer
We consider global efficiency of algorithms for minimizing a sum of a convex function and a
composition of a Lipschitz convex function with a smooth map. The basic algorithm we rely …
Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping
PL Combettes, JC Pesquet - SIAM Journal on Optimization, 2015 - SIAM
This work proposes block-coordinate fixed point algorithms with applications to nonlinear
analysis and optimization in Hilbert spaces. The asymptotic analysis relies on a notion of …
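A block-coordinate fixed-point iteration with random sweeping can be sketched as follows; here the updated map is a plain gradient-descent operator on a least-squares problem, an illustrative assumption rather than the general operators treated in the paper:

```python
import numpy as np

def block_coordinate_descent(A, b, blocks, step, iters=2000, seed=0):
    # Minimize 0.5 * ||Ax - b||^2, updating one randomly chosen coordinate
    # block per iteration (random sweeping); problem and block layout are
    # illustrative assumptions.
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        blk = blocks[rng.integers(len(blocks))]   # random sweeping over blocks
        g = A[:, blk].T @ (A @ x - b)             # partial gradient for this block
        x[blk] = x[blk] - step * g                # fixed-point (gradient) update on the block
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 6))
b = A @ np.ones(6)                               # noiseless target with solution x* = 1
L = np.linalg.norm(A, 2) ** 2
x_hat = block_coordinate_descent(A, b, blocks=[np.arange(0, 3), np.arange(3, 6)], step=1.0 / L)
```

The quasi-Fejér machinery in the paper is what guarantees almost-sure convergence of iterations like this when the updates are only stochastically activated.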
Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
In a Hilbert space setting ℋ, given Φ: ℋ → ℝ a convex continuously differentiable function,
and α a positive parameter, we consider the inertial dynamic system with Asymptotic …