Convergence of iterates for first-order optimization algorithms with inertia and Hessian driven damping

H Attouch, Z Chbani, J Fadili, H Riahi - Optimization, 2023 - Taylor & Francis
In a Hilbert space setting, for convex optimization, we show the convergence of the iterates
to optimal solutions for a class of accelerated first-order algorithms. They can be interpreted …
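
A rough sketch of the kind of inertial dynamic with Hessian-driven damping from which such algorithms are typically obtained by discretization (the damping parameters α and β below are generic placeholders, not the paper's):

\[
\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \beta\,\nabla^{2} f(x(t))\,\dot{x}(t) + \nabla f(x(t)) = 0,
\qquad \alpha > 0,\ \beta \ge 0,
\]

where the geometric term \beta\,\nabla^{2} f(x(t))\,\dot{x}(t) is commonly credited with attenuating the oscillations of the purely inertial dynamic.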

Continuous-time analysis of accelerated gradient methods via conservation laws in dilated coordinate systems

JJ Suh, G Roh, EK Ryu - International Conference on …, 2022 - proceedings.mlr.press
We analyze continuous-time models of accelerated gradient methods by deriving
conservation laws in dilated coordinate systems. Namely, instead of analyzing the dynamics …
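
For orientation, the standard continuous-time model of Nesterov's accelerated gradient method is the ODE

\[
\ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f(X(t)) = 0,
\]

and a dilated-coordinate analysis works with a time-dependent rescaling of the trajectory, schematically W(t) = t^{\gamma}\,(X(t) - x_{0}); the exponent γ and reference point x₀ are illustrative notation here, not the paper's specific choice.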

Fast convergence of dynamical ADMM via time scaling of damped inertial dynamics

H Attouch, Z Chbani, J Fadili, H Riahi - Journal of Optimization Theory and …, 2022 - Springer
In this paper, we propose, in a Hilbertian setting, a second-order continuous-time dynamical
system with fast convergence guarantees for solving structured convex minimization problems …
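
As a reminder of the discrete scheme whose continuous-time counterpart is at stake, classical ADMM for the structured problem \min_{x,y}\{ f(x) + g(y) : Ax + By = c \} alternates the updates below (the textbook method, not the paper's inertial dynamic):

\[
\begin{aligned}
x_{k+1} &\in \arg\min_{x} L_{\rho}(x, y_{k}, \lambda_{k}),\\
y_{k+1} &\in \arg\min_{y} L_{\rho}(x_{k+1}, y, \lambda_{k}),\\
\lambda_{k+1} &= \lambda_{k} + \rho\,(A x_{k+1} + B y_{k+1} - c),
\end{aligned}
\qquad
L_{\rho}(x, y, \lambda) = f(x) + g(y) + \langle \lambda,\, Ax + By - c\rangle + \tfrac{\rho}{2}\,\|Ax + By - c\|^{2}.
\]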

Improved convergence rates and trajectory convergence for primal-dual dynamical systems with vanishing damping

RI Boţ, DK Nguyen - Journal of Differential Equations, 2021 - Elsevier
In this work, we approach the minimization of a continuously differentiable convex function
under linear equality constraints by a second-order dynamical system with asymptotically …
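
A schematic version of such a primal-dual system, written for \min\{ f(x) : Ax = b \} with Lagrangian \mathcal{L}(x, \lambda) = f(x) + \langle \lambda, Ax - b\rangle (the paper's exact coefficients and extrapolation terms are omitted):

\[
\begin{aligned}
\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) &= -\,\nabla_{x}\mathcal{L}\big(x(t), \lambda(t)\big) = -\,\nabla f(x(t)) - A^{\top}\lambda(t),\\
\ddot{\lambda}(t) + \frac{\alpha}{t}\,\dot{\lambda}(t) &= \;\;\nabla_{\lambda}\mathcal{L}\big(x(t), \lambda(t)\big) = A x(t) - b,
\end{aligned}
\]

i.e. damped inertial descent in the primal variable and ascent in the dual one; the systems actually analyzed typically evaluate the Lagrangian gradients at extrapolated points, which is essential for the convergence analysis.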

Fast convex optimization via time scale and averaging of the steepest descent

H Attouch, RI Bot, DK Nguyen - arXiv preprint arXiv:2208.08260, 2022 - arxiv.org
In a Hilbert setting, we develop a gradient-based dynamic approach for the fast solution of convex
optimization problems. By applying time scaling, averaging, and perturbation techniques to …
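
The two ingredients in the title can be sketched as follows: starting from the steepest descent flow z'(s) = -\nabla f(z(s)) and an increasing time scale s = \tau(t), the rescaled trajectory y(t) = z(\tau(t)) satisfies, by the chain rule,

\[
\dot{y}(t) = -\,\dot{\tau}(t)\,\nabla f\big(y(t)\big),
\]

and the accelerated dynamics are then produced by an additional averaging step applied to this rescaled trajectory (the paper's specific averaging construction is not reproduced here).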

A control-theoretic perspective on optimal high-order optimization

T Lin, MI Jordan - Mathematical Programming, 2022 - Springer
We provide a control-theoretic perspective on optimal tensor algorithms for minimizing a
convex function in a finite-dimensional Euclidean space. Given a function Φ: R^d → R …
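
For orientation, high-order (tensor) methods of order p are usually built on a regularized Taylor step of the form below, where Φ_p(y; x_k) denotes the p-th order Taylor expansion of Φ around x_k and M > 0 is a regularization constant whose normalization varies across the literature (a generic template, not the paper's specific control-theoretic scheme):

\[
x_{k+1} \in \arg\min_{y \in \mathbb{R}^{d}} \Big\{ \Phi_{p}(y; x_{k}) + M\,\|y - x_{k}\|^{\,p+1} \Big\};
\]

for p = 2 this recovers cubically regularized Newton steps.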

Strong Convergence of Trajectories via Inertial Dynamics Combining Hessian-Driven Damping and Tikhonov Regularization for General Convex Minimizations

AC Bagy, Z Chbani, H Riahi - Numerical Functional Analysis and …, 2023 - Taylor & Francis
Let H be a real Hilbert space, and f: H → R be a convex, twice differentiable function whose
solution set argmin f is nonempty. We investigate the long-time behavior of the trajectories of …
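
A generic way of combining the ingredients named in the title is the dynamic below, with vanishing viscous damping α/t, Hessian-driven damping, and a Tikhonov term ε(t)x(t) whose slow decay is what yields strong convergence towards the minimum-norm minimizer (the coefficients are illustrative, not necessarily the paper's):

\[
\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \beta\,\nabla^{2} f(x(t))\,\dot{x}(t) + \nabla f(x(t)) + \varepsilon(t)\,x(t) = 0,
\qquad \varepsilon(t) \to 0 .
\]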

Fast convex optimization via closed-loop time scaling of gradient dynamics

H Attouch, RI Bot, DK Nguyen - arXiv preprint arXiv:2301.00701, 2023 - arxiv.org
In a Hilbert setting, for convex differentiable optimization, we develop a general framework
for adaptive accelerated gradient methods. They are based on damped inertial dynamics …
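
To fix ideas, time scaling of a gradient dynamic attaches a coefficient b(t) to the gradient term, as in \dot{x}(t) = -\,b(t)\,\nabla f(x(t)) for the first-order flow; "closed-loop" means that b is not prescribed in advance as a function of t but is determined by a feedback law involving the current state, schematically

\[
b(t) = \phi\big(x(t), \nabla f(x(t))\big).
\]

This is purely illustrative of the open-loop/closed-loop distinction; the paper's damped inertial systems and its specific feedback laws are not reproduced here.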

Inertial primal-dual dynamics with damping and scaling for linearly constrained convex optimization problems

X He, R Hu, YP Fang - Applicable Analysis, 2023 - Taylor & Francis
We propose an inertial primal-dual dynamic with damping and scaling coefficients, involving
inertial terms for both the primal and dual variables, for a linearly constrained convex …
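
The role of the scaling coefficient is already visible in the unconstrained inertial dynamic \ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + b(t)\,\nabla f(x(t)) = 0: under a growth condition of the type t\,\dot{b}(t) \le (\alpha - 3)\,b(t), the known value estimate improves from O(1/t²) to

\[
f(x(t)) - \min_{\mathcal H} f = \mathcal{O}\!\left(\frac{1}{t^{2}\, b(t)}\right),
\]

and the paper develops an analogous mechanism, together with inertial damping, in a primal-dual system for linearly constrained problems.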

Tikhonov regularization of a perturbed heavy ball system with vanishing damping

CD Alecsa, SC László - SIAM Journal on Optimization, 2021 - SIAM
This paper examines a perturbed heavy ball system with vanishing damping that contains a
Tikhonov regularization term, in connection with the minimization problem of a convex Fréchet …
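
A minimal numerical sketch, under an assumed model (not the paper's exact system): a semi-implicit Euler discretization of a heavy ball flow with vanishing damping α/t and a Tikhonov term ε(t)x, applied to a convex quadratic whose minimizers form a whole subspace, so that the Tikhonov term steers the trajectory towards the minimum-norm minimizer.

    import numpy as np

    # Assumed model:  x'' + (alpha/t) x' + grad f(x) + eps(t) x = 0
    # with f(x) = 0.5 * ||P x||^2;  argmin f = {x : x1 = x2 = 0}, so the
    # minimum-norm minimizer is the origin.
    P = np.diag([1.0, 1.0, 0.0])
    grad_f = lambda x: P @ x              # P is symmetric and idempotent
    alpha = 3.0
    eps = lambda t: 1.0 / t               # vanishing Tikhonov parameter

    x = np.array([1.0, -2.0, 5.0])        # start away from the solution set
    v = np.zeros(3)                       # velocity
    h, t = 4e-3, 1.0                      # step size, initial time
    for _ in range(500_000):
        a = -(alpha / t) * v - grad_f(x) - eps(t) * x
        v = v + h * a                     # semi-implicit (symplectic) Euler
        x = x + h * v
        t += h

    print(x)  # the third coordinate is driven towards 0 (minimum-norm selection)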