Fast optimization via inertial dynamics with closed-loop damping
In a real Hilbert space H, in order to develop fast optimization methods, we analyze the
asymptotic behavior, as time t tends to infinity, of a large class of autonomous dissipative …
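For orientation, a representative inertial system with state-dependent (closed-loop) damping can be written as follows; this is a generic template in my own notation, not necessarily the exact class of dynamics studied in the paper:
\[
  \ddot{x}(t) + \gamma\big(x(t),\dot{x}(t)\big)\,\dot{x}(t) + \nabla f\big(x(t)\big) = 0 ,
\]
where the damping coefficient \(\gamma\) is chosen as a feedback of the trajectory (for instance through \(\|\dot{x}(t)\|\) or \(f(x(t)) - \min f\)) rather than as a prescribed function of time, which is what "closed-loop" refers to.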
From the Ravine method to the Nesterov method and vice versa: a dynamical system perspective
We revisit the Ravine method of Gelfand and Tsetlin from a dynamical system perspective,
study its convergence properties, and highlight its similarities and differences with the …
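As a rough illustration of the relationship discussed here, the sketch below implements plain Nesterov acceleration and the Ravine method for a differentiable f: both interleave an extrapolation step and a gradient step, only in opposite order. The function names, the momentum schedule k/(k+3), and the quadratic test are illustrative choices, not taken from the paper.

import numpy as np

def nag(grad, x0, step, n_iter, momentum):
    # Nesterov: extrapolate from the last two iterates, then take a gradient step there.
    x_prev, x = x0.copy(), x0.copy()
    for k in range(n_iter):
        y = x + momentum(k) * (x - x_prev)
        x_prev, x = x, y - step * grad(y)
    return x

def ravine(grad, y0, step, n_iter, momentum):
    # Ravine: take the gradient step first, then extrapolate from the two gradient-step points.
    x_prev, y = y0.copy(), y0.copy()
    for k in range(n_iter):
        x = y - step * grad(y)
        y = x + momentum(k + 1) * (x - x_prev)
        x_prev = x
    return y

# Toy quadratic check (illustrative only).
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
mom = lambda k: k / (k + 3)
x0 = np.array([5.0, 5.0])
print(nag(grad, x0, 0.05, 300, mom), ravine(grad, x0, 0.05, 300, mom))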
Fast convex optimization via time scale and averaging of the steepest descent
H Attouch, R Ioan Boţ… - Mathematics of Operations …, 2024 - pubsonline.informs.org
In a Hilbert setting, we develop a gradient-based dynamic approach for fast solving convex
optimization problems. By applying time scaling, averaging, and perturbation techniques to …
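The two basic ingredients named in the snippet can be made concrete with the chain-rule identity behind time scaling (the symbol \(\theta\) for the time scale is mine; this is not the paper's precise construction):
\[
  \dot{z}(\tau) = -\nabla f\big(z(\tau)\big)
  \quad\text{(steepest descent)},
  \qquad
  \frac{d}{dt}\, z\big(\theta(t)\big) = -\dot{\theta}(t)\,\nabla f\big(z(\theta(t))\big)
  \quad\text{(after the time change } \tau = \theta(t)\text{)} .
\]
Averaging such rescaled trajectories is then what produces the second-order inertial dynamics in this line of work.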
Inertial proximal gradient methods with Bregman regularization for a class of nonconvex optimization problems
This paper proposes an inertial Bregman proximal gradient method for minimizing the sum
of two possibly nonconvex functions. This method includes two different inertial steps and …
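A minimal sketch of an inertial proximal gradient step, assuming the Euclidean kernel (so the Bregman proximal map reduces to the ordinary prox) and an l1 regularizer; the paper's method uses two inertial steps and a general Bregman distance, which this single-extrapolation special case does not reproduce.

import numpy as np

def soft_threshold(v, t):
    # Proximal map of t * ||.||_1 in the Euclidean geometry.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_prox_grad(grad_f, lam, x0, step, beta, n_iter):
    # One-extrapolation inertial forward-backward method for f(x) + lam * ||x||_1.
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(n_iter):
        y = x + beta * (x - x_prev)                            # inertial extrapolation
        x_prev = x
        x = soft_threshold(y - step * grad_f(y), step * lam)   # forward-backward step
    return x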
Solving convex optimization problems via a second order dynamical system with implicit Hessian damping and Tikhonov regularization
SC László - Computational Optimization and Applications, 2024 - Springer
This paper deals with a second order dynamical system with a Tikhonov regularization term
in connection to the minimization problem of a convex Fréchet differentiable function. The …
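A representative form of such a system (with generic symbols \(\alpha\), \(\beta(t)\), \(\epsilon(t)\); not necessarily the exact dynamics of the paper) is
\[
  \ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t)
  + \nabla f\big(x(t) + \beta(t)\,\dot{x}(t)\big)
  + \epsilon(t)\,x(t) = 0 .
\]
The first-order expansion \(\nabla f(x + \beta\dot{x}) \approx \nabla f(x) + \beta\,\nabla^2 f(x)\,\dot{x}\) explains the name "implicit" Hessian damping, and \(\epsilon(t)\,x(t)\) is the Tikhonov regularization term.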
Gradient Norm Minimization of Nesterov Acceleration:
In the history of first-order algorithms, Nesterov's accelerated gradient descent (NAG) is one
of the milestones. However, the cause of the acceleration has been a mystery for a long …
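One standard lens on this question is the continuous-time limit of NAG identified by Su, Boyd and Candès,
\[
  \ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f\big(X(t)\big) = 0 ,
\]
whose Lyapunov analysis reproduces the O(1/k^2) decay of the function values; the title points to the companion question of how fast the gradient norm decays along the NAG iterates.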
On the effect of perturbations in first-order optimization methods with inertia and Hessian driven damping
Second-order continuous-time dissipative dynamical systems with viscous and Hessian
driven damping have inspired effective first-order algorithms for solving convex optimization …
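A representative perturbed system of this type (the symbols \(\alpha\), \(\beta\), \(e(t)\) are generic, not the paper's) is
\[
  \ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t)
  + \beta\,\nabla^2 f\big(x(t)\big)\,\dot{x}(t)
  + \nabla f\big(x(t)\big) = e(t) ,
\]
with \(\tfrac{\alpha}{t}\,\dot{x}\) the viscous damping, \(\beta\,\nabla^2 f(x)\,\dot{x}\) the Hessian-driven damping, and \(e(t)\) the perturbation or error term whose effect is being quantified.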
On the strong convergence of the trajectories of a Tikhonov regularized second order dynamical system with asymptotically vanishing damping
SC László - Journal of Differential Equations, 2023 - Elsevier
This paper deals with a second order dynamical system with vanishing damping that
contains a Tikhonov regularization term, in connection to the minimization problem of a …
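A representative system of this kind (again with generic symbols, not necessarily the paper's exact setting) is
\[
  \ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \nabla f\big(x(t)\big) + \epsilon(t)\,x(t) = 0 ,
  \qquad \epsilon(t) \to 0 \ \text{as } t \to \infty .
\]
In this literature, the typical conclusion is that when \(\epsilon(t)\) vanishes slowly enough, the trajectory converges strongly to the minimum-norm minimizer of f.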
An SDE perspective on stochastic inertial gradient dynamics with time-dependent viscosity and geometric damping
Our approach builds on the close link between continuous dissipative dynamical systems
and optimization algorithms. We aim to solve convex minimization problems by means of …
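A representative stochastic counterpart (with my notation \(\gamma(t)\), \(\beta\), \(\sigma(t)\), \(W\); not necessarily the paper's exact model) is the SDE system
\[
\begin{aligned}
  dX(t) &= V(t)\,dt, \\
  dV(t) &= -\Big(\gamma(t)\,V(t) + \beta\,\nabla^2 f\big(X(t)\big)\,V(t) + \nabla f\big(X(t)\big)\Big)\,dt + \sigma(t)\,dW(t) ,
\end{aligned}
\]
where \(\gamma(t)\) is the time-dependent viscosity, the Hessian term is the geometric damping, and \(W\) is a Brownian motion modelling gradient noise.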
Fast convex optimization via closed-loop time scaling of gradient dynamics
In a Hilbert setting, for convex differentiable optimization, we develop a general framework
for adaptive accelerated gradient methods. They are based on damped inertial dynamics …
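"Closed-loop" here contrasts with a time scaling prescribed in advance: schematically (this is a generic rendering, not the paper's feedback law),
\[
  \dot{x}(t) + \delta(t)\,\nabla f\big(x(t)\big) = 0 ,
  \qquad \delta(t) \ \text{chosen as a feedback of the current state, e.g. through } f(x(t)) - \min f \ \text{or } \|\nabla f(x(t))\| ,
\]
and the adaptive accelerated gradient methods described in the snippet are then derived from damped inertial dynamics built on such rescaled flows.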