Accelerated gradient methods for geodesically convex optimization: Tractable algorithms and convergence analysis
We propose computationally tractable accelerated first-order methods for Riemannian
optimization, extending the Nesterov accelerated gradient (NAG) method. For both …
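The snippet cuts off before the update rule, so as a hedged illustration, here is a minimal NAG-style Riemannian loop on the unit sphere. The helper names (`sphere_exp`, `sphere_proj`, `riemannian_nag`), the projection-based momentum "transport," and the step sizes are all assumptions made for this sketch, not the paper's algorithm.

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x along v."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        return x
    return np.cos(n) * x + np.sin(n) * (v / n)

def sphere_proj(x, g):
    """Project an ambient vector g onto the tangent space at x."""
    return g - np.dot(g, x) * x

def riemannian_nag(grad_f, x0, alpha=0.02, beta=0.8, iters=300):
    """NAG-style loop: evaluate the gradient at a look-ahead point, then
    step; momentum is (crudely) transported by re-projection after each move."""
    x = x0 / np.linalg.norm(x0)
    v = np.zeros_like(x)
    for _ in range(iters):
        y = sphere_exp(x, beta * v)                  # look-ahead point
        g = sphere_proj(y, grad_f(y))                # Riemannian gradient at y
        v = sphere_proj(x, beta * v) - alpha * sphere_proj(x, g)
        x = sphere_exp(x, v)                         # take the step
        v = sphere_proj(x, v)                        # re-project ("transport")
    return x

# Toy use: leading eigenvector of a symmetric A via f(x) = -x^T A x.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)); A = A + A.T
x_star = riemannian_nag(lambda x: -2.0 * A @ x, rng.standard_normal(5))
```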
Negative curvature obstructs acceleration for strongly geodesically convex optimization, even with exact first-order oracles
C Criscitiello, N Boumal - Conference on Learning Theory, 2022 - proceedings.mlr.press
Hamilton and Moitra (2021) showed that, in certain regimes, it is not possible to accelerate
Riemannian gradient descent in the hyperbolic plane if we restrict ourselves to algorithms …
Momentum Stiefel optimizer, with applications to suitably-orthogonal attention, and optimal transport
The problem of optimization on the Stiefel manifold, i.e., minimizing functions of (not necessarily
square) matrices that satisfy orthogonality constraints, has been extensively studied. Yet, a …
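To make "orthogonality constraints" concrete, here is a generic projected-and-retracted gradient step on the Stiefel manifold St(n, p) = {X : X^T X = I}. This is a standard baseline, not the momentum optimizer the paper constructs; the function names and step size are choices made for the sketch.

```python
import numpy as np

def stiefel_proj(X, G):
    """Project an ambient gradient G onto the tangent space at X (X^T X = I)."""
    sym = (X.T @ G + G.T @ X) / 2.0
    return G - X @ sym

def stiefel_retract(X, V):
    """QR-based retraction: map X + V back onto the Stiefel manifold."""
    Q, R = np.linalg.qr(X + V)
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)  # fix column signs (0 -> +1)

# Retracted gradient descent for f(X) = -trace(X^T A X) with X in St(6, 2),
# i.e., the leading 2-dimensional eigenspace of a symmetric A.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6)); A = A + A.T
X = np.linalg.qr(rng.standard_normal((6, 2)))[0]
for _ in range(100):
    G = -2.0 * A @ X                               # Euclidean gradient
    X = stiefel_retract(X, -0.1 * stiefel_proj(X, G))
assert np.allclose(X.T @ X, np.eye(2), atol=1e-8)  # constraint holds throughout
```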
Riemannian accelerated gradient methods via extrapolation
In this paper, we propose a convergence acceleration scheme for general Riemannian
optimization problems by extrapolating iterates on manifolds. We show that when the …
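The "extrapolating iterates on manifolds" idea admits a compact template (notation mine, assuming Exp/Log maps are well defined near the iterates): pull recent iterates into one tangent space, extrapolate linearly, and map back.

```latex
\bar{x}_k \;=\; \operatorname{Exp}_{x_k}\!\Bigg( \sum_{i=1}^{m} \gamma_i \, \operatorname{Log}_{x_k}(x_{k-i}) \Bigg)
```

Here the weights \(\gamma_i\) would come from a Euclidean extrapolation rule (e.g., Richardson- or Anderson-type) applied to the tangent-space vectors; the specific weights the paper uses are not recoverable from the snippet.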
Accelerated optimization on Riemannian manifolds via discrete constrained variational integrators
V Duruisseaux, M Leok - Journal of Nonlinear Science, 2022 - Springer
A variational formulation for accelerated optimization on normed vector spaces was recently
introduced in Wibisono et al. (PNAS 113: E7351–E7358, 2016), and later generalized to the …
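For reference, the variational formulation being cited is built on the Bregman Lagrangian of Wibisono et al. (2016):

```latex
\mathcal{L}(x, v, t) \;=\; e^{\alpha_t + \gamma_t} \Big( D_h\!\big(x + e^{-\alpha_t} v,\; x\big) \;-\; e^{\beta_t} f(x) \Big),
\qquad
D_h(y, x) = h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle
```

Its Euler–Lagrange equations generate accelerated dynamics under the ideal scaling conditions \(\dot{\beta}_t \le e^{\alpha_t}\) and \(\dot{\gamma}_t = e^{\alpha_t}\); the works listed here discretize such flows on manifolds.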
Adaptive Hamiltonian variational integrators and applications to symplectic accelerated optimization
V Duruisseaux, J Schmitt, M Leok - SIAM Journal on Scientific Computing, 2021 - SIAM
It is well known that symplectic integrators lose their near-energy-preservation properties
when variable time-steps are used. The most common approach to combining adaptive time …
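The adaptive-step device alluded to here is a Poincaré-type time transformation (notation mine): rescale time by a monitor function \(g\) and integrate an extended Hamiltonian with fixed steps in the new time \(\tau\).

```latex
\frac{\mathrm{d}t}{\mathrm{d}\tau} = g(q), \qquad
\bar{\mathcal{H}}(q, t, p, p_t) \;=\; g(q)\,\big( \mathcal{H}(q, p) + p_t \big),
\qquad p_t(0) = -\mathcal{H}(q_0, p_0)
```

A fixed-step symplectic integrator in \(\tau\) then takes variable steps in \(t\) while remaining symplectic in the extended phase space \((q, t, p, p_t)\), sidestepping the energy-drift problem described above.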
Practical perspectives on symplectic accelerated optimization
V Duruisseaux, M Leok - Optimization Methods and Software, 2023 - Taylor & Francis
Geometric numerical integration has recently been exploited to design symplectic
accelerated optimization algorithms by simulating the Bregman Lagrangian and Hamiltonian …
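As a toy version of "simulating" such dynamics, here is a semi-implicit (symplectic-Euler-style) discretization of the classical NAG ODE \(\ddot{x} + (3/t)\dot{x} + \nabla f(x) = 0\). The function name, step size, and start time are arbitrary choices for the sketch, and because of the damping term the scheme is at best conformally symplectic, not symplectic in the strict sense.

```python
import numpy as np

def simulate_nag_ode(grad_f, x0, h=0.01, steps=2000, t0=1.0):
    """Semi-implicit Euler for x'' + (3/t) x' + grad f(x) = 0:
    update the velocity first, then use the new velocity for the position."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    t = t0
    for _ in range(steps):
        v -= h * ((3.0 / t) * v + grad_f(x))   # damped momentum update
        x += h * v                             # position uses the updated v
        t += h
    return x

# Quadratic toy problem f(x) = 0.5 * ||x||^2, so grad f(x) = x.
x_final = simulate_nag_ode(lambda x: x, [2.0, -1.0])
```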
Time-adaptive Lagrangian variational integrators for accelerated optimization on manifolds
V Duruisseaux, M Leok - arXiv preprint arXiv:2201.03774, 2022 - arxiv.org
A variational framework for accelerated optimization was recently introduced on normed
vector spaces and Riemannian manifolds in Wibisono et al. (2016) and Duruisseaux and …
Variational symplectic accelerated optimization on Lie groups
There has been significant interest in generalizations of the Nesterov accelerated gradient
descent algorithm due to its improved performance guarantee compared to the standard …
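The "improved performance guarantee" is the standard Euclidean comparison: for convex, \(L\)-smooth \(f\), gradient descent and NAG achieve

```latex
f(x_k) - f^\star = O\!\left(\frac{L\,\|x_0 - x^\star\|^2}{k}\right) \ \text{(gradient descent)},
\qquad
f(x_k) - f^\star = O\!\left(\frac{L\,\|x_0 - x^\star\|^2}{k^2}\right) \ \text{(NAG)}
```

with the \(O(1/k^2)\) rate optimal among first-order methods in this class; the Lie-group work asks how much of this gap carries over to the manifold setting.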