Accelerated gradient methods for geodesically convex optimization: Tractable algorithms and convergence analysis
We propose computationally tractable accelerated first-order methods for Riemannian
optimization, extending the Nesterov accelerated gradient (NAG) method. For both …
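As background for this entry: the classical vector-space NAG iteration that the paper generalizes. A minimal sketch (step size and the k/(k+3) momentum schedule are the standard textbook choices, not details drawn from this paper); the Riemannian extension replaces these linear updates with exponential maps or retractions.

    import numpy as np

    def nag(grad_f, x0, eta, n_iters):
        # Nesterov accelerated gradient with the k/(k+3) momentum schedule.
        x, y = x0.copy(), x0.copy()
        for k in range(n_iters):
            x_next = y - eta * grad_f(y)               # gradient step from the lookahead point
            y = x_next + (k / (k + 3)) * (x_next - x)  # momentum extrapolation
            x = x_next
        return x

    # Example: minimize f(x) = ||x||^2, whose gradient is 2x.
    x_min = nag(lambda y: 2.0 * y, np.ones(5), eta=0.1, n_iters=200)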
Lie group forced variational integrator networks for learning and control of robot systems
Incorporating prior knowledge of physics laws and structural properties of dynamical
systems into the design of deep learning architectures has proven to be a powerful …
A variational formulation of accelerated optimization on Riemannian manifolds
V Duruisseaux, M Leok - SIAM Journal on Mathematics of Data Science, 2022 - SIAM
It was shown recently by W. Su, S. Boyd, and E. Candès [J. Mach. Learn. Res., 17 (2016),
pp. 1–43] that Nesterov's accelerated gradient method for minimizing a smooth convex …
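The Su-Boyd-Candès result the snippet refers to identifies the continuous-time limit of Nesterov's method as the second-order ODE

\[
  \ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f\big(X(t)\big) = 0,
\]

along whose solutions $f(X(t)) - f(x^\star) = O(1/t^2)$, matching the discrete method's accelerated rate.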
Adaptive Hamiltonian variational integrators and applications to symplectic accelerated optimization
V Duruisseaux, J Schmitt, M Leok - SIAM Journal on Scientific Computing, 2021 - SIAM
It is well known that symplectic integrators lose their near-energy-preservation properties
when variable time steps are used. The most common approach to combining adaptive time …
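The standard device for this problem, and presumably what the truncated sentence goes on to name, is a Poincaré transformation in extended phase space (the notation below is generic, not necessarily the paper's): given a monitor function $g > 0$ encoding the desired step-size variation, one integrates

\[
  \bar{H}(q, t, p, p_t) = g(q, t)\big(H(q, p, t) + p_t\big), \qquad p_t(0) = -H\big(q(0), p(0), 0\big),
\]

with fixed steps in a fictive time $\tau$. Since $\bar{H}$ is again Hamiltonian, a symplectic integrator applies, and the fixed fictive steps $\Delta\tau$ realize variable physical steps $\Delta t \approx g\,\Delta\tau$.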
Generalizing adam to manifolds for efficiently training transformers
B Brantner - arXiv preprint arXiv:2305.16901, 2023 - arxiv.org
One of the primary reasons behind the success of neural networks has been the emergence
of an array of new, highly successful optimizers, perhaps most importantly the Adam …
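For reference, the Euclidean Adam update being generalized; a minimal sketch with the usual default hyperparameters (assumptions of mine, not taken from the paper). A manifold version must replace the final subtraction with a retraction or geodesic step and transport the moment estimates between tangent spaces, which is where the snippet cuts off.

    import numpy as np

    def adam_step(x, g, m, v, k, eta=1e-3, b1=0.9, b2=0.999, eps=1e-8):
        m = b1 * m + (1 - b1) * g          # first-moment (mean) estimate
        v = b2 * v + (1 - b2) * g * g      # second-moment (uncentered variance) estimate
        m_hat = m / (1 - b1 ** (k + 1))    # bias corrections for zero initialization
        v_hat = v / (1 - b2 ** (k + 1))
        # Euclidean update: the step a manifold variant replaces with a retraction.
        x = x - eta * m_hat / (np.sqrt(v_hat) + eps)
        return x, m, v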
Practical perspectives on symplectic accelerated optimization
V Duruisseaux, M Leok - Optimization Methods and Software, 2023 - Taylor & Francis
Geometric numerical integration has recently been exploited to design symplectic
accelerated optimization algorithms by simulating the Bregman Lagrangian and Hamiltonian …
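The Bregman Lagrangian mentioned in the snippet is, in the notation of Wibisono, Wilson, and Jordan (2016),

\[
  \mathcal{L}(X, V, t) = e^{\alpha_t + \gamma_t}\Big(D_h\big(X + e^{-\alpha_t}V,\, X\big) - e^{\beta_t} f(X)\Big),
\]

where $D_h$ is the Bregman divergence of a convex generating function $h$; under the ideal scaling conditions $\dot{\beta}_t \le e^{\alpha_t}$ and $\dot{\gamma}_t = e^{\alpha_t}$, solutions of its Euler-Lagrange equations satisfy $f(X(t)) - f(x^\star) = O(e^{-\beta_t})$. The symplectic algorithms referred to above discretize the associated Hamiltonian flow.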
Time-adaptive Lagrangian variational integrators for accelerated optimization on manifolds
V Duruisseaux, M Leok - arXiv preprint arXiv:2201.03774, 2022 - arxiv.org
A variational framework for accelerated optimization was recently introduced on normed
vector spaces and Riemannian manifolds in Wibisono et al. (2016) and Duruisseaux and …
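For concreteness, the polynomial subfamily of that framework (standard parameters quoted from Wibisono et al. (2016) for context, not from this paper) is

\[
  \alpha_t = \log p - \log t, \qquad \beta_t = p\log t + \log C, \qquad \gamma_t = p\log t,
\]

which yields the convergence rate $f(X(t)) - f(x^\star) = O(1/t^p)$; time-adaptivity then amounts to a transformation of $t$ that lets a variational integrator take fixed steps in the transformed time.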
Projected Neural Differential Equations for Learning Constrained Dynamics
Neural differential equations offer a powerful approach for learning dynamics from data.
However, they do not impose known constraints that should be obeyed by the learned …
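One natural reading of the title (an assumption of mine; the snippet ends before the method is stated): orthogonally project the learned vector field onto the tangent space of the constraint set $\{x : g(x) = 0\}$ so that the constraint is preserved by the flow. A minimal sketch:

    import numpy as np

    def projected_field(f_theta, jac_g, x):
        # Project the learned field f_theta(x) onto the tangent space of
        # {x : g(x) = 0}, using the constraint Jacobian J of shape (m, d).
        v = f_theta(x)
        J = jac_g(x)
        lam = np.linalg.solve(J @ J.T, J @ v)  # least-squares multipliers
        return v - J.T @ lam                   # tangential part: J @ (result) = 0

Integrating the projected field with any ODE solver then keeps $g(x(t))$ constant up to the solver's discretization error.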
A Symplectic Analysis of Alternating Mirror Descent
J Katona, X Wang, A Wibisono - arXiv preprint arXiv:2405.03472, 2024 - arxiv.org
Motivated by understanding the behavior of the Alternating Mirror Descent (AMD) algorithm
for bilinear zero-sum games, we study the discretization of continuous-time Hamiltonian flow …
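A minimal sketch of AMD in its usual setting, the bilinear game $\min_x \max_y x^\top A y$ over probability simplices with the entropic mirror map (dimensions and step size here are illustrative assumptions). The alternation is the point of the symplectic analysis: the $y$ update sees the already-updated $x$, making the scheme a symplectic-Euler-type discretization of the Hamiltonian flow rather than a plain Euler one.

    import numpy as np

    def amd(A, eta=0.1, n_iters=1000):
        m, n = A.shape
        x, y = np.ones(m) / m, np.ones(n) / n  # uniform initial strategies
        for _ in range(n_iters):
            x = x * np.exp(-eta * (A @ y))     # exponentiated-gradient descent step
            x /= x.sum()
            y = y * np.exp(eta * (A.T @ x))    # ascent step uses the updated x
            y /= y.sum()
        return x, y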
Nesterov acceleration for Riemannian optimization
In this paper, we generalize the Nesterov accelerated gradient (NAG) method to solve
Riemannian optimization problems in a computationally tractable manner. The iteration …
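To make "computationally tractable" concrete, here is the basic Riemannian machinery on the unit sphere from which such iterations are built (background only; the paper's actual coupled iteration with momentum is what the snippet truncates):

    import numpy as np

    def sphere_exp(x, v):
        # Exponential map on the unit sphere: follow the geodesic from x
        # in the tangent direction v for unit time.
        nv = np.linalg.norm(v)
        if nv < 1e-12:
            return x
        return np.cos(nv) * x + np.sin(nv) * (v / nv)

    def riemannian_grad_step(egrad_f, x, eta):
        # Project the Euclidean gradient onto the tangent space at x,
        # then move along the geodesic in the negative-gradient direction.
        g = egrad_f(x)
        rg = g - (x @ g) * x          # Riemannian gradient on the sphere
        return sphere_exp(x, -eta * rg)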