Tutorial on amortized optimization

B Amos - Foundations and Trends® in Machine Learning, 2023 - nowpublishers.com
Optimization is a ubiquitous modeling tool and is often deployed in settings that
repeatedly solve similar instances of the same problem. Amortized optimization methods …
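As a minimal illustration of the amortization idea (a toy sketch of my own, not code from the survey): instead of running an optimizer from scratch on each problem instance, one trains a model that maps problem parameters directly to an approximate solution, using the objective value itself as the training signal. Here the problem family, the model `x_hat = theta * (b / a)`, and all parameter names are hypothetical.

```python
import numpy as np

# Toy sketch of objective-based amortized optimization (illustrative,
# not from the survey): learn a model x_hat(a, b) that directly predicts
# argmin_x f(x; a, b) for a family of 1-D quadratics
#   f(x; a, b) = a * x**2 + b * x,  a > 0,  minimizer x* = -b / (2a).
# We fit the single parameter theta of the model x_hat = theta * (b / a)
# by stochastic gradient descent on the objective value itself -- no
# ground-truth solutions are ever used as labels.

rng = np.random.default_rng(0)
theta = 0.0
lr = 0.05
for _ in range(2000):
    a = rng.uniform(0.5, 2.0)   # sample a random problem instance
    b = rng.uniform(-1.0, 1.0)
    r = b / a
    x_hat = theta * r
    # chain rule: d f(x_hat) / d theta = f'(x_hat) * d x_hat / d theta
    grad = (2 * a * x_hat + b) * r
    theta -= lr * grad

# theta should approach -0.5, recovering x* = -b / (2a) for every instance
print(theta)
```

At deployment, evaluating the learned model is a single cheap forward pass per instance, which is the amortization payoff when many similar instances must be solved.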

Accelerated gradient methods for geodesically convex optimization: Tractable algorithms and convergence analysis

J Kim, I Yang - International Conference on Machine …, 2022 - proceedings.mlr.press
We propose computationally tractable accelerated first-order methods for Riemannian
optimization, extending the Nesterov accelerated gradient (NAG) method. For both …

A variational formulation of accelerated optimization on Riemannian manifolds

V Duruisseaux, M Leok - SIAM Journal on Mathematics of Data Science, 2022 - SIAM
It was shown recently by W. Su, S. Boyd, and E. Candès, J. Mach. Learn. Res., 17 (2016),
pp. 1–43, that Nesterov's accelerated gradient method for minimizing a smooth convex …
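For context, the continuous-time limit identified in that Su–Boyd–Candès paper: with step size tending to zero, Nesterov's accelerated gradient method tracks the second-order ODE

```latex
\ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f\bigl(X(t)\bigr) = 0,
\qquad X(0) = x_0, \quad \dot{X}(0) = 0,
```

along whose trajectories $f(X(t)) - \min f = O(1/t^2)$, matching the discrete method's accelerated rate.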

Accelerated optimization on Riemannian manifolds via discrete constrained variational integrators

V Duruisseaux, M Leok - Journal of Nonlinear Science, 2022 - Springer
A variational formulation for accelerated optimization on normed vector spaces was recently
introduced in Wibisono et al. (PNAS 113: E7351–E7358, 2016), and later generalized to the …
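For reference, the variational object introduced in Wibisono et al. (2016) is the Bregman Lagrangian (stated here from that paper, with the notation of the original):

```latex
\mathcal{L}(X, V, t)
  = e^{\alpha_t + \gamma_t}
    \Bigl( D_h\bigl(X + e^{-\alpha_t} V,\, X\bigr) - e^{\beta_t} f(X) \Bigr),
```

where $D_h(y, x) = h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle$ is the Bregman divergence of a convex distance-generating function $h$, and the scaling functions satisfy the ideal scaling conditions $\dot{\beta}_t \le e^{\alpha_t}$ and $\dot{\gamma}_t = e^{\alpha_t}$. Its Euler–Lagrange equations generate a family of accelerated dynamics with convergence rate $O(e^{-\beta_t})$.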

Adaptive Hamiltonian variational integrators and applications to symplectic accelerated optimization

V Duruisseaux, J Schmitt, M Leok - SIAM Journal on Scientific Computing, 2021 - SIAM
It is well known that symplectic integrators lose their near energy preservation properties
when variable time-steps are used. The most common approach to combining adaptive time …
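The near energy preservation that fixed-step symplectic integrators enjoy (and lose under naive adaptive time-stepping) can be seen with a minimal Störmer–Verlet (leapfrog) sketch on the harmonic oscillator $H(q,p) = (p^2 + q^2)/2$. This is a standard textbook illustration, not the paper's adaptive Hamiltonian variational integrator:

```python
# Störmer–Verlet (leapfrog), a fixed-step symplectic integrator, applied
# to the harmonic oscillator H(q, p) = (p**2 + q**2) / 2, i.e. the system
#   dq/dt = p,   dp/dt = -q.
# Standard illustration of near energy preservation (not the paper's
# adaptive scheme).

def leapfrog(q, p, h, steps):
    """Integrate the oscillator with fixed step h via kick-drift-kick."""
    for _ in range(steps):
        p -= 0.5 * h * q      # half kick
        q += h * p            # drift
        p -= 0.5 * h * q      # half kick
    return q, p

q0, p0 = 1.0, 0.0
energy0 = 0.5 * (p0**2 + q0**2)
q, p = leapfrog(q0, p0, h=0.1, steps=10_000)
energy = 0.5 * (p**2 + q**2)
# The energy error stays bounded (oscillates at O(h**2)) with no secular
# drift, even after 10,000 steps -- unlike a non-symplectic method.
print(abs(energy - energy0))
```

Replacing the fixed step `h` inside the loop with a state-dependent step destroys this behavior, which is the difficulty the paper's adaptive approach is designed to overcome.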

Time-adaptive Lagrangian variational integrators for accelerated optimization on manifolds

V Duruisseaux, M Leok - arXiv preprint arXiv:2201.03774, 2022 - arxiv.org
A variational framework for accelerated optimization was recently introduced on normed
vector spaces and Riemannian manifolds in Wibisono et al. (2016) and Duruisseaux and …

Nesterov acceleration for Riemannian optimization

J Kim, I Yang - arXiv preprint arXiv:2202.02036, 2022 - arxiv.org
In this paper, we generalize the Nesterov accelerated gradient (NAG) method to solve
Riemannian optimization problems in a computationally tractable manner. The iteration …
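For orientation, a minimal sketch of the classical Euclidean NAG iteration that these papers generalize to Riemannian manifolds, shown on a strongly convex quadratic (the problem instance and all names are our own illustrative choices, not the papers' Riemannian iteration):

```python
import numpy as np

# Euclidean Nesterov accelerated gradient (NAG), strongly convex variant,
# on f(x) = 0.5 * x^T A x - b^T x with minimizer x* = A^{-1} b.
# Illustrative toy example of the method the papers extend to manifolds.

A = np.array([[3.0, 0.5], [0.5, 1.0]])    # symmetric positive definite
b = np.array([1.0, -2.0])
grad = lambda x: A @ x - b

eigs = np.linalg.eigvalsh(A)
L, mu = eigs[-1], eigs[0]                 # gradient Lipschitz / strong convexity
kappa = L / mu
beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)   # momentum coefficient

x = np.zeros(2)
x_prev = x.copy()
for _ in range(200):
    y = x + beta * (x - x_prev)           # momentum extrapolation (lookahead)
    x_prev, x = x, y - grad(y) / L        # gradient step at the lookahead point

x_star = np.linalg.solve(A, b)            # exact minimizer for comparison
print(np.allclose(x, x_star, atol=1e-8))
```

On a Riemannian manifold the two vector-space operations used here, the straight-line extrapolation and the subtraction of points, have no intrinsic meaning; replacing them with retractions/exponential maps and vector transports in a computationally tractable way is precisely what the paper addresses.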

Warped geometric information on the optimisation of Euclidean functions

M Hartmann, B Williams, H Yu, M Girolami… - arXiv preprint arXiv …, 2023 - arxiv.org
We consider the fundamental task of optimizing a real-valued function defined in a
potentially high-dimensional Euclidean space, such as the loss function in many machine …