Tutorial on amortized optimization
B Amos - Foundations and Trends® in Machine Learning, 2023 - nowpublishers.com
Optimization is a ubiquitous modeling tool and is often deployed in settings which
repeatedly solve similar instances of the same problem. Amortized optimization methods …
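For context, amortized optimization learns a model that maps a problem instance directly to an approximate solution, so that repeated instances no longer require a full solve. Below is a minimal regression-based sketch; the toy least-squares family, the linear amortization model, and all names are illustrative assumptions rather than the tutorial's setup.

```python
import numpy as np

# Toy parametric problem: x*(b) = argmin_x ||A x - b||^2 for varying contexts b.
rng = np.random.default_rng(0)
A = rng.normal(size=(10, 5))

def solve_exact(b):
    # Classical per-instance solver (least squares): the step we want to amortize.
    return np.linalg.lstsq(A, b, rcond=None)[0]

# Collect (context, solution) pairs by calling the per-instance solver offline.
contexts = rng.normal(size=(200, 10))
solutions = np.stack([solve_exact(b) for b in contexts])

# Amortization model: fit W so that contexts @ W ~ solutions. (The toy solution map is
# linear, so a linear model suffices; practical amortization models are usually neural nets.)
W, *_ = np.linalg.lstsq(contexts, solutions, rcond=None)

b_new = rng.normal(size=10)
print(np.linalg.norm(b_new @ W - solve_exact(b_new)))  # fast amortized prediction vs. exact solve
```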
Accelerated gradient methods for geodesically convex optimization: Tractable algorithms and convergence analysis
We propose computationally tractable accelerated first-order methods for Riemannian
optimization, extending the Nesterov accelerated gradient (NAG) method. For both …
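For reference, the Euclidean Nesterov scheme that such Riemannian methods generalize can be written, in one common form for a smooth convex f with step size s, as

\[
x_{k+1} = y_k - s\,\nabla f(y_k), \qquad y_{k+1} = x_{k+1} + \tfrac{k}{k+3}\,(x_{k+1} - x_k),
\]

with the Riemannian versions replacing these linear updates by exponential maps (or retractions) and moving the momentum between tangent spaces via vector transport.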
A variational formulation of accelerated optimization on Riemannian manifolds
V Duruisseaux, M Leok - SIAM Journal on Mathematics of Data Science, 2022 - SIAM
It was shown recently by W. Su, S. Boyd, and E. Candes, J. Mach. Learn. Res., 17 (2016),
pp. 1–43, that Nesterov's accelerated gradient method for minimizing a smooth convex …
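The result referred to here is the continuous-time limit of Nesterov's method derived by Su, Boyd, and Candes: the iterates track the second-order ODE

\[
\ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f(X(t)) = 0,
\]

whose solutions decrease a smooth convex f at the accelerated rate O(1/t^2); the paper above recasts such dynamics variationally and transfers them to Riemannian manifolds.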
Accelerated optimization on Riemannian manifolds via discrete constrained variational integrators
V Duruisseaux, M Leok - Journal of Nonlinear Science, 2022 - Springer
A variational formulation for accelerated optimization on normed vector spaces was recently
introduced in Wibisono et al. (PNAS 113: E7351–E7358, 2016), and later generalized to the …
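The variational formulation referenced here is built on the Bregman Lagrangian of Wibisono, Wilson, and Jordan, which (up to notation) reads

\[
\mathcal{L}_{\alpha,\beta,\gamma}(X, V, t) = e^{\alpha_t + \gamma_t}\Bigl( D_h\bigl(X + e^{-\alpha_t} V,\, X\bigr) - e^{\beta_t} f(X) \Bigr),
\]

where D_h is the Bregman divergence of a convex distance-generating function h; under the ideal scaling conditions \(\dot{\beta}_t \le e^{\alpha_t}\) and \(\dot{\gamma}_t = e^{\alpha_t}\), the Euler–Lagrange flow decreases f at the rate O(e^{-\beta_t}).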
Adaptive Hamiltonian variational integrators and applications to symplectic accelerated optimization
V Duruisseaux, J Schmitt, M Leok - SIAM Journal on Scientific Computing, 2021 - SIAM
It is well known that symplectic integrators lose their near energy preservation properties
when variable time-steps are used. The most common approach to combining adaptive time …
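One standard device for combining adaptive steps with symplectic integration, and the one relevant to this line of work, is the Poincaré transformation: for a monitor function g > 0, the dynamics are rewritten in an extended phase space as (sketched up to notation and conventions)

\[
\bar{H}(\bar{q}, \bar{p}) = g(q, t)\bigl( H(q, p) + p_t \bigr), \qquad \bar{q} = (q, t),\ \ \bar{p} = (p, p_t),\ \ p_t(0) = -H(q(0), p(0)),
\]

so that a symplectic integrator applied with a fixed step in the fictive time advances the physical time adaptively through dt/dτ = g(q, t).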
Time-adaptive Lagrangian variational integrators for accelerated optimization on manifolds
V Duruisseaux, M Leok - arXiv preprint arXiv:2201.03774, 2022 - arxiv.org
A variational framework for accelerated optimization was recently introduced on normed
vector spaces and Riemannian manifolds in Wibisono et al. (2016) and Duruisseaux and …
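As a concrete illustration of structure-preserving discretization (a generic example only, not the time-adaptive Lagrangian integrators constructed in the paper), the classical Störmer–Verlet/leapfrog step is itself a variational integrator for a separable Hamiltonian:

```python
import numpy as np

def grad_f(q):
    # Gradient of a toy potential f(q) = ||q||^2 (illustrative assumption).
    return 2.0 * q

def leapfrog(q, p, h, steps):
    # Stormer-Verlet scheme for H(q, p) = 0.5 * p @ p + f(q), a variational (symplectic) integrator.
    for _ in range(steps):
        p = p - 0.5 * h * grad_f(q)   # half kick
        q = q + h * p                 # drift
        p = p - 0.5 * h * grad_f(q)   # half kick
    return q, p

q, p = np.array([1.0, -2.0]), np.zeros(2)
print(leapfrog(q, p, h=0.1, steps=100))
```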
Nesterov acceleration for Riemannian optimization
In this paper, we generalize the Nesterov accelerated gradient (NAG) method to solve
Riemannian optimization problems in a computationally tractable manner. The iteration …
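As a schematic illustration of the ingredients such methods combine (tangent-space gradients, retractions, momentum), and not the iteration proposed in this paper, here is a heavy-ball-style update on the unit sphere; the objective, step sizes, and the crude re-projection used in place of vector transport are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
C = rng.normal(size=(5, 5))
C = C + C.T                              # symmetric test matrix; minimize x^T C x over the unit sphere

def riem_grad(x):
    g = 2.0 * C @ x                      # Euclidean gradient of x^T C x
    return g - (x @ g) * x               # project onto the tangent space at x

def retract(x, v):
    y = x + v                            # step in the tangent space ...
    return y / np.linalg.norm(y)         # ... then retract back onto the sphere

x = rng.normal(size=5)
x /= np.linalg.norm(x)
v = np.zeros(5)
for _ in range(500):
    v = 0.9 * v - 0.01 * riem_grad(x)    # momentum step
    v -= (x @ v) * x                     # re-project momentum at x (stand-in for vector transport)
    x = retract(x, v)

print(x @ C @ x, np.linalg.eigvalsh(C).min())  # Rayleigh quotient vs. smallest eigenvalue of C
```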
Warped geometric information on the optimisation of Euclidean functions
We consider the fundamental task of optimizing a real-valued function defined in a
potentially high-dimensional Euclidean space, such as the loss function in many machine …
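A generic way to see how a non-Euclidean (warped) metric enters first-order optimization of a Euclidean function, independent of the particular geometry constructed in this paper: if R^d carries a Riemannian metric G(x), the Riemannian gradient of f in coordinates is G(x)^{-1} ∇f(x), and the retraction-based gradient step

\[
x_{k+1} = x_k - \eta\, G(x_k)^{-1} \nabla f(x_k)
\]

shows the metric acting as a position-dependent preconditioner (the exact Riemannian update would follow the exponential map of G rather than a straight line).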