Accelerated gradient methods for geodesically convex optimization: Tractable algorithms and convergence analysis
We propose computationally tractable accelerated first-order methods for Riemannian
optimization, extending the Nesterov accelerated gradient (NAG) method. For both …
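The generic recipe behind Riemannian NAG-type methods can be sketched in a few lines: take the gradient at a momentum look-ahead point, replace the Euclidean gradient by its tangent-space projection, and map back to the manifold with a retraction. The sketch below illustrates this on the unit sphere for the Rayleigh quotient; the step size, momentum constant, and ambient-space momentum carry-over (in place of parallel transport) are simplifying assumptions, not the method of the paper above.

```python
import numpy as np

def sphere_nag(A, x0, steps=500, lr=0.02, momentum=0.8):
    """Nesterov-style accelerated gradient sketch on the unit sphere.

    Minimizes the Rayleigh quotient f(x) = x^T A x over ||x|| = 1, whose
    minimizer is an eigenvector for the smallest eigenvalue of A.  The
    Euclidean gradient is projected onto the tangent space, normalization
    serves as the retraction, and momentum is carried in the ambient
    space (a crude stand-in for parallel transport).
    """
    x = x0 / np.linalg.norm(x0)
    v = np.zeros_like(x)
    for _ in range(steps):
        y = x + momentum * v
        y /= np.linalg.norm(y)            # retract the look-ahead point
        egrad = 2.0 * A @ y               # Euclidean gradient of x^T A x
        rgrad = egrad - (y @ egrad) * y   # tangent-space projection
        x_new = y - lr * rgrad
        x_new /= np.linalg.norm(x_new)    # retraction back to the sphere
        v = x_new - x
        x = x_new
    return x

A = np.diag([1.0, 3.0, 10.0])
x = sphere_nag(A, np.ones(3))
print(x @ A @ x)  # approaches the smallest eigenvalue, 1.0
```

The look-ahead-then-retract pattern is what distinguishes this from plain Riemannian gradient descent with heavy-ball momentum; the papers in this area differ mainly in how the momentum vector is transported between tangent spaces.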
First-order algorithms for min-max optimization in geodesic metric spaces
From optimal transport to robust dimensionality reduction, many machine learning
applications can be cast as min-max optimization problems over Riemannian manifolds …
Curvature and complexity: Better lower bounds for geodesically convex optimization
C Criscitiello, N Boumal - The Thirty Sixth Annual …, 2023 - proceedings.mlr.press
We study the query complexity of geodesically convex (g-convex) optimization on a
manifold. To isolate the effect of that manifold's curvature, we primarily focus on hyperbolic …
Dissolving constraints for Riemannian optimization
In this paper, we consider optimization problems over closed embedded submanifolds of
$\mathbb{R}^n$, which are defined by the constraints $c(x) = 0$. We propose a class of constraint-dissolving …
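The core idea of constraint dissolving, in sketch form: compose the objective with a map that sends nearby points onto the feasible set $c(x) = 0$, add a penalty on the constraint residual, and minimize the resulting function without constraints. The example below is a sphere-specific illustration where $A(x) = x/\|x\|$ plays the role of that map; the paper's general construction is more involved, and the penalty weight, step size, and numerical gradient are illustrative choices.

```python
import numpy as np

def dissolved_objective(x, b, beta=1.0):
    """Unconstrained surrogate for: min ||x - b||^2  s.t.  ||x||^2 - 1 = 0.

    The objective is composed with the map A(x) = x / ||x||, which sends
    any nonzero point onto the feasible set (the unit sphere), and a
    quadratic penalty on the residual c(x) = ||x||^2 - 1 is added.
    Stationary points of the surrogate sit on the sphere and match those
    of the constrained problem.
    """
    a = x / np.linalg.norm(x)
    return np.sum((a - b) ** 2) + beta * (x @ x - 1.0) ** 2

def num_grad(f, x, eps=1e-6):
    """Central-difference gradient, to keep the sketch dependency-free."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2.0 * eps)
    return g

b = np.array([3.0, 4.0])
x = np.array([1.0, 0.0])
for _ in range(2000):  # plain gradient descent on the dissolved objective
    x = x - 0.02 * num_grad(lambda z: dissolved_objective(z, b), x)
print(x)  # approaches b / ||b|| = [0.6, 0.8]
```

The payoff is that any unconstrained solver (here, vanilla gradient descent) can be applied directly, with no retractions or projections inside the optimization loop.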
Curvature-independent last-iterate convergence for games on Riemannian manifolds
Numerous applications in machine learning and data analytics can be formulated as
equilibrium computation over Riemannian manifolds. Despite the extensive investigation of …
Block coordinate descent on smooth manifolds: Convergence theory and twenty-one examples
Block coordinate descent is an optimization paradigm that iteratively updates one block of
variables at a time, making it quite amenable to big data applications due to its scalability …
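A minimal instance of block coordinate descent on a product of manifolds: maximize $f(x, y) = x^\top C y$ over the product of two unit spheres, updating one block at a time. This toy example (alternating ascent, with the blockwise maximizer available in closed form) is an assumption-laden sketch of the paradigm, not an algorithm from the paper above; its fixed point is the top singular-vector pair of $C$.

```python
import numpy as np

def bcd_top_singular_pair(C, iters=100, seed=0):
    """Block coordinate updates on the product manifold S^{m-1} x S^{n-1}.

    Maximizes f(x, y) = x^T C y over unit vectors x and y by solving each
    block exactly while the other is held fixed: the blockwise maximizer
    over a sphere has a closed form (matrix-vector product, then
    normalize).  The fixed point is the top singular-vector pair of C.
    """
    rng = np.random.default_rng(seed)
    m, n = C.shape
    y = rng.standard_normal(n)
    y /= np.linalg.norm(y)
    for _ in range(iters):
        x = C @ y
        x /= np.linalg.norm(x)    # exact update of the x-block
        y = C.T @ x
        y /= np.linalg.norm(y)    # exact update of the y-block
    return x, y

C = np.array([[3.0, 0.0], [0.0, 1.0]])
x, y = bcd_top_singular_pair(C)
print(x @ C @ y)  # approaches the top singular value of C, 3.0
```

Each blockwise subproblem lives on a single sphere, which is what makes the scheme scale: no step over the full product manifold is ever needed.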
Global Riemannian acceleration in hyperbolic and spherical spaces
D Martínez-Rubio - International Conference on Algorithmic …, 2022 - proceedings.mlr.press
We further research on the accelerated optimization phenomenon on Riemannian manifolds
by introducing accelerated global first-order methods for the optimization of $L$-smooth …
Accelerated riemannian optimization: Handling constraints with a prox to bound geometric penalties
D Martínez-Rubio, S Pokutta - The Thirty Sixth Annual …, 2023 - proceedings.mlr.press
We propose a globally-accelerated, first-order method for the optimization of smooth and
(strongly or not) geodesically-convex functions in a wide class of Hadamard manifolds. We …
Geodesically convex $M$-estimation in metric spaces
VE Brunel - The Thirty Sixth Annual Conference on Learning …, 2023 - proceedings.mlr.press
We study the asymptotic properties of geodesically convex $M$-estimation on non-linear
spaces. Namely, we prove that under very minimal assumptions besides geodesic convexity …
Riemannian accelerated gradient methods via extrapolation
In this paper, we propose a convergence acceleration scheme for general Riemannian
optimization problems by extrapolating iterates on manifolds. We show that when the …
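The general flavor of iterate extrapolation on a manifold can be sketched as: generate a few iterates with a baseline method, extrapolate them in the ambient space as if they formed a linearly converging sequence, and retract the result back onto the manifold. The sketch below uses componentwise Aitken $\Delta^2$ extrapolation of Riemannian gradient descent iterates on the sphere; the choice of extrapolation scheme, guard threshold, and normalization-as-retraction are all assumptions of this illustration, not the scheme of the paper above.

```python
import numpy as np

def rgd_step(A, x, lr=0.05):
    """One Riemannian gradient step on the unit sphere for f(x) = x^T A x."""
    g = 2.0 * A @ x
    g = g - (x @ g) * x            # tangent-space projection
    x = x - lr * g
    return x / np.linalg.norm(x)   # retraction

def aitken_extrapolate(x0, x1, x2):
    """Componentwise Aitken Delta^2 extrapolation of three iterates,
    retracted back to the sphere by normalization."""
    d1, d2 = x1 - x0, x2 - x1
    denom = d2 - d1
    safe = np.abs(denom) > 1e-12   # guard against vanishing differences
    x = x2.copy()
    x[safe] = x0[safe] - d1[safe] ** 2 / denom[safe]
    return x / np.linalg.norm(x)

A = np.diag([1.0, 2.0, 5.0])
xs = [np.ones(3) / np.sqrt(3.0)]
for _ in range(30):
    xs.append(rgd_step(A, xs[-1]))
x_ex = aitken_extrapolate(xs[-3], xs[-2], xs[-1])
print(xs[-1] @ A @ xs[-1], x_ex @ A @ x_ex)  # smallest eigenvalue is 1.0
```

Because the iterates of a linearly converging method behave like a geometric sequence near the limit, the extrapolated point typically lands much closer to the minimizer than the last iterate, at the cost of three stored iterates and no extra gradient evaluations.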