Accelerated gradient methods for geodesically convex optimization: Tractable algorithms and convergence analysis
We propose computationally tractable accelerated first-order methods for Riemannian
optimization, extending the Nesterov accelerated gradient (NAG) method. For both …
On quantum speedups for nonconvex optimization via quantum tunneling walks
Classical algorithms are often not effective for solving nonconvex optimization problems
where local minima are separated by high barriers. In this paper, we explore possible …
Infeasible deterministic, stochastic, and variance-reduction algorithms for optimization under orthogonality constraints
Orthogonality constraints naturally appear in many machine learning problems, from
Principal Components Analysis to robust neural network training. They are usually solved …
Curvature and complexity: Better lower bounds for geodesically convex optimization
C Criscitiello, N Boumal - The Thirty Sixth Annual …, 2023 - proceedings.mlr.press
We study the query complexity of geodesically convex (g-convex) optimization on a
manifold. To isolate the effect of that manifold's curvature, we primarily focus on hyperbolic …
Global Riemannian acceleration in hyperbolic and spherical spaces
D Martínez-Rubio - International Conference on Algorithmic …, 2022 - proceedings.mlr.press
We further the research on the accelerated optimization phenomenon on Riemannian manifolds
by introducing accelerated global first-order methods for the optimization of $L$-smooth …
Interior-point methods on manifolds: theory and applications
H Hirai, H Nieuwboer, M Walter - 2023 IEEE 64th Annual …, 2023 - ieeexplore.ieee.org
Interior-point methods offer a highly versatile framework for convex optimization that is
effective in theory and practice. A key notion in their theory is that of a self-concordant …
Accelerated riemannian optimization: Handling constraints with a prox to bound geometric penalties
D Martínez-Rubio, S Pokutta - The Thirty Sixth Annual …, 2023 - proceedings.mlr.press
We propose a globally-accelerated, first-order method for the optimization of smooth and
(strongly or not) geodesically-convex functions in a wide class of Hadamard manifolds. We …
Concentration of empirical barycenters in metric spaces
VE Brunel, J Serres - International Conference on …, 2024 - proceedings.mlr.press
Barycenters (aka Fréchet means) were introduced in statistics in the 1940s and popularized
in the fields of shape statistics and, later, in optimal transport and matrix analysis. They …
Riemannian accelerated gradient methods via extrapolation
In this paper, we propose a convergence acceleration scheme for general Riemannian
optimization problems by extrapolating iterates on manifolds. We show that when the …
Sion's minimax theorem in geodesic metric spaces and a Riemannian extragradient algorithm
Deciding whether saddle points exist or are approximable for nonconvex-nonconcave
problems is usually intractable. This paper takes a step towards understanding a broad …