Riemannian SVRG: Fast stochastic optimization on Riemannian manifolds

H. Zhang, S. J. Reddi, S. Sra. Advances in Neural Information Processing Systems, 2016. proceedings.neurips.cc
Abstract
We study optimization of finite sums of geodesically smooth functions on Riemannian manifolds. Although variance reduction techniques for optimizing finite sums have received tremendous attention in recent years, existing work is limited to vector space problems. We introduce Riemannian SVRG (RSVRG), a new variance reduced Riemannian optimization method. We analyze RSVRG for both geodesically convex and nonconvex (smooth) functions. Our analysis reveals that RSVRG inherits advantages of the usual SVRG method, but with factors depending on the curvature of the manifold that influence its convergence. To our knowledge, RSVRG is the first provably fast stochastic Riemannian method. Moreover, our paper presents the first non-asymptotic complexity analysis (novel even for the batch setting) for nonconvex Riemannian optimization. Our results have several implications; for instance, they offer a Riemannian perspective on variance reduced PCA, which promises a short, transparent convergence analysis.
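To make the method concrete, the sketch below applies the RSVRG update pattern (snapshot full gradient, transported variance-reduction correction, exponential-map step) to the PCA-style objective the abstract mentions: maximizing x^T (1/n)A^T A x over the unit sphere. This is not the authors' code; the names rsvrg_pca, proj_tangent, and exp_map are invented for illustration, the step size and epoch counts are arbitrary, and exact parallel transport along geodesics (used in the paper's analysis) is replaced here by the simpler projection-based vector transport.

```python
import numpy as np

def proj_tangent(x, v):
    # Orthogonal projection of v onto the tangent space of the sphere at x.
    return v - np.dot(x, v) * x

def exp_map(x, v):
    # Exponential map on the unit sphere: follow the geodesic from x
    # in the tangent direction v for length ||v||.
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def rgrad(a_i, x):
    # Riemannian gradient of f_i(x) = -(a_i^T x)^2: project the
    # Euclidean gradient onto the tangent space at x.
    return proj_tangent(x, -2.0 * np.dot(a_i, x) * a_i)

def rsvrg_pca(A, eta=0.02, epochs=30, inner=None, seed=0):
    # Variance-reduced stochastic minimization of
    # f(x) = -(1/n) sum_i (a_i^T x)^2 over the unit sphere,
    # whose minimizer is the leading eigenvector of (1/n) A^T A.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    inner = inner or n
    x_tilde = rng.standard_normal(d)
    x_tilde /= np.linalg.norm(x_tilde)
    for _ in range(epochs):
        # Full Riemannian gradient at the snapshot point x_tilde.
        full_grad = sum(rgrad(A[i], x_tilde) for i in range(n)) / n
        x = x_tilde.copy()
        for _ in range(inner):
            i = rng.integers(n)
            # SVRG correction, moved from the tangent space at x_tilde
            # to the tangent space at x by projection (a vector transport
            # standing in for parallel transport along the geodesic).
            corr = proj_tangent(x, rgrad(A[i], x_tilde) - full_grad)
            v = rgrad(A[i], x) - corr
            x = exp_map(x, -eta * v)
        x_tilde = x
    return x_tilde

# Usage: the iterate should align with the top eigenvector.
A = np.random.default_rng(1).standard_normal((200, 10))
x = rsvrg_pca(A)
top = np.linalg.eigh(A.T @ A / len(A))[1][:, -1]
print("alignment:", abs(np.dot(x, top)))  # expected to approach 1.0
```

Note the structural point the abstract emphasizes: because the full gradient at the snapshot and the stochastic gradient at the current iterate live in different tangent spaces, the correction term must be transported before the two can be combined, and the curvature-dependent distortion of that transport is exactly what introduces the curvature factors in the paper's convergence bounds.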