Metadata only
Date
2021
Type
Conference Paper
ETH Bibliography
yes
Abstract
We develop a new Riemannian descent algorithm that relies on momentum to improve over existing first-order methods for geodesically convex optimization. In contrast, accelerated convergence rates proved in prior work have only been shown to hold for geodesically strongly-convex objective functions. We further extend our algorithm to geodesically weakly-quasi-convex objectives. Our proofs of convergence rely on a novel estimate sequence that illustrates the dependency of the convergence rate on the curvature of the manifold. We validate our theoretical results empirically on several optimization problems defined on the sphere and on the manifold of positive definite matrices.
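For illustration, a minimal sketch of the kind of method the abstract describes: Riemannian gradient descent with a heavy-ball momentum term on the unit sphere. This is not the authors' algorithm or their estimate-sequence analysis; the objective, step size, momentum parameter, and all function names below are illustrative assumptions.

import numpy as np

def sphere_exp(x, v):
    # Exponential map on the unit sphere: follow the geodesic from x
    # in tangent direction v for length ||v||.
    n = np.linalg.norm(v)
    if n < 1e-12:
        return x
    return np.cos(n) * x + np.sin(n) * (v / n)

def project_tangent(x, g):
    # Project an ambient vector g onto the tangent space at x
    # (on the sphere: remove the radial component).
    return g - (x @ g) * x

def momentum_descent(A, x0, lr=0.005, beta=0.8, iters=500):
    # Minimize f(x) = x^T A x over the unit sphere. The Euclidean
    # gradient 2*A*x is projected onto the tangent space; the momentum
    # buffer is carried to the current tangent space by projection (a
    # simple vector transport) before each geodesic step.
    x = x0 / np.linalg.norm(x0)
    m = np.zeros_like(x)
    for _ in range(iters):
        grad = project_tangent(x, 2.0 * A @ x)
        m = beta * project_tangent(x, m) - lr * grad
        x = sphere_exp(x, m)
    return x

# Usage: the minimizer of x^T A x on the sphere is an eigenvector of
# the smallest eigenvalue of A, so both printed values should be close.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = B @ B.T
x = momentum_descent(A, rng.standard_normal(5))
print(x @ A @ x, np.linalg.eigvalsh(A)[0])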
Publication status
published
Book title
Proceedings of the 24th International Conference on Artificial Intelligence and Statistics (AISTATS 2021)
Journal / series
Proceedings of Machine Learning Research
Volume
Pages / article number
Publisher
PMLR
Conference
Organizational unit
09462 - Hofmann, Thomas / Hofmann, Thomas