A Continuous-time Perspective for Modeling Acceleration in Riemannian Optimization
Metadata only
Date
2020
Type
Conference Paper
ETH Bibliography
yes
Abstract
We propose a novel second-order ODE as the continuous-time limit of a Riemannian accelerated gradient-based method on a manifold with curvature bounded from below. This ODE can be seen as a generalization of the ODE derived for Euclidean spaces and can also serve as an analysis tool. We analyze the convergence behavior of this ODE for different classes of functions, such as geodesically convex, strongly convex, and weakly quasi-convex functions. We demonstrate how such an ODE can be discretized using a semi-implicit, Nesterov-inspired numerical integrator that empirically yields stable algorithms which are faithful to the continuous-time analysis and exhibit accelerated convergence.
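For context, the Euclidean-space ODE that the abstract describes as being generalized is commonly taken to be the continuous-time limit of Nesterov's accelerated gradient method,

\[
\ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f\bigl(X(t)\bigr) = 0,
\]

with the Riemannian version replacing gradients and straight-line trajectories by their manifold counterparts. This formula is standard background, not reproduced from the paper itself.

To illustrate the general idea of a semi-implicit discretization (a generic sketch, not the paper's Riemannian integrator), the snippet below applies a semi-implicit Euler step to the Euclidean ODE above; the quadratic objective, step size h, and starting time t = 1 are hypothetical choices for illustration.

import numpy as np

def grad_f(x):
    # Hypothetical objective f(x) = 0.5 * ||x||^2, so grad f(x) = x.
    return x

def semi_implicit_euler(x0, steps=1000, h=1e-2):
    # Semi-implicit Euler for the system  x' = v,  v' = -(3/t) v - grad f(x):
    # the velocity is updated first, and the updated velocity is then used for x.
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    t = 1.0  # start away from t = 0 to avoid the 3/t singularity
    for _ in range(steps):
        v = v + h * (-(3.0 / t) * v - grad_f(x))
        x = x + h * v
        t += h
    return x

print(semi_implicit_euler(np.array([5.0, -3.0])))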
Publication status
published
Book title
Proceedings of the 23rd International Conference on Artificial Intelligence and Statistics
Journal / series
Proceedings of Machine Learning Research
Volume
Pages / Article No.
Publisher
PMLR
Event
Organisational unit
09462 - Hofmann, Thomas / Hofmann, Thomas
Notes
The conference was postponed due to the coronavirus (COVID-19) and was conducted virtually.