Quantum gradient descent and Newton's method for constrained polynomial optimization

Open access
Date
2019-07
Type
Journal Article
Citations
Cited 39 times in Web of Science
Cited 46 times in Scopus
ETH Bibliography
yes
Abstract
Optimization problems in disciplines such as machine learning are commonly solved with iterative methods. Gradient descent algorithms find local minima by moving along the direction of steepest descent, while Newton's method takes curvature information into account and thereby often improves convergence. Here, we develop quantum versions of these iterative optimization algorithms and apply them to polynomial optimization with a unit norm constraint. In each step, multiple copies of the current candidate are used to improve the candidate using quantum phase estimation, an adapted quantum state exponentiation scheme, as well as quantum matrix multiplications and inversions. The required operations scale polylogarithmically in the dimension of the solution vector and exponentially in the number of iterations. Therefore, the quantum algorithm can be useful for high-dimensional problems where a small number of iterations is sufficient.
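As a rough classical illustration of the two update rules named in the abstract (not the quantum algorithm itself), the sketch below applies projected gradient descent and a constrained Newton step to a small homogeneous quartic on the unit sphere. The objective f (built from a symmetric matrix A), the step size eta, the tangent-space formulation, and the use of renormalization as the projection are all assumptions made for this example, not details taken from the paper.

```python
import numpy as np

def f(A, x):
    """Stand-in objective: homogeneous quartic f(x) = (1/4) (x^T A x)^2."""
    return 0.25 * (x @ A @ x) ** 2

def grad(A, x):
    """Euclidean gradient of f."""
    return (x @ A @ x) * (A @ x)

def hess(A, x):
    """Euclidean Hessian of f."""
    Ax = A @ x
    return (x @ A @ x) * A + 2.0 * np.outer(Ax, Ax)

def sphere_gradient_step(A, x, eta=0.05):
    """Gradient descent under the unit-norm constraint: step along the
    tangent component of -grad f, then renormalize onto the sphere."""
    P = np.eye(len(x)) - np.outer(x, x)          # projector onto tangent space at x
    x = x - eta * (P @ grad(A, x))
    return x / np.linalg.norm(x)

def sphere_newton_step(A, x):
    """Constrained Newton step on the sphere: the term (x^T grad f) plays the
    role of the Lagrange multiplier of the norm constraint."""
    P = np.eye(len(x)) - np.outer(x, x)
    g = P @ grad(A, x)
    H = P @ (hess(A, x) - (x @ grad(A, x)) * np.eye(len(x))) @ P
    d, *_ = np.linalg.lstsq(H, -g, rcond=None)   # solve within the tangent space
    x = x + P @ d
    return x / np.linalg.norm(x)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 6
    A = rng.standard_normal((n, n))
    A = (A + A.T) / 2                            # symmetric coefficient matrix
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    for _ in range(20):                          # coarse gradient phase
        x = sphere_gradient_step(A, x)
    for _ in range(5):                           # Newton refinement near a critical point
        x = sphere_newton_step(A, x)
    print("f(x) =", f(A, x))
```

Renormalizing after every update is the simplest way to enforce the unit norm constraint in this sketch; it loosely mirrors the quantum setting, where the candidate is a quantum state and is therefore normalized by construction.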
Permanent link
https://doi.org/10.3929/ethz-b-000361611
Publication status
published
Journal / series
New Journal of Physics
Publisher
Institute of Physics
Subject
quantum optimization; quantum computing; density matrix exponentiation