Show simple item record

dc.contributor.author: Li, Xiang
dc.contributor.author: Yang, Junchi
dc.contributor.author: He, Niao
dc.date.accessioned: 2023-06-14T06:59:38Z
dc.date.available: 2023-01-11T15:12:08Z
dc.date.available: 2023-06-12T12:10:19Z
dc.date.available: 2023-06-14T06:59:38Z
dc.date.issued: 2022
dc.identifier.uri: http://hdl.handle.net/20.500.11850/591767
dc.description.abstract: Adaptive gradient methods have shown their ability to adjust stepsizes on the fly in a parameter-agnostic manner, and empirically achieve faster convergence when solving minimization problems. When it comes to nonconvex minimax optimization, however, current convergence analyses of gradient descent ascent (GDA) combined with adaptive stepsizes require careful tuning of hyper-parameters and knowledge of problem-dependent parameters. This discrepancy arises from the primal-dual nature of minimax problems and the need for a delicate time-scale separation between the primal and dual updates to attain convergence. In this work, we propose a single-loop adaptive GDA algorithm called TiAda for nonconvex minimax optimization that automatically adapts to the time-scale separation. Our algorithm is fully parameter-agnostic and achieves near-optimal complexities simultaneously in the deterministic and stochastic settings of nonconvex-strongly-concave minimax problems. The effectiveness of the proposed method is further demonstrated numerically on a number of machine learning applications. [en_US]
dc.language.iso: en [en_US]
dc.publisher: OpenReview [en_US]
dc.title: TiAda: A Time-scale Adaptive Algorithm for Nonconvex Minimax Optimization [en_US]
dc.type: Conference Paper
ethz.book.title: OPT 2022: Optimization for Machine Learning (NeurIPS 2022 Workshop) [en_US]
ethz.size: 39 p. [en_US]
ethz.event: 14th Annual Workshop on Optimization for Machine Learning (OPT 2022) [en_US]
ethz.event.location: New Orleans, LA, USA [en_US]
ethz.event.date: December 3, 2022 [en_US]
ethz.notes: Poster presentation on December 3, 2022. [en_US]
ethz.publication.place: s.l. [en_US]
ethz.publication.status: published [en_US]
ethz.leitzahl: ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02150 - Dep. Informatik / Dep. of Computer Science::02661 - Institut für Maschinelles Lernen / Institute for Machine Learning::09729 - He, Niao / He, Niao [en_US]
ethz.leitzahl.certified: ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02150 - Dep. Informatik / Dep. of Computer Science::02661 - Institut für Maschinelles Lernen / Institute for Machine Learning::09729 - He, Niao / He, Niao [en_US]
ethz.identifier.url: https://openreview.net/forum?id=Jk1i_UufoBX
ethz.identifier.url: https://neurips.cc/virtual/2022/57131
ethz.date.deposited: 2023-01-11T15:12:09Z
ethz.source: FORM
ethz.eth: yes [en_US]
ethz.availability: Metadata only [en_US]
ethz.rosetta.installDate: 2023-06-12T12:10:20Z
ethz.rosetta.lastUpdated: 2024-02-03T00:06:05Z
ethz.rosetta.versionExported: true
ethz.COinS: ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.atitle=TiAda:%20A%20Time-scale%20Adaptive%20Algorithm%20for%20Nonconvex%20Minimax%20Optimization&rft.date=2022&rft.au=Li,%20Xiang&Yang,%20Junchi&He,%20Niao&rft.genre=proceeding&rft.btitle=OPT%202022:%20Optimization%20for%20Machine%20Learning%20(NeurIPS%202022%20Workshop)

Files in this item

There are no files associated with this item.
