Metadata only
Date
2024-07
Type
Conference Paper
ETH Bibliography
yes
Abstract
We introduce contextual stochastic bilevel optimization (CSBO), a stochastic bilevel optimization framework in which the lower-level problem minimizes an expectation conditioned on some contextual information and on the upper-level decision variable. This framework extends classical stochastic bilevel optimization to settings where the lower-level decision maker responds optimally not only to the decision of the upper-level decision maker but also to some side information, and where there are multiple or even infinitely many followers. It captures important applications such as meta-learning, personalized federated learning, end-to-end learning, and Wasserstein distributionally robust optimization with side information (WDRO-SI). Due to the presence of contextual information, existing single-loop methods for classical stochastic bilevel optimization are unable to converge. To overcome this challenge, we introduce an efficient double-loop gradient method based on the Multilevel Monte-Carlo (MLMC) technique and establish its sample and computational complexities. When specialized to stochastic nonconvex optimization, our method matches existing lower bounds. For meta-learning, the complexity of our method does not depend on the number of tasks. Numerical experiments further validate our theoretical results.
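As a rough illustration (the notation below is ours and may differ from the paper's), a generic CSBO problem of the kind described in the abstract can be written as follows, where x is the upper-level decision, xi the contextual information, and y*(x, xi) the lower-level response:

\min_{x \in \mathcal{X}} \; F(x) := \mathbb{E}_{\xi}\big[ f\big(x,\, y^{*}(x,\xi),\, \xi\big) \big]
\quad \text{s.t.} \quad
y^{*}(x,\xi) \in \arg\min_{y \in \mathcal{Y}} \; \mathbb{E}_{\eta \mid \xi}\big[ g\big(x,\, y,\, \eta\big) \big].

The Multilevel Monte-Carlo technique mentioned in the abstract typically rests on a telescoping decomposition of the form below, where G_\ell denotes a gradient estimator built at approximation level \ell; how the paper instantiates the levels is not specified here, so this is only a generic sketch:

\mathbb{E}[G_{L}] \;=\; \mathbb{E}[G_{0}] \;+\; \sum_{\ell=1}^{L} \mathbb{E}\big[ G_{\ell} - G_{\ell-1} \big],

with each correction term estimated from its own batch of samples, typically using fewer samples at the more expensive levels.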
Publication status
published
Book title
Advances in Neural Information Processing Systems 36
Pages / Article No.
Publisher
Curran
Event
Funding
180545 - NCCR Automation (phase I) (SNF)
Notes
Poster presentation held on December 13, 2023.