Stability and Deviation Optimal Risk Bounds with Convergence Rate O(1/n)
OPEN ACCESS
Author / Producer
Date
2021
Publication Type
Conference Paper
ETH Bibliography
yes
Abstract
The sharpest known high probability generalization bounds for uniformly stable algorithms (Feldman and Vondrak, NeurIPS 2018, COLT 2019; Bousquet, Klochkov, and Zhivotovskiy, COLT 2020) contain a generally inevitable sampling error term of order Θ(1/√n). When applied to excess risk bounds, this leads to suboptimal results in several standard stochastic convex optimization problems. We show that if the so-called Bernstein condition is satisfied, the term Θ(1/√n) can be avoided, and high probability excess risk bounds of order up to O(1/n) are possible via uniform stability. Using this result, we show a high probability excess risk bound with the rate O(log n / n) for strongly convex and Lipschitz losses, valid for any empirical risk minimization method. This resolves a question of Shalev-Shwartz, Shamir, Srebro, and Sridharan (COLT 2009). We also discuss how O(log n / n) high probability excess risk bounds are possible for projected gradient descent with strongly convex and Lipschitz losses, without the usual smoothness assumption.
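For orientation only, and not as the paper's exact statements: writing R(w) = E_z f(w; z) for the population risk, w* for its minimizer, and ŵ for the output of a γ-uniformly-stable learning algorithm, the Bernstein condition and the shape of the resulting bound can be sketched in LaTeX as follows (the constants and the exact logarithmic factors below are simplified assumptions, not the paper's statement).

% Bernstein condition (schematic): the second moment of the excess loss
% is controlled linearly by the excess risk, for some constant B > 0.
\mathbb{E}_z\bigl[(f(w;z) - f(w^*;z))^2\bigr] \;\le\; B\,\bigl(R(w) - R(w^*)\bigr) \qquad \text{for all } w.

% Schematic stability-based bound at confidence level 1 - \delta, with M an
% upper bound on the loss and \gamma the uniform stability parameter:
R(\hat{w}) - R(w^*) \;\lesssim\; \gamma \log n \,\log\frac{n}{\delta} \;+\; \frac{(M + B)\,\log\frac{1}{\delta}}{n}.

The point of the sketch is that the stability term replaces the Θ(1/√n) sampling error once the Bernstein condition ties the variance of the excess loss to the excess risk; for λ-strongly convex and L-Lipschitz losses, ERM is uniformly stable with γ = O(L²/(λn)) and the Bernstein condition holds, which is how the O(log n / n) rate in the abstract arises.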
Publication status
published
Book title
Advances in Neural Information Processing Systems 34 (NeurIPS 2021)
Volume
34
Pages / Article No.
5065 - 5076
Publisher
Curran
Event
35th Conference on Neural Information Processing Systems (NeurIPS 2021)
Organisational unit
09679 - Bandeira, Afonso