Fast Rates for Noisy Interpolation Require Rethinking the Effects of Inductive Bias
METADATA ONLY
Author / Producer
Date
2022
Publication Type
Conference Paper
ETH Bibliography
yes
Abstract
Good generalization performance on high-dimensional data crucially hinges on a simple structure of the ground truth and a corresponding strong inductive bias of the estimator. Even though this intuition is valid for regularized models, in this paper we caution against a strong inductive bias for interpolation in the presence of noise: While a stronger inductive bias encourages a simpler structure that is more aligned with the ground truth, it also increases the detrimental effect of noise. Specifically, for both linear regression and classification with a sparse ground truth, we prove that minimum ℓp-norm and maximum ℓp-margin interpolators achieve fast polynomial rates close to order 1/n for p > 1 compared to a logarithmic rate for p = 1. Finally, we provide preliminary experimental evidence that this trade-off may also play a crucial role in understanding non-linear interpolating models used in practice.
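For readers unfamiliar with the estimators named in the abstract, the following is a minimal illustrative sketch, not taken from the paper or its experiments: it computes minimum ℓp-norm interpolators (regression) and maximum ℓp-margin interpolators (classification) on synthetic data with a sparse ground truth and noisy labels. It assumes the cvxpy convex-optimization library; the problem sizes, noise level, and sparsity are arbitrary choices made only for illustration.

# Minimal sketch (assumption: cvxpy is installed); not the paper's code.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, d, sigma = 50, 400, 0.2                      # high-dimensional regime: d >> n
w_star = np.zeros(d)
w_star[0] = 1.0                                 # sparse (1-sparse) ground truth
X = rng.standard_normal((n, d))
y_reg = X @ w_star + sigma * rng.standard_normal(n)   # noisy regression labels
y_cls = np.sign(X @ w_star)                            # classification labels
y_cls[rng.random(n) < 0.1] *= -1                       # flip 10% of labels (label noise)

def min_lp_norm_interpolator(X, y, p):
    # argmin ||w||_p  subject to  Xw = y  (exact interpolation of the noisy labels)
    w = cp.Variable(X.shape[1])
    cp.Problem(cp.Minimize(cp.pnorm(w, p)), [X @ w == y]).solve()
    return w.value

def max_lp_margin_interpolator(X, y, p):
    # maximum ℓp-margin classifier in its equivalent form:
    # argmin ||w||_p  subject to  y_i <x_i, w> >= 1 for all i
    w = cp.Variable(X.shape[1])
    cp.Problem(cp.Minimize(cp.pnorm(w, p)), [cp.multiply(y, X @ w) >= 1]).solve()
    return w.value

for p in (1.0, 1.5, 2.0):
    w_hat = min_lp_norm_interpolator(X, y_reg, p)
    err = np.sum((w_hat - w_star) ** 2)
    w_cls = max_lp_margin_interpolator(X, y_cls, p)
    cos = w_cls @ w_star / (np.linalg.norm(w_cls) * np.linalg.norm(w_star))
    print(f"p = {p}: regression error {err:.4f}, classification alignment {cos:.4f}")

Since d > n and the data are in general position, exact interpolation and separation of the noisy labels are feasible for every p >= 1, so the comparison across p isolates the effect of the inductive bias discussed in the abstract.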
Permanent link
Publication status
published
Book title
Proceedings of the 39th International Conference on Machine Learning
Journal / series
Proceedings of Machine Learning Research
Volume
162
Pages / Article No.
5397 - 5428
Publisher
PMLR
Event
39th International Conference on Machine Learning (ICML 2022)
Organisational unit
09652 - Yang, Fan / Yang, Fan
02219 - ETH AI Center / ETH AI Center