Stable Minima Cannot Overfit in Univariate ReLU Networks: Generalization by Large Step Sizes

D Qiao, K Zhang, E Singh, D Soudry… - arXiv preprint arXiv:2406.06838, 2024 - arxiv.org
We study the generalization of two-layer ReLU neural networks in a univariate nonparametric regression problem with noisy labels. This is a problem where kernels (\emph{e.g.} NTK) are provably sub-optimal and benign overfitting does not happen, thus disqualifying existing theory for interpolating (0-loss, globally optimal) solutions. We present a new theory of generalization for local minima that gradient descent with a constant learning rate can \emph{stably} converge to. We show that gradient descent with a fixed learning rate $\eta$ can only find local minima that represent smooth functions with a certain weighted \emph{first order total variation} bounded by $1/\eta - 1/2 + \widetilde{O}(\sigma) + \widetilde{O}(\sqrt{\mathrm{MSE}})$, where $\sigma$ is the label noise level, $\mathrm{MSE}$ is short for mean squared error against the ground truth, and $\widetilde{O}(\cdot)$ hides a logarithmic factor. Under mild assumptions, we also prove a nearly-optimal MSE bound of $\widetilde{O}(n^{-4/5})$ within the strict interior of the support of the $n$ data points. Our theoretical results are validated by extensive simulations demonstrating that large learning rate training induces sparse linear spline fits. To the best of our knowledge, we are the first to obtain a generalization bound via minima stability in the non-interpolation case and the first to show that ReLU NNs without regularization can achieve near-optimal rates in nonparametric regression.
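The setting described above is easy to reproduce numerically. Below is a minimal sketch (not the authors' code) of the experimental setup the abstract refers to: a two-layer ReLU network trained by full-batch gradient descent with a fixed learning rate on noisy univariate data, followed by a finite-difference estimate of the (unweighted) first-order total variation of the learned function as a proxy for the weighted quantity in the paper. The width, step size, noise level, and target function are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of the paper's setting (assumed hyperparameters, not the
# authors' code): two-layer ReLU net, univariate regression, noisy labels,
# full-batch GD with a fixed step size.
import torch

torch.manual_seed(0)
n, width, sigma = 128, 100, 0.3                    # samples, hidden width, label noise
x = torch.rand(n, 1) * 2 - 1                       # inputs in [-1, 1]
y = torch.sin(3 * x) + sigma * torch.randn(n, 1)   # noisy labels around a smooth target

model = torch.nn.Sequential(
    torch.nn.Linear(1, width),
    torch.nn.ReLU(),
    torch.nn.Linear(width, 1),
)
eta = 0.1                                          # fixed learning rate (illustrative;
opt = torch.optim.SGD(model.parameters(), lr=eta)  # too large a value may diverge here)

for step in range(5000):                           # full-batch gradient descent
    opt.zero_grad()
    loss = torch.mean((model(x) - y) ** 2)
    loss.backward()
    opt.step()

# First-order total variation of the fitted function f, i.e. TV(f'),
# approximated by summing absolute changes of finite-difference slopes.
# The paper's bound is on a *weighted* version of this quantity.
grid = torch.linspace(-1, 1, 2001).unsqueeze(1)
with torch.no_grad():
    f = model(grid).squeeze()
df = (f[1:] - f[:-1]) / (grid[1, 0] - grid[0, 0])  # finite-difference slope f'
tv1 = torch.sum(torch.abs(df[1:] - df[:-1])).item()
print(f"train MSE = {loss.item():.4f}, first-order TV of fit ~ {tv1:.3f}")
```

Per the abstract, re-running such a sketch with larger stable step sizes should yield fits with fewer effective knots (sparser linear splines) and smaller first-order total variation.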