Noise Stability Optimization for Flat Minima with Tight Rates


arXiv: 2306.08553
MaRDI QID: Q6440298

Author name not available

Publication date: 14 June 2023

Abstract: We consider finding flat, local minimizers by adding averaged weight perturbations. Given a nonconvex function $f: \mathbb{R}^d \rightarrow \mathbb{R}$ and a $d$-dimensional distribution $\mathcal{P}$ that is symmetric about zero, we perturb the weights of $f$ and define $F(W) = \mathbb{E}[f(W + U)]$, where $U$ is a random sample from $\mathcal{P}$. For small, isotropic Gaussian perturbations, this noise injection acts as a regularizer on the Hessian trace of $f$, so the weight-perturbed objective is biased toward minimizers with low Hessian trace. Several prior works have studied settings related to this weight-perturbed function and designed algorithms to improve generalization, but convergence rates for finding minima of the averaged function $F$ were not known. This paper considers an SGD-like algorithm that injects random noise into the weights before computing gradients, while leveraging the symmetry of $\mathcal{P}$ to reduce variance. We then provide a rigorous analysis, showing matching upper and lower bounds for our algorithm's convergence to an approximate first-order stationary point of $F$ when the gradient of $f$ is Lipschitz-continuous. We empirically validate the algorithm on several image classification tasks with various architectures. Compared to sharpness-aware minimization, we observe a 12.6% and 7.8% drop in the Hessian trace and top Hessian eigenvalue of the found minima, respectively, averaged over eight datasets. Ablation studies validate the benefit of our algorithm's design.
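The Hessian-trace claim in the abstract can be made precise with a standard second-order Taylor expansion; the following is our sketch under the assumption $U \sim \mathcal{N}(0, \sigma^2 I_d)$ (the small, isotropic Gaussian case the abstract mentions), not a derivation taken from the paper:

\[
F(W) = \mathbb{E}[f(W + U)] \approx \mathbb{E}\!\left[ f(W) + \nabla f(W)^\top U + \tfrac{1}{2}\, U^\top \nabla^2 f(W)\, U \right] = f(W) + \frac{\sigma^2}{2}\, \mathrm{tr}\!\left( \nabla^2 f(W) \right),
\]

where the linear term vanishes because $\mathbb{E}[U] = 0$ (symmetry of $\mathcal{P}$) and $\mathbb{E}[U^\top H U] = \sigma^2\, \mathrm{tr}(H)$ for any fixed matrix $H$. Minimizing $F$ therefore penalizes the Hessian trace, which is the flatness bias described above.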

Has companion code repository: https://github.com/virtuosoresearch/noise-stability-optimization
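The repository above contains the authors' implementation; as a rough, self-contained illustration of the symmetric noise-injection step described in the abstract, here is a minimal NumPy sketch. The function name noise_stability_sgd, the antithetic pairing of $W + U$ and $W - U$, and all hyperparameter values are our assumptions for illustration, not the repository's API:

import numpy as np

def noise_stability_sgd(grad_f, w0, sigma=0.01, lr=0.1, steps=1000, seed=0):
    """Sketch of SGD on the smoothed objective F(W) = E[f(W + U)],
    with U ~ N(0, sigma^2 I) (assumed noise distribution).

    Averaging the gradients at W + U and W - U is unbiased for grad F(W)
    because the noise distribution is symmetric about zero, and the
    antithetic pair has lower variance than a single perturbed gradient.
    """
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float).copy()
    for _ in range(steps):
        u = sigma * rng.standard_normal(w.shape)   # U ~ N(0, sigma^2 I)
        g = 0.5 * (grad_f(w + u) + grad_f(w - u))  # symmetric (antithetic) pair
        w -= lr * g                                # plain SGD step on F
    return w

# Toy usage: for a quadratic f(w) = 0.5 w^T H w, the antithetic pair cancels
# the noise exactly (0.5 * (H(w+u) + H(w-u)) = H w), so the iterates converge
# to the minimizer without noise-induced jitter.
if __name__ == "__main__":
    H = np.diag([1.0, 100.0])          # ill-conditioned (sharp) Hessian
    grad = lambda w: H @ w
    print(noise_stability_sgd(grad, w0=np.ones(2), lr=0.005, steps=5000))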
