Dynamics of Local Elasticity During Training of Neural Nets
From MaRDI portal
Publication:6381913
arXiv: 2111.01166 · MaRDI QID: Q6381913
Author name not available
Publication date: 1 November 2021
Abstract: In the recent past, a certain property of neural training trajectories in weight space was isolated, that of "local elasticity", a measure that attempts to quantify how the influence of a sampled data point propagates to the prediction at another data point. In this work, we embark on a comprehensive study of local elasticity. First, specific to the classification setting, we suggest a new definition of the original measure. Via experiments on state-of-the-art neural networks trained on SVHN, CIFAR-10 and CIFAR-100, we demonstrate how our new measure detects the property that weight updates prefer to change predictions within the same class as the sampled data point. Next, we demonstrate via examples of neural nets performing regression that the original measure reveals a two-phase behavior: training proceeds via an initial elastic phase, in which the measure changes rapidly, and an eventual inelastic phase, in which it remains large. Lastly, we give multiple examples of learning via gradient flows for which one can obtain a closed-form expression for the original measure. By studying plots of these derived formulas, we give theoretical demonstrations of some of the experimentally detected properties of local elasticity in the regression setting.
Has companion code repository: https://github.com/avirupdas55/dynamics-of-local-elasticity-during-training-of-neural-nets
This page was built for publication: Dynamics of Local Elasticity During Training of Neural Nets