On the distance between two neural networks and the stability of learning

From MaRDI portal
Publication: Q6334429

arXiv: 2002.03432
MaRDI QID: Q6334429

Authors: Jeremy Bernstein, Arash Vahdat, Yisong Yue, Ming-Yu Liu

Publication date: 9 February 2020

Abstract: This paper relates parameter distance to gradient breakdown for a broad class of nonlinear compositional functions. The analysis leads to a new distance function called deep relative trust and a descent lemma for neural networks. Since the resulting learning rule seems to require little to no learning rate tuning, it may unlock a simpler workflow for training deeper and more complex neural networks. The Python code used in this paper is here: https://github.com/jxbz/fromage.
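For orientation, the deep relative trust distance mentioned in the abstract measures a perturbation of an L-layer network by the product of per-layer relative sizes. Paraphrasing the paper (this formula is recalled from the article itself, not from this portal entry), it is roughly of the form

\[
d(W, W + \Delta W) \;=\; \prod_{l=1}^{L} \left( 1 + \frac{\lVert \Delta W_l \rVert_F}{\lVert W_l \rVert_F} \right) - 1,
\]

so a perturbation counts as small only if it is relatively small in every layer, which is what gives the resulting descent lemma its layerwise character.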
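As an illustration of the learning rule the abstract alludes to, below is a minimal PyTorch-style sketch of a Fromage-type step: each tensor moves against its gradient with a step size proportional to the tensor's own norm, then is rescaled by 1/sqrt(1 + lr^2) so the weight norm does not drift. The function name fromage_step is hypothetical; the authors' reference implementation is in the fromage repository linked above.

    import math
    import torch

    def fromage_step(params, lr=0.01):
        # One Fromage-style update (a sketch, assuming the rule
        # described in the paper; not the authors' exact code).
        scale = 1.0 / math.sqrt(1.0 + lr ** 2)
        with torch.no_grad():
            for p in params:
                if p.grad is None:
                    continue
                p_norm, g_norm = p.norm(), p.grad.norm()
                if p_norm > 0.0 and g_norm > 0.0:
                    # Step size is relative to the layer's own scale.
                    p.add_(p.grad, alpha=-lr * float(p_norm / g_norm))
                else:
                    # Fall back to plain SGD scaling at zero norm.
                    p.add_(p.grad, alpha=-lr)
                # Rescale so the weight norm stays stable across steps.
                p.mul_(scale)

Calling fromage_step(model.parameters(), lr=0.01) after loss.backward() would apply one update; because the step is relative to each layer's scale, the same learning rate can plausibly serve networks of different depths, which matches the abstract's claim of little to no learning rate tuning.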

Has companion code repository: https://github.com/jxbz/agd

This page was built for publication: On the distance between two neural networks and the stability of learning
