Neural Tangent Kernel Analysis of Deep Narrow Neural Networks

From MaRDI portal
arXiv: 2202.02981
MaRDI QID: Q6390336

Author name not available

Publication date: 7 February 2022

Abstract: The tremendous recent progress in analyzing the training dynamics of overparameterized neural networks has primarily focused on wide networks and therefore does not sufficiently address the role of depth in deep learning. In this work, we present the first trainability guarantee of infinitely deep but narrow neural networks. We study the infinite-depth limit of a multilayer perceptron (MLP) with a specific initialization and establish a trainability guarantee using neural tangent kernel (NTK) theory. We then extend the analysis to an infinitely deep convolutional neural network (CNN) and perform brief experiments.
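The trainability guarantee rests on the neural tangent kernel, K(x, x') = ⟨∇_θ f(x; θ), ∇_θ f(x'; θ)⟩. Below is a minimal NumPy sketch of the *empirical* NTK for a deep, narrow ReLU MLP. It uses a generic He-style initialization for illustration — the paper's specific initialization and the companion repository's code are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(depth, width, d_in):
    # Generic He-style initialization (illustrative; not the paper's scheme)
    sizes = [d_in] + [width] * depth + [1]
    return [(rng.standard_normal((n_out, n_in)) * np.sqrt(2.0 / n_in),
             np.zeros(n_out))
            for n_in, n_out in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    # Forward pass with ReLU hidden layers; cache activations and
    # pre-activations for the backward pass
    acts, pres = [x], []
    h = x
    for i, (W, b) in enumerate(params):
        z = W @ h + b
        pres.append(z)
        h = np.maximum(z, 0.0) if i < len(params) - 1 else z  # linear output
        acts.append(h)
    return h, acts, pres

def param_grad(params, x):
    # Gradient of the scalar output w.r.t. all parameters, flattened
    _, acts, pres = forward(params, x)
    grads = []
    delta = np.ones(1)  # d(output)/d(last pre-activation)
    for i in reversed(range(len(params))):
        W, _ = params[i]
        grads.append((np.outer(delta, acts[i]), delta))  # (dW, db)
        if i > 0:
            delta = (W.T @ delta) * (pres[i - 1] > 0)  # ReLU derivative
    grads.reverse()
    return np.concatenate([g.ravel() for gW, gb in grads for g in (gW, gb)])

def empirical_ntk(params, X):
    # K[i, j] = <grad_theta f(x_i), grad_theta f(x_j)>
    J = np.stack([param_grad(params, x) for x in X])
    return J @ J.T
```

By construction K = JJᵀ is symmetric positive semidefinite; the paper's analysis concerns the behavior of this kernel in the infinite-depth (rather than infinite-width) limit at a fixed narrow width.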

Has companion code repository: https://github.com/lthilnklover/deep-narrow-ntk

