Size and depth of monotone neural networks: interpolation and approximation
Publication: Q6404672
arXiv: 2207.05275
MaRDI QID: Q6404672
Daniel Reichman, Dan Mikulincer
Publication date: 11 July 2022
Abstract: Monotone functions and data sets arise in a variety of applications. We study the interpolation problem for monotone data sets: the input is a monotone data set with n points, and the goal is to find a size- and depth-efficient monotone neural network, with non-negative parameters and threshold units, that interpolates the data set. We show that there are monotone data sets that cannot be interpolated by a monotone network of depth 2. On the other hand, we prove that for every monotone data set with n points in R^d, there exists an interpolating monotone network of depth 4 and size O(nd). Our interpolation result implies that every monotone function over [0,1]^d can be approximated arbitrarily well by a depth-4 monotone network, improving on the previous best-known construction of depth d+1. Finally, building on results from Boolean circuit complexity, we show that the inductive bias of having positive parameters can lead to a super-polynomial blow-up in the number of neurons when approximating monotone functions.
Has companion code repository: https://github.com/danmiku/monotonenetworks
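The abstract's model is a feed-forward network of threshold units whose weights are constrained to be non-negative; this constraint is what makes every such network coordinate-wise monotone. The sketch below illustrates that property numerically. It is a minimal illustration and is not taken from the companion repository; the network architecture, layer sizes, and helper names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def threshold(z):
    """Heaviside threshold unit: fires (outputs 1) iff the pre-activation is >= 0."""
    return (z >= 0).astype(float)

def monotone_network(x, layers):
    """Evaluate a feed-forward threshold network with non-negative weights.

    Biases may be arbitrary. Since each weight matrix is entrywise non-negative
    and the threshold activation is monotone, every layer (and hence the whole
    network) is coordinate-wise monotone in its input.
    """
    for W, b in layers:
        assert np.all(W >= 0), "monotone model: weights must be non-negative"
        x = threshold(W @ x + b)
    return x

# A hypothetical random 3-layer monotone network on d = 4 inputs.
d = 4
layers = [
    (rng.random((8, d)), rng.normal(size=8) - 0.5),
    (rng.random((8, 8)), rng.normal(size=8) - 2.0),
    (rng.random((1, 8)), rng.normal(size=1) - 1.0),
]

# Sanity check: if x <= y coordinate-wise, then f(x) <= f(y).
for _ in range(1000):
    x = rng.random(d)
    y = x + rng.random(d)  # y dominates x in every coordinate
    assert monotone_network(x, layers) <= monotone_network(y, layers)
print("coordinate-wise monotonicity holds on all sampled pairs")
```

The paper's interpolation question asks for the converse direction: given a monotone data set, how small and how shallow can such a network be while still fitting it exactly.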