

Poster

Size and depth of monotone neural networks: interpolation and approximation

Dan Mikulincer · Daniel Reichman

Hall J (level 1) #842

Keywords: [ interpolation ] [ expressivity ] [ Monotone neural networks ] [ benefit of depth ]


Abstract: Monotone functions and data sets arise in a variety of applications. We study the interpolation problem for monotone data sets: The input is a monotone data set with $n$ points, and the goal is to find a size- and depth-efficient monotone neural network with \emph{non-negative parameters} and threshold units that interpolates the data set. We show that there are monotone data sets that cannot be interpolated by a monotone network of depth 2. On the other hand, we prove that for every monotone data set with $n$ points in $\mathbb{R}^d$, there exists an interpolating monotone network of depth 4 and size $O(nd)$. Our interpolation result implies that every monotone function over $[0,1]^d$ can be approximated arbitrarily well by a depth-4 monotone network, improving the previous best-known construction of depth $d+1$. Finally, building on results from Boolean circuit complexity, we show that the inductive bias of having positive parameters can lead to a super-polynomial blow-up in the number of neurons when approximating monotone functions.
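To make the problem setup concrete, here is a minimal Python sketch of the objects involved: a check that a data set is monotone (labels respect the coordinate-wise partial order on inputs) and a single layer of threshold units with non-negative weights, which preserves monotonicity. This is only an illustration of the setting; it is not the authors' depth-4 construction, and the exact bias/threshold convention is an assumption.

```python
import numpy as np

def is_monotone_dataset(X, y):
    """A data set is monotone if x_i <= x_j coordinate-wise implies y_i <= y_j."""
    n = len(y)
    for i in range(n):
        for j in range(n):
            if np.all(X[i] <= X[j]) and y[i] > y[j]:
                return False
    return True

def threshold(z):
    """Threshold (Heaviside) activation, as in the abstract's model."""
    return (z >= 0).astype(float)

def monotone_layer(x, W, b):
    """One layer of threshold units with non-negative weights.
    Non-negative weights make each unit a non-decreasing function of x;
    the sign convention for the bias b is an assumption of this sketch."""
    assert np.all(W >= 0), "monotone networks here use non-negative weights"
    return threshold(W @ x + b)

# Toy monotone data set with n = 3 points in R^2.
X = np.array([[0.0, 0.0],
              [0.5, 0.2],
              [1.0, 1.0]])
y = np.array([0.0, 0.0, 1.0])
print(is_monotone_dataset(X, y))  # True
```

The interpolation question in the abstract asks how small (in size and depth) a network built from such monotone layers can be while exactly fitting every labeled point; the results above show depth 2 does not always suffice, while depth 4 with $O(nd)$ units does.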
